Feb 1 01:38:27 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Feb 1 01:38:27 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 1 01:38:27 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 1 01:38:27 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 1 01:38:27 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 1 01:38:27 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 1 01:38:27 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 1 01:38:27 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 1 01:38:27 localhost kernel: signal: max sigframe size: 1776
Feb 1 01:38:27 localhost kernel: BIOS-provided physical RAM map:
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 1 01:38:27 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Feb 1 01:38:27 localhost kernel: NX (Execute Disable) protection: active
Feb 1 01:38:27 localhost kernel: SMBIOS 2.8 present.
Feb 1 01:38:27 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 1 01:38:27 localhost kernel: Hypervisor detected: KVM
Feb 1 01:38:27 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 1 01:38:27 localhost kernel: kvm-clock: using sched offset of 2368856074 cycles
Feb 1 01:38:27 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 1 01:38:27 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 1 01:38:27 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Feb 1 01:38:27 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 1 01:38:27 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 1 01:38:27 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 1 01:38:27 localhost kernel: Using GB pages for direct mapping
Feb 1 01:38:27 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Feb 1 01:38:27 localhost kernel: ACPI: Early table checksum verification disabled
Feb 1 01:38:27 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 1 01:38:27 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 1 01:38:27 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 1 01:38:27 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 1 01:38:27 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 1 01:38:27 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 1 01:38:27 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 1 01:38:27 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 1 01:38:27 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 1 01:38:27 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 1 01:38:27 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 1 01:38:27 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 1 01:38:27 localhost kernel: No NUMA configuration found
Feb 1 01:38:27 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Feb 1 01:38:27 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Feb 1 01:38:27 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Feb 1 01:38:27 localhost kernel: Zone ranges:
Feb 1 01:38:27 localhost kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 1 01:38:27 localhost kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Feb 1 01:38:27 localhost kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Feb 1 01:38:27 localhost kernel: Device empty
Feb 1 01:38:27 localhost kernel: Movable zone start for each node
Feb 1 01:38:27 localhost kernel: Early memory node ranges
Feb 1 01:38:27 localhost kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Feb 1 01:38:27 localhost kernel: node 0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 1 01:38:27 localhost kernel: node 0: [mem 0x0000000100000000-0x000000043fffffff]
Feb 1 01:38:27 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Feb 1 01:38:27 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 1 01:38:27 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 1 01:38:27 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 1 01:38:27 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 1 01:38:27 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 1 01:38:27 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 1 01:38:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 1 01:38:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 1 01:38:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 1 01:38:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 1 01:38:27 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 1 01:38:27 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 1 01:38:27 localhost kernel: TSC deadline timer available
Feb 1 01:38:27 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 1 01:38:27 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 1 01:38:27 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 1 01:38:27 localhost kernel: Booting paravirtualized kernel on KVM
Feb 1 01:38:27 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 1 01:38:27 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 1 01:38:27 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Feb 1 01:38:27 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 1 01:38:27 localhost kernel: Fallback order for Node 0: 0
Feb 1 01:38:27 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Feb 1 01:38:27 localhost kernel: Policy zone: Normal
Feb 1 01:38:27 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 1 01:38:27 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Feb 1 01:38:27 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 1 01:38:27 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 1 01:38:27 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 1 01:38:27 localhost kernel: software IO TLB: area num 8.
Feb 1 01:38:27 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Feb 1 01:38:27 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Feb 1 01:38:27 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 1 01:38:27 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Feb 1 01:38:27 localhost kernel: ftrace: allocated 176 pages with 3 groups
Feb 1 01:38:27 localhost kernel: Dynamic Preempt: voluntary
Feb 1 01:38:27 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 1 01:38:27 localhost kernel: rcu: #011RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 1 01:38:27 localhost kernel: #011Trampoline variant of Tasks RCU enabled.
Feb 1 01:38:27 localhost kernel: #011Rude variant of Tasks RCU enabled.
Feb 1 01:38:27 localhost kernel: #011Tracing variant of Tasks RCU enabled.
Feb 1 01:38:27 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 1 01:38:27 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 1 01:38:27 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 1 01:38:27 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 1 01:38:27 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 1 01:38:27 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Feb 1 01:38:27 localhost kernel: Console: colour VGA+ 80x25
Feb 1 01:38:27 localhost kernel: printk: console [tty0] enabled
Feb 1 01:38:27 localhost kernel: printk: console [ttyS0] enabled
Feb 1 01:38:27 localhost kernel: ACPI: Core revision 20211217
Feb 1 01:38:27 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 1 01:38:27 localhost kernel: x2apic enabled
Feb 1 01:38:27 localhost kernel: Switched APIC routing to physical x2apic.
Feb 1 01:38:27 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 1 01:38:27 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 1 01:38:27 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 1 01:38:27 localhost kernel: LSM: Security Framework initializing
Feb 1 01:38:27 localhost kernel: Yama: becoming mindful.
Feb 1 01:38:27 localhost kernel: SELinux: Initializing.
Feb 1 01:38:27 localhost kernel: LSM support for eBPF active
Feb 1 01:38:27 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 1 01:38:27 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 1 01:38:27 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 1 01:38:27 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 1 01:38:27 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 1 01:38:27 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 1 01:38:27 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 1 01:38:27 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 1 01:38:27 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 1 01:38:27 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 1 01:38:27 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 1 01:38:27 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 1 01:38:27 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 1 01:38:27 localhost kernel: Freeing SMP alternatives memory: 36K
Feb 1 01:38:27 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 1 01:38:27 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Feb 1 01:38:27 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 1 01:38:27 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 1 01:38:27 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 1 01:38:27 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 1 01:38:27 localhost kernel: ... version: 0
Feb 1 01:38:27 localhost kernel: ... bit width: 48
Feb 1 01:38:27 localhost kernel: ... generic registers: 6
Feb 1 01:38:27 localhost kernel: ... value mask: 0000ffffffffffff
Feb 1 01:38:27 localhost kernel: ... max period: 00007fffffffffff
Feb 1 01:38:27 localhost kernel: ... fixed-purpose events: 0
Feb 1 01:38:27 localhost kernel: ... event mask: 000000000000003f
Feb 1 01:38:27 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 1 01:38:27 localhost kernel: rcu: #011Max phase no-delay instances is 400.
Feb 1 01:38:27 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 1 01:38:27 localhost kernel: x86: Booting SMP configuration:
Feb 1 01:38:27 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Feb 1 01:38:27 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 1 01:38:27 localhost kernel: smpboot: Max logical packages: 8
Feb 1 01:38:27 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 1 01:38:27 localhost kernel: node 0 deferred pages initialised in 22ms
Feb 1 01:38:27 localhost kernel: devtmpfs: initialized
Feb 1 01:38:27 localhost kernel: x86/mm: Memory block size: 128MB
Feb 1 01:38:27 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 1 01:38:27 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Feb 1 01:38:27 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 1 01:38:27 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 1 01:38:27 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Feb 1 01:38:27 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 1 01:38:27 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 1 01:38:27 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 1 01:38:27 localhost kernel: audit: type=2000 audit(1769927906.216:1): state=initialized audit_enabled=0 res=1
Feb 1 01:38:27 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 1 01:38:27 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 1 01:38:27 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 1 01:38:27 localhost kernel: cpuidle: using governor menu
Feb 1 01:38:27 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Feb 1 01:38:27 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 1 01:38:27 localhost kernel: PCI: Using configuration type 1 for base access
Feb 1 01:38:27 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 1 01:38:27 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 1 01:38:27 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Feb 1 01:38:27 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 1 01:38:27 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 1 01:38:27 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 1 01:38:27 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 1 01:38:27 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 1 01:38:27 localhost kernel: ACPI: Interpreter enabled
Feb 1 01:38:27 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 1 01:38:27 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 1 01:38:27 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 1 01:38:27 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 1 01:38:27 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 1 01:38:27 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 1 01:38:27 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [3] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [4] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [5] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [6] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [7] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [8] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [9] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [10] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [11] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [12] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [13] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [14] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [15] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [16] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [17] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [18] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [19] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [20] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [21] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [22] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [23] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [24] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [25] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [26] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [27] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [28] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [29] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [30] registered
Feb 1 01:38:27 localhost kernel: acpiphp: Slot [31] registered
Feb 1 01:38:27 localhost kernel: PCI host bridge to bus 0000:00
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 1 01:38:27 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 1 01:38:27 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 1 01:38:27 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Feb 1 01:38:27 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Feb 1 01:38:27 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Feb 1 01:38:27 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Feb 1 01:38:27 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Feb 1 01:38:27 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Feb 1 01:38:27 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Feb 1 01:38:27 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 1 01:38:27 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Feb 1 01:38:27 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Feb 1 01:38:27 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 1 01:38:27 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 1 01:38:27 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 1 01:38:27 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 1 01:38:27 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 1 01:38:27 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 1 01:38:27 localhost kernel: iommu: Default domain type: Translated
Feb 1 01:38:27 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 1 01:38:27 localhost kernel: SCSI subsystem initialized
Feb 1 01:38:27 localhost kernel: ACPI: bus type USB registered
Feb 1 01:38:27 localhost kernel: usbcore: registered new interface driver usbfs
Feb 1 01:38:27 localhost kernel: usbcore: registered new interface driver hub
Feb 1 01:38:27 localhost kernel: usbcore: registered new device driver usb
Feb 1 01:38:27 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 1 01:38:27 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Feb 1 01:38:27 localhost kernel: PTP clock support registered
Feb 1 01:38:27 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 1 01:38:27 localhost kernel: NetLabel: Initializing
Feb 1 01:38:27 localhost kernel: NetLabel: domain hash size = 128
Feb 1 01:38:27 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Feb 1 01:38:27 localhost kernel: NetLabel: unlabeled traffic allowed by default
Feb 1 01:38:27 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 1 01:38:27 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 1 01:38:27 localhost kernel: vgaarb: loaded
Feb 1 01:38:27 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 1 01:38:27 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 1 01:38:27 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 1 01:38:27 localhost kernel: pnp: PnP ACPI init
Feb 1 01:38:27 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 1 01:38:27 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 1 01:38:27 localhost kernel: NET: Registered PF_INET protocol family
Feb 1 01:38:27 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 1 01:38:27 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Feb 1 01:38:27 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 1 01:38:27 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 1 01:38:27 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 1 01:38:27 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Feb 1 01:38:27 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Feb 1 01:38:27 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 1 01:38:27 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 1 01:38:27 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 1 01:38:27 localhost kernel: NET: Registered PF_XDP protocol family
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 1 01:38:27 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 1 01:38:27 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 1 01:38:27 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 1 01:38:27 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 26901 usecs
Feb 1 01:38:27 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 1 01:38:27 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 1 01:38:27 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 1 01:38:27 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 1 01:38:27 localhost kernel: ACPI: bus type thunderbolt registered
Feb 1 01:38:27 localhost kernel: Initialise system trusted keyrings
Feb 1 01:38:27 localhost kernel: Key type blacklist registered
Feb 1 01:38:27 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Feb 1 01:38:27 localhost kernel: zbud: loaded
Feb 1 01:38:27 localhost kernel: integrity: Platform Keyring initialized
Feb 1 01:38:27 localhost kernel: NET: Registered PF_ALG protocol family
Feb 1 01:38:27 localhost kernel: xor: automatically using best checksumming function avx
Feb 1 01:38:27 localhost kernel: Key type asymmetric registered
Feb 1 01:38:27 localhost kernel: Asymmetric key parser 'x509' registered
Feb 1 01:38:27 localhost kernel: Running certificate verification selftests
Feb 1 01:38:27 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 1 01:38:27 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 1 01:38:27 localhost kernel: io scheduler mq-deadline registered
Feb 1 01:38:27 localhost kernel: io scheduler kyber registered
Feb 1 01:38:27 localhost kernel: io scheduler bfq registered
Feb 1 01:38:27 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 1 01:38:27 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 1 01:38:27 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 1 01:38:27 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 1 01:38:27 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 1 01:38:27 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 1 01:38:27 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 1 01:38:27 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 1 01:38:27 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 1 01:38:27 localhost kernel: Non-volatile memory driver v1.3
Feb 1 01:38:27 localhost kernel: rdac: device handler registered
Feb 1 01:38:27 localhost kernel: hp_sw: device handler registered
Feb 1 01:38:27 localhost kernel: emc: device handler registered
Feb 1 01:38:27 localhost kernel: alua: device handler registered
Feb 1 01:38:27 localhost kernel: libphy: Fixed MDIO Bus: probed
Feb 1 01:38:27 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Feb 1 01:38:27 localhost kernel: ehci-pci: EHCI PCI platform driver
Feb 1 01:38:27 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Feb 1 01:38:27 localhost kernel: ohci-pci: OHCI PCI platform driver
Feb 1 01:38:27 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Feb 1 01:38:27 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 1 01:38:27 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 1 01:38:27 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 1 01:38:27 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 1 01:38:27 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 1 01:38:27 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 1 01:38:27 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 1 01:38:27 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Feb 1 01:38:27 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 1 01:38:27 localhost kernel: hub 1-0:1.0: USB hub found
Feb 1 01:38:27 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 1 01:38:27 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 1 01:38:27 localhost kernel: usbserial: USB Serial support registered for generic
Feb 1 01:38:27 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 1 01:38:27 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 1 01:38:27 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 1 01:38:27 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 1 01:38:27 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 1 01:38:27 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 1 01:38:27 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 1 01:38:27 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-01T06:38:26 UTC (1769927906)
Feb 1 01:38:27 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 1 01:38:27 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 1 01:38:27 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 1 01:38:27 localhost kernel: usbcore: registered new interface driver usbhid
Feb 1 01:38:27 localhost kernel: usbhid: USB HID core driver
Feb 1 01:38:27 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 1 01:38:27 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 1 01:38:27 localhost kernel: Initializing XFRM netlink socket
Feb 1 01:38:27 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 1 01:38:27 localhost kernel: Segment Routing with IPv6
Feb 1 01:38:27 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 1 01:38:27 localhost kernel: mpls_gso: MPLS GSO support
Feb 1 01:38:27 localhost kernel: IPI shorthand broadcast: enabled
Feb 1 01:38:27 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 1 01:38:27 localhost kernel: AES CTR mode by8 optimization enabled
Feb 1 01:38:27 localhost kernel: sched_clock: Marking stable (755956617, 177736433)->(1059357468, -125664418)
Feb 1 01:38:27 localhost kernel: registered taskstats version 1
Feb 1 01:38:27 localhost kernel: Loading compiled-in X.509 certificates
Feb 1 01:38:27 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 1 01:38:27 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 1 01:38:27 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 1 01:38:27 localhost kernel: zswap: loaded using pool lzo/zbud
Feb 1 01:38:27 localhost kernel: page_owner is disabled
Feb 1 01:38:27 localhost kernel: Key type big_key registered
Feb 1 01:38:27 localhost kernel: Freeing initrd memory: 74232K
Feb 1 01:38:27 localhost kernel: Key type encrypted registered
Feb 1 01:38:27 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 1 01:38:27 localhost kernel: Loading compiled-in module X.509 certificates
Feb 1 01:38:27 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 1 01:38:27 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 1 01:38:27 localhost kernel: ima: No architecture policies found
Feb 1 01:38:27 localhost kernel: evm: Initialising EVM extended attributes:
Feb 1 01:38:27 localhost kernel: evm: security.selinux
Feb 1 01:38:27 localhost kernel: evm: security.SMACK64 (disabled)
Feb 1 01:38:27 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 1 01:38:27 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 1 01:38:27 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 1 01:38:27 localhost kernel: evm: security.apparmor (disabled)
Feb 1 01:38:27 localhost kernel: evm: security.ima
Feb 1 01:38:27 localhost kernel: evm: security.capability
Feb 1 01:38:27 localhost kernel: evm: HMAC attrs: 0x1
Feb 1 01:38:27 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 1 01:38:27 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 1 01:38:27 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 1 01:38:27 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 1 01:38:27 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 1 01:38:27 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 1 01:38:27 localhost kernel: Freeing unused decrypted memory: 2036K
Feb 1 01:38:27 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 1 01:38:27 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 1 01:38:27 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Feb 1 01:38:27 localhost kernel: Write protecting the kernel read-only data: 26624k
Feb 1 01:38:27 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 1 01:38:27 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Feb 1 01:38:27 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 1 01:38:27 localhost kernel: Run /init as init process
Feb 1 01:38:27 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 1 01:38:27 localhost systemd[1]: Detected virtualization kvm.
Feb 1 01:38:27 localhost systemd[1]: Detected architecture x86-64.
Feb 1 01:38:27 localhost systemd[1]: Running in initrd.
Feb 1 01:38:27 localhost systemd[1]: No hostname configured, using default hostname.
Feb 1 01:38:27 localhost systemd[1]: Hostname set to <localhost>.
Feb 1 01:38:27 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 1 01:38:27 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 1 01:38:27 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 1 01:38:27 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 1 01:38:27 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 1 01:38:27 localhost systemd[1]: Reached target Local File Systems.
Feb 1 01:38:27 localhost systemd[1]: Reached target Path Units.
Feb 1 01:38:27 localhost systemd[1]: Reached target Slice Units.
Feb 1 01:38:27 localhost systemd[1]: Reached target Swaps.
Feb 1 01:38:27 localhost systemd[1]: Reached target Timer Units.
Feb 1 01:38:27 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 1 01:38:27 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 1 01:38:27 localhost systemd[1]: Listening on Journal Socket.
Feb 1 01:38:27 localhost systemd[1]: Listening on udev Control Socket.
Feb 1 01:38:27 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 1 01:38:27 localhost systemd[1]: Reached target Socket Units.
Feb 1 01:38:27 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 1 01:38:27 localhost systemd[1]: Starting Journal Service...
Feb 1 01:38:27 localhost systemd[1]: Starting Load Kernel Modules...
Feb 1 01:38:27 localhost systemd[1]: Starting Create System Users...
Feb 1 01:38:27 localhost systemd[1]: Starting Setup Virtual Console...
Feb 1 01:38:27 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 1 01:38:27 localhost systemd[1]: Finished Load Kernel Modules.
Feb 1 01:38:27 localhost systemd-journald[283]: Journal started
Feb 1 01:38:27 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/9037fad6143b4373b625f89bce657827) is 8.0M, max 314.7M, 306.7M free.
Feb 1 01:38:27 localhost systemd-modules-load[284]: Module 'msr' is built in
Feb 1 01:38:27 localhost systemd[1]: Started Journal Service.
Feb 1 01:38:27 localhost systemd[1]: Finished Setup Virtual Console.
Feb 1 01:38:27 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 1 01:38:27 localhost systemd[1]: Starting dracut cmdline hook...
Feb 1 01:38:27 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 1 01:38:27 localhost systemd-sysusers[285]: Creating group 'sgx' with GID 997.
Feb 1 01:38:27 localhost systemd-sysusers[285]: Creating group 'users' with GID 100.
Feb 1 01:38:27 localhost systemd-sysusers[285]: Creating group 'dbus' with GID 81.
Feb 1 01:38:27 localhost systemd-sysusers[285]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 1 01:38:27 localhost systemd[1]: Finished Create System Users.
Feb 1 01:38:27 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 1 01:38:27 localhost dracut-cmdline[288]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Feb 1 01:38:27 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 1 01:38:27 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 1 01:38:27 localhost dracut-cmdline[288]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 1 01:38:27 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 1 01:38:27 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 1 01:38:27 localhost systemd[1]: Finished dracut cmdline hook.
Feb 1 01:38:27 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 1 01:38:27 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 1 01:38:27 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 1 01:38:27 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Feb 1 01:38:27 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 1 01:38:27 localhost kernel: RPC: Registered udp transport module.
Feb 1 01:38:27 localhost kernel: RPC: Registered tcp transport module.
Feb 1 01:38:27 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 1 01:38:27 localhost rpc.statd[407]: Version 2.5.4 starting
Feb 1 01:38:27 localhost rpc.statd[407]: Initializing NSM state
Feb 1 01:38:27 localhost rpc.idmapd[412]: Setting log level to 0
Feb 1 01:38:27 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 1 01:38:27 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 1 01:38:27 localhost systemd-udevd[425]: Using default interface naming scheme 'rhel-9.0'.
Feb 1 01:38:27 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 1 01:38:27 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 1 01:38:27 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 1 01:38:27 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 1 01:38:27 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 1 01:38:27 localhost systemd[1]: Reached target System Initialization.
Feb 1 01:38:27 localhost systemd[1]: Reached target Basic System.
Feb 1 01:38:27 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 1 01:38:27 localhost systemd[1]: Reached target Network.
Feb 1 01:38:27 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 1 01:38:27 localhost systemd[1]: Starting dracut initqueue hook...
Feb 1 01:38:27 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Feb 1 01:38:27 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 1 01:38:27 localhost kernel: GPT:20971519 != 838860799
Feb 1 01:38:27 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 1 01:38:27 localhost kernel: GPT:20971519 != 838860799
Feb 1 01:38:27 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 1 01:38:27 localhost kernel: vda: vda1 vda2 vda3 vda4
Feb 1 01:38:27 localhost kernel: scsi host0: ata_piix
Feb 1 01:38:27 localhost kernel: scsi host1: ata_piix
Feb 1 01:38:27 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Feb 1 01:38:27 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Feb 1 01:38:27 localhost systemd-udevd[429]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 01:38:28 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 1 01:38:28 localhost systemd[1]: Reached target Initrd Root Device.
Feb 1 01:38:28 localhost kernel: ata1: found unknown device (class 0)
Feb 1 01:38:28 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 1 01:38:28 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Feb 1 01:38:28 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 1 01:38:28 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 1 01:38:28 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 1 01:38:28 localhost systemd[1]: Finished dracut initqueue hook.
Feb 1 01:38:28 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 1 01:38:28 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 1 01:38:28 localhost systemd[1]: Reached target Remote File Systems.
Feb 1 01:38:28 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 1 01:38:28 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 1 01:38:28 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Feb 1 01:38:28 localhost systemd-fsck[510]: /usr/sbin/fsck.xfs: XFS file system.
Feb 1 01:38:28 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 1 01:38:28 localhost systemd[1]: Mounting /sysroot...
Feb 1 01:38:28 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 1 01:38:28 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Feb 1 01:38:28 localhost kernel: XFS (vda4): Ending clean mount
Feb 1 01:38:28 localhost systemd[1]: Mounted /sysroot.
Feb 1 01:38:28 localhost systemd[1]: Reached target Initrd Root File System.
Feb 1 01:38:28 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 1 01:38:28 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 1 01:38:28 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 1 01:38:28 localhost systemd[1]: Reached target Initrd File Systems.
Feb 1 01:38:28 localhost systemd[1]: Reached target Initrd Default Target.
Feb 1 01:38:28 localhost systemd[1]: Starting dracut mount hook...
Feb 1 01:38:28 localhost systemd[1]: Finished dracut mount hook.
Feb 1 01:38:28 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 1 01:38:28 localhost rpc.idmapd[412]: exiting on signal 15
Feb 1 01:38:28 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 1 01:38:28 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 1 01:38:28 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 1 01:38:28 localhost systemd[1]: Stopped target Network.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Timer Units.
Feb 1 01:38:28 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 1 01:38:28 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 1 01:38:28 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 1 01:38:28 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Basic System.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Path Units.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Remote File Systems.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Slice Units.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Socket Units.
Feb 1 01:38:28 localhost systemd[1]: Stopped target System Initialization.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Local File Systems.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Swaps.
Feb 1 01:38:28 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 1 01:38:28 localhost systemd[1]: Stopped dracut mount hook.
Feb 1 01:38:28 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 1 01:38:28 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 1 01:38:28 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 1 01:38:28 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 1 01:38:29 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 1 01:38:29 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 1 01:38:29 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 1 01:38:29 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 1 01:38:29 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 1 01:38:29 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 1 01:38:29 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 1 01:38:29 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 1 01:38:29 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 1 01:38:29 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 1 01:38:29 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Closed udev Control Socket.
Feb 1 01:38:29 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Closed udev Kernel Socket.
Feb 1 01:38:29 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 1 01:38:29 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 1 01:38:29 localhost systemd[1]: Starting Cleanup udev Database...
Feb 1 01:38:29 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 1 01:38:29 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 1 01:38:29 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Stopped Create System Users.
Feb 1 01:38:29 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 1 01:38:29 localhost systemd[1]: Finished Cleanup udev Database.
Feb 1 01:38:29 localhost systemd[1]: Reached target Switch Root.
Feb 1 01:38:29 localhost systemd[1]: Starting Switch Root...
Feb 1 01:38:29 localhost systemd[1]: Switching root.
Feb 1 01:38:29 localhost systemd-journald[283]: Journal stopped
Feb 1 01:38:30 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Feb 1 01:38:30 localhost kernel: audit: type=1404 audit(1769927909.217:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 1 01:38:30 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 1 01:38:30 localhost kernel: SELinux: policy capability open_perms=1
Feb 1 01:38:30 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 1 01:38:30 localhost kernel: SELinux: policy capability always_check_network=0
Feb 1 01:38:30 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 1 01:38:30 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 1 01:38:30 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 1 01:38:30 localhost kernel: audit: type=1403 audit(1769927909.358:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 1 01:38:30 localhost systemd[1]: Successfully loaded SELinux policy in 147.833ms.
Feb 1 01:38:30 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 33.737ms.
Feb 1 01:38:30 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 1 01:38:30 localhost systemd[1]: Detected virtualization kvm.
Feb 1 01:38:30 localhost systemd[1]: Detected architecture x86-64.
Feb 1 01:38:30 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 01:38:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 01:38:30 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 1 01:38:30 localhost systemd[1]: Stopped Switch Root.
Feb 1 01:38:30 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 1 01:38:30 localhost systemd[1]: Created slice Slice /system/getty.
Feb 1 01:38:30 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 1 01:38:30 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 1 01:38:30 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 1 01:38:30 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Feb 1 01:38:30 localhost systemd[1]: Created slice User and Session Slice.
Feb 1 01:38:30 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 1 01:38:30 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 1 01:38:30 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 1 01:38:30 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 1 01:38:30 localhost systemd[1]: Stopped target Switch Root.
Feb 1 01:38:30 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 1 01:38:30 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 1 01:38:30 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 1 01:38:30 localhost systemd[1]: Reached target Path Units.
Feb 1 01:38:30 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 1 01:38:30 localhost systemd[1]: Reached target Slice Units.
Feb 1 01:38:30 localhost systemd[1]: Reached target Swaps.
Feb 1 01:38:30 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 1 01:38:30 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 1 01:38:30 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 1 01:38:30 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 1 01:38:30 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 1 01:38:30 localhost systemd[1]: Listening on udev Control Socket.
Feb 1 01:38:30 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 1 01:38:30 localhost systemd[1]: Mounting Huge Pages File System...
Feb 1 01:38:30 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 1 01:38:30 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 1 01:38:30 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 1 01:38:30 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 1 01:38:30 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 1 01:38:30 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 1 01:38:30 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 1 01:38:30 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 1 01:38:30 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 1 01:38:30 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 1 01:38:30 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 1 01:38:30 localhost systemd[1]: Stopped Journal Service.
Feb 1 01:38:30 localhost systemd[1]: Starting Journal Service... Feb 1 01:38:30 localhost systemd[1]: Starting Load Kernel Modules... Feb 1 01:38:30 localhost kernel: fuse: init (API version 7.36) Feb 1 01:38:30 localhost systemd[1]: Starting Generate network units from Kernel command line... Feb 1 01:38:30 localhost systemd[1]: Starting Remount Root and Kernel File Systems... Feb 1 01:38:30 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met. Feb 1 01:38:30 localhost systemd[1]: Starting Coldplug All udev Devices... Feb 1 01:38:30 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff) Feb 1 01:38:30 localhost systemd[1]: Mounted Huge Pages File System. Feb 1 01:38:30 localhost systemd-journald[618]: Journal started Feb 1 01:38:30 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/00836dadc27b01f9fb0a211cca69e688) is 8.0M, max 314.7M, 306.7M free. Feb 1 01:38:30 localhost systemd[1]: Queued start job for default target Multi-User System. Feb 1 01:38:30 localhost systemd[1]: systemd-journald.service: Deactivated successfully. Feb 1 01:38:30 localhost systemd-modules-load[619]: Module 'msr' is built in Feb 1 01:38:30 localhost systemd[1]: Started Journal Service. Feb 1 01:38:30 localhost systemd[1]: Mounted POSIX Message Queue File System. Feb 1 01:38:30 localhost kernel: ACPI: bus type drm_connector registered Feb 1 01:38:30 localhost systemd[1]: Mounted Kernel Debug File System. Feb 1 01:38:30 localhost systemd[1]: Mounted Kernel Trace File System. Feb 1 01:38:30 localhost systemd[1]: Finished Create List of Static Device Nodes. Feb 1 01:38:30 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 1 01:38:30 localhost systemd[1]: Finished Load Kernel Module configfs. Feb 1 01:38:30 localhost systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 1 01:38:30 localhost systemd[1]: Finished Load Kernel Module drm. Feb 1 01:38:30 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 1 01:38:30 localhost systemd[1]: Finished Load Kernel Module fuse. Feb 1 01:38:30 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network. Feb 1 01:38:30 localhost systemd[1]: Finished Load Kernel Modules. Feb 1 01:38:30 localhost systemd[1]: Finished Generate network units from Kernel command line. Feb 1 01:38:30 localhost systemd[1]: Finished Remount Root and Kernel File Systems. Feb 1 01:38:30 localhost systemd[1]: Mounting FUSE Control File System... Feb 1 01:38:30 localhost systemd[1]: Mounting Kernel Configuration File System... Feb 1 01:38:30 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes). Feb 1 01:38:30 localhost systemd[1]: Starting Rebuild Hardware Database... Feb 1 01:38:30 localhost systemd[1]: Starting Flush Journal to Persistent Storage... Feb 1 01:38:30 localhost systemd[1]: Starting Load/Save Random Seed... Feb 1 01:38:30 localhost systemd[1]: Starting Apply Kernel Variables... Feb 1 01:38:30 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/00836dadc27b01f9fb0a211cca69e688) is 8.0M, max 314.7M, 306.7M free. Feb 1 01:38:30 localhost systemd-journald[618]: Received client request to flush runtime journal. Feb 1 01:38:30 localhost systemd[1]: Starting Create System Users... Feb 1 01:38:30 localhost systemd[1]: Mounted FUSE Control File System. Feb 1 01:38:30 localhost systemd[1]: Mounted Kernel Configuration File System. 
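[annotation] The journald lines above show the volatile journal sized from /run space (8.0M in use, 314.7M cap here) before being flushed to persistent storage. A minimal sketch for inspecting and bounding journal growth; the size values are illustrative:

    journalctl --disk-usage                 # total space the journals occupy
    # caps live in /etc/systemd/journald.conf:
    #   [Journal]
    #   RuntimeMaxUse=100M                  # volatile journal in /run
    #   SystemMaxUse=500M                   # persistent journal in /var/log/journal
    sudo systemctl restart systemd-journald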
Feb 1 01:38:30 localhost systemd[1]: Finished Flush Journal to Persistent Storage. Feb 1 01:38:30 localhost systemd[1]: Finished Load/Save Random Seed. Feb 1 01:38:30 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes). Feb 1 01:38:30 localhost systemd[1]: Finished Apply Kernel Variables. Feb 1 01:38:30 localhost systemd[1]: Finished Coldplug All udev Devices. Feb 1 01:38:30 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989. Feb 1 01:38:30 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988. Feb 1 01:38:30 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988. Feb 1 01:38:30 localhost systemd[1]: Finished Create System Users. Feb 1 01:38:30 localhost systemd[1]: Starting Create Static Device Nodes in /dev... Feb 1 01:38:30 localhost systemd[1]: Finished Create Static Device Nodes in /dev. Feb 1 01:38:30 localhost systemd[1]: Reached target Preparation for Local File Systems. Feb 1 01:38:30 localhost systemd[1]: Set up automount EFI System Partition Automount. Feb 1 01:38:30 localhost systemd[1]: Finished Rebuild Hardware Database. Feb 1 01:38:30 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files... Feb 1 01:38:30 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'. Feb 1 01:38:30 localhost systemd[1]: Started Rule-based Manager for Device Events and Files. Feb 1 01:38:30 localhost systemd[1]: Starting Load Kernel Module configfs... Feb 1 01:38:30 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 1 01:38:30 localhost systemd[1]: Finished Load Kernel Module configfs. Feb 1 01:38:30 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped. Feb 1 01:38:30 localhost systemd-udevd[644]: Network interface NamePolicy= disabled on kernel command line. Feb 1 01:38:30 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped. Feb 1 01:38:30 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7... Feb 1 01:38:30 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped. Feb 1 01:38:30 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6 Feb 1 01:38:30 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 Feb 1 01:38:30 localhost systemd-fsck[678]: fsck.fat 4.2 (2021-01-31) Feb 1 01:38:30 localhost systemd-fsck[678]: /dev/vda2: 12 files, 1782/51145 clusters Feb 1 01:38:30 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7. 
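[annotation] systemd-sysusers creates the sgx and systemd-oom accounts above declaratively from sysusers.d fragments. A minimal sketch of the one-line format that would yield the systemd-oom entry, assuming it mirrors the stock fragment (file name hypothetical):

    # /etc/sysusers.d/demo.conf  (format: type name uid "GECOS")
    #   u systemd-oom 988 "systemd Userspace OOM Killer"
    sudo systemd-sysusers           # apply all fragments immediately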
Feb 1 01:38:30 localhost kernel: SVM: TSC scaling supported Feb 1 01:38:30 localhost kernel: kvm: Nested Virtualization enabled Feb 1 01:38:30 localhost kernel: SVM: kvm: Nested Paging enabled Feb 1 01:38:30 localhost kernel: SVM: LBR virtualization supported Feb 1 01:38:30 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 Feb 1 01:38:30 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console Feb 1 01:38:30 localhost kernel: Console: switching to colour dummy device 80x25 Feb 1 01:38:30 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible Feb 1 01:38:30 localhost kernel: [drm] features: -context_init Feb 1 01:38:30 localhost kernel: [drm] number of scanouts: 1 Feb 1 01:38:30 localhost kernel: [drm] number of cap sets: 0 Feb 1 01:38:30 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0 Feb 1 01:38:30 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called Feb 1 01:38:30 localhost kernel: Console: switching to colour frame buffer device 128x48 Feb 1 01:38:30 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device Feb 1 01:38:31 localhost systemd[1]: Mounting /boot... Feb 1 01:38:31 localhost kernel: XFS (vda3): Mounting V5 Filesystem Feb 1 01:38:31 localhost kernel: XFS (vda3): Ending clean mount Feb 1 01:38:31 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff) Feb 1 01:38:31 localhost systemd[1]: Mounted /boot. Feb 1 01:38:31 localhost systemd[1]: Mounting /boot/efi... Feb 1 01:38:31 localhost systemd[1]: Mounted /boot/efi. Feb 1 01:38:31 localhost systemd[1]: Reached target Local File Systems. Feb 1 01:38:31 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache... Feb 1 01:38:31 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux). Feb 1 01:38:31 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 1 01:38:31 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 1 01:38:31 localhost systemd[1]: Starting Automatic Boot Loader Update... Feb 1 01:38:31 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id). Feb 1 01:38:31 localhost systemd[1]: Starting Create Volatile Files and Directories... Feb 1 01:38:31 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 716 (bootctl) Feb 1 01:38:31 localhost systemd[1]: Starting File System Check on /dev/vda2... Feb 1 01:38:31 localhost systemd[1]: Finished File System Check on /dev/vda2. Feb 1 01:38:31 localhost systemd[1]: Mounting EFI System Partition Automount... Feb 1 01:38:31 localhost systemd[1]: Mounted EFI System Partition Automount. Feb 1 01:38:31 localhost systemd[1]: Finished Automatic Boot Loader Update. Feb 1 01:38:31 localhost systemd[1]: Finished Create Volatile Files and Directories. Feb 1 01:38:31 localhost systemd[1]: Starting Security Auditing Service... Feb 1 01:38:31 localhost systemd[1]: Starting RPC Bind... Feb 1 01:38:31 localhost systemd[1]: Starting Rebuild Journal Catalog... 
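[annotation] The "supports timestamps until 2038" notices mean these XFS filesystems lack the bigtime feature. A minimal sketch for checking, and (commented out, newer xfsprogs only, filesystem unmounted) upgrading in place; /dev/vda3 is the boot device named in the log:

    xfs_info /boot | grep -o 'bigtime=[01]'    # 0 = timestamps capped at 2038
    # sudo xfs_admin -O bigtime=1 /dev/vda3    # verify xfsprogs support first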
Feb 1 01:38:31 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Feb 1 01:38:31 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Feb 1 01:38:31 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 1 01:38:31 localhost systemd[1]: Started RPC Bind.
Feb 1 01:38:31 localhost augenrules[730]: /sbin/augenrules: No change
Feb 1 01:38:31 localhost augenrules[740]: No rules
Feb 1 01:38:31 localhost augenrules[740]: enabled 1
Feb 1 01:38:31 localhost augenrules[740]: failure 1
Feb 1 01:38:31 localhost augenrules[740]: pid 725
Feb 1 01:38:31 localhost augenrules[740]: rate_limit 0
Feb 1 01:38:31 localhost augenrules[740]: backlog_limit 8192
Feb 1 01:38:31 localhost augenrules[740]: lost 0
Feb 1 01:38:31 localhost augenrules[740]: backlog 0
Feb 1 01:38:31 localhost augenrules[740]: backlog_wait_time 60000
Feb 1 01:38:31 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 1 01:38:31 localhost augenrules[740]: enabled 1
Feb 1 01:38:31 localhost augenrules[740]: failure 1
Feb 1 01:38:31 localhost augenrules[740]: pid 725
Feb 1 01:38:31 localhost augenrules[740]: rate_limit 0
Feb 1 01:38:31 localhost augenrules[740]: backlog_limit 8192
Feb 1 01:38:31 localhost augenrules[740]: lost 0
Feb 1 01:38:31 localhost augenrules[740]: backlog 4
Feb 1 01:38:31 localhost augenrules[740]: backlog_wait_time 60000
Feb 1 01:38:31 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 1 01:38:31 localhost augenrules[740]: enabled 1
Feb 1 01:38:31 localhost augenrules[740]: failure 1
Feb 1 01:38:31 localhost augenrules[740]: pid 725
Feb 1 01:38:31 localhost augenrules[740]: rate_limit 0
Feb 1 01:38:31 localhost augenrules[740]: backlog_limit 8192
Feb 1 01:38:31 localhost augenrules[740]: lost 0
Feb 1 01:38:31 localhost augenrules[740]: backlog 3
Feb 1 01:38:31 localhost augenrules[740]: backlog_wait_time 60000
Feb 1 01:38:31 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 1 01:38:31 localhost systemd[1]: Started Security Auditing Service.
Feb 1 01:38:31 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 1 01:38:31 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 1 01:38:31 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 1 01:38:31 localhost systemd[1]: Starting Update is Completed...
Feb 1 01:38:31 localhost systemd[1]: Finished Update is Completed.
Feb 1 01:38:31 localhost systemd[1]: Reached target System Initialization.
Feb 1 01:38:31 localhost systemd[1]: Started dnf makecache --timer.
Feb 1 01:38:31 localhost systemd[1]: Started Daily rotation of log files.
Feb 1 01:38:31 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 1 01:38:31 localhost systemd[1]: Reached target Timer Units.
Feb 1 01:38:31 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 1 01:38:31 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 1 01:38:31 localhost systemd[1]: Reached target Socket Units.
Feb 1 01:38:31 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Feb 1 01:38:31 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 1 01:38:31 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 1 01:38:31 localhost systemd[1]: Started D-Bus System Message Bus.
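[annotation] The repeated enabled/failure/pid/backlog block above is augenrules echoing the kernel audit status after each rule load. The same counters can be read back at any time; a minimal sketch:

    sudo auditctl -s    # enabled, failure, pid, rate_limit, backlog_limit, lost, backlog
    sudo auditctl -l    # loaded rules ("No rules" on this host)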
Feb 1 01:38:31 localhost systemd[1]: Reached target Basic System. Feb 1 01:38:31 localhost systemd[1]: Starting NTP client/server... Feb 1 01:38:31 localhost journal[750]: Ready Feb 1 01:38:31 localhost systemd[1]: Starting Restore /run/initramfs on shutdown... Feb 1 01:38:31 localhost systemd[1]: Started irqbalance daemon. Feb 1 01:38:31 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload). Feb 1 01:38:31 localhost systemd[1]: Starting System Logging Service... Feb 1 01:38:31 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 01:38:31 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 01:38:31 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 01:38:31 localhost systemd[1]: Reached target sshd-keygen.target. Feb 1 01:38:31 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met. Feb 1 01:38:31 localhost systemd[1]: Reached target User and Group Name Lookups. Feb 1 01:38:31 localhost systemd[1]: Starting User Login Management... Feb 1 01:38:31 localhost systemd[1]: Finished Restore /run/initramfs on shutdown. Feb 1 01:38:31 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start Feb 1 01:38:31 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ] Feb 1 01:38:31 localhost systemd[1]: Started System Logging Service. Feb 1 01:38:31 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 1 01:38:31 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data Feb 1 01:38:31 localhost chronyd[765]: Loaded seccomp filter (level 2) Feb 1 01:38:31 localhost systemd[1]: Started NTP client/server. Feb 1 01:38:31 localhost systemd-logind[759]: New seat seat0. Feb 1 01:38:31 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button) Feb 1 01:38:31 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Feb 1 01:38:31 localhost systemd[1]: Started User Login Management. Feb 1 01:38:31 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 01:38:32 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sun, 01 Feb 2026 06:38:32 +0000. Up 6.44 seconds. Feb 1 01:38:32 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpe5x1dt8m.mount: Deactivated successfully. Feb 1 01:38:32 localhost systemd[1]: Starting Hostname Service... Feb 1 01:38:32 localhost systemd[1]: Started Hostname Service. Feb 1 01:38:32 localhost systemd-hostnamed[783]: Hostname set to (static) Feb 1 01:38:32 localhost systemd[1]: Finished Initial cloud-init job (pre-networking). 
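[annotation] chronyd starts here but does not pick a time source until 01:38:37 (the "Selected source" line further down). A minimal sketch for verifying sync once it settles:

    chronyc tracking        # reference ID, stratum, current offset
    chronyc sources -v      # per-source reachability and selection markers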
Feb 1 01:38:32 localhost systemd[1]: Reached target Preparation for Network. Feb 1 01:38:32 localhost systemd[1]: Starting Network Manager... Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8243] NetworkManager (version 1.42.2-1.el9) is starting... (boot:762ec315-c166-4d10-86ed-775359ad616f) Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8248] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf) Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8284] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager" Feb 1 01:38:32 localhost systemd[1]: Started Network Manager. Feb 1 01:38:32 localhost systemd[1]: Reached target Network. Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8360] manager[0x55c66b3a7020]: monitoring kernel firmware directory '/lib/firmware'. Feb 1 01:38:32 localhost systemd[1]: Starting Network Manager Wait Online... Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8400] hostname: hostname: using hostnamed Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8400] hostname: static hostname changed from (none) to "np0005604212.novalocal" Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8410] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto) Feb 1 01:38:32 localhost systemd[1]: Starting GSSAPI Proxy Daemon... Feb 1 01:38:32 localhost systemd[1]: Starting Enable periodic update of entitlement certificates.... Feb 1 01:38:32 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 1 01:38:32 localhost systemd[1]: Started Enable periodic update of entitlement certificates.. Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8612] manager[0x55c66b3a7020]: rfkill: Wi-Fi hardware radio set enabled Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8612] manager[0x55c66b3a7020]: rfkill: WWAN hardware radio set enabled Feb 1 01:38:32 localhost systemd[1]: Started GSSAPI Proxy Daemon. Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8675] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so) Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8677] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8685] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8686] manager: Networking is enabled by state file Feb 1 01:38:32 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch. Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8729] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so") Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8730] settings: Loaded settings plugin: keyfile (internal) Feb 1 01:38:32 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab). Feb 1 01:38:32 localhost systemd[1]: Reached target NFS client services. Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8758] dhcp: init: Using DHCP client 'internal' Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8761] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1) Feb 1 01:38:32 localhost systemd[1]: Reached target Preparation for Remote File Systems. 
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8777] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8783] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8792] device (lo): Activation: starting connection 'lo' (c241cac5-6927-4960-bbd3-33bcbd73a371) Feb 1 01:38:32 localhost systemd[1]: Reached target Remote File Systems. Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8803] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2) Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8805] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external') Feb 1 01:38:32 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8839] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8843] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8844] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8846] device (eth0): carrier: link connected Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8848] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8853] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8860] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8864] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8865] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8867] manager: NetworkManager state is now CONNECTING Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8870] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8877] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed') Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8881] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Feb 1 01:38:32 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... 
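[annotation] The state-change lines above walk lo and eth0 through NetworkManager's activation pipeline (unmanaged -> unavailable -> disconnected -> prepare -> config -> ip-config -> ip-check -> secondaries -> activated). A minimal sketch for observing the same machinery on a live host:

    nmcli device status                      # current state per device
    nmcli -f GENERAL,IP4 device show eth0    # activation state plus the DHCP lease
    nmcli monitor                            # stream state changes as they happen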
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8928] dhcp4 (eth0): state changed new lease, address=38.102.83.179
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8931] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.8949] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Feb 1 01:38:32 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9060] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9062] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9066] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9073] device (lo): Activation: successful, device activated.
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9081] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9086] manager: NetworkManager state is now CONNECTED_SITE
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9090] device (eth0): Activation: successful, device activated.
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9096] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 1 01:38:32 localhost NetworkManager[788]: [1769927912.9101] manager: startup complete
Feb 1 01:38:32 localhost systemd[1]: Finished Network Manager Wait Online.
Feb 1 01:38:32 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Feb 1 01:38:33 localhost cloud-init[941]: Cloud-init v. 22.1-9.el9 running 'init' at Sun, 01 Feb 2026 06:38:33 +0000. Up 7.35 seconds.
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | eth0 | True | 38.102.83.179 | 255.255.255.0 | global | fa:16:3e:e4:df:f1 |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | eth0 | True | fe80::f816:3eff:fee4:dff1/64 | . | link | fa:16:3e:e4:df:f1 |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | lo | True | ::1/128 | . | host | . |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | 1 | fe80::/64 | :: | eth0 | U |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: | 3 | multicast | :: | eth0 | U |
Feb 1 01:38:33 localhost cloud-init[941]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 1 01:38:33 localhost systemd[1]: Starting Authorization Manager...
Feb 1 01:38:33 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Feb 1 01:38:33 localhost polkitd[1035]: Started polkitd version 0.117
Feb 1 01:38:33 localhost systemd[1]: Started Authorization Manager.
Feb 1 01:38:36 localhost cloud-init[941]: Generating public/private rsa key pair.
Feb 1 01:38:36 localhost cloud-init[941]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 1 01:38:36 localhost cloud-init[941]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 1 01:38:36 localhost cloud-init[941]: The key fingerprint is:
Feb 1 01:38:36 localhost cloud-init[941]: SHA256:8q7PzfCor/V0+THEST4k+YOSomcyzVF//LeT/WqTnhE root@np0005604212.novalocal
Feb 1 01:38:36 localhost cloud-init[941]: The key's randomart image is:
Feb 1 01:38:36 localhost cloud-init[941]: +---[RSA 3072]----+
Feb 1 01:38:36 localhost cloud-init[941]: | |
Feb 1 01:38:36 localhost cloud-init[941]: | . |
Feb 1 01:38:36 localhost cloud-init[941]: | . o o |
Feb 1 01:38:36 localhost cloud-init[941]: | . o X . |
Feb 1 01:38:36 localhost cloud-init[941]: | . S o o E |
Feb 1 01:38:36 localhost cloud-init[941]: | * o . + = |
Feb 1 01:38:36 localhost cloud-init[941]: | + O . o +.=|
Feb 1 01:38:36 localhost cloud-init[941]: | O O . .+B+|
Feb 1 01:38:36 localhost cloud-init[941]: | +** = o=++|
Feb 1 01:38:36 localhost cloud-init[941]: +----[SHA256]-----+
Feb 1 01:38:36 localhost cloud-init[941]: Generating public/private ecdsa key pair.
Feb 1 01:38:36 localhost cloud-init[941]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 1 01:38:36 localhost cloud-init[941]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 1 01:38:36 localhost cloud-init[941]: The key fingerprint is:
Feb 1 01:38:36 localhost cloud-init[941]: SHA256:bC9buMj9YjlyGtdde4Wp9sItDHmaZrrfVXosSxCuIsM root@np0005604212.novalocal
Feb 1 01:38:36 localhost cloud-init[941]: The key's randomart image is:
Feb 1 01:38:36 localhost cloud-init[941]: +---[ECDSA 256]---+
Feb 1 01:38:36 localhost cloud-init[941]: | |
Feb 1 01:38:36 localhost cloud-init[941]: | |
Feb 1 01:38:36 localhost cloud-init[941]: | . |
Feb 1 01:38:36 localhost cloud-init[941]: | . . . o |
Feb 1 01:38:36 localhost cloud-init[941]: | S .o + o|
Feb 1 01:38:36 localhost cloud-init[941]: | . . +oo.+ =.|
Feb 1 01:38:36 localhost cloud-init[941]: | E =.=B+.* +|
Feb 1 01:38:36 localhost cloud-init[941]: | ..B**=o=+.= |
Feb 1 01:38:36 localhost cloud-init[941]: | +=*Xo .oo |
Feb 1 01:38:36 localhost cloud-init[941]: +----[SHA256]-----+
Feb 1 01:38:36 localhost cloud-init[941]: Generating public/private ed25519 key pair.
Feb 1 01:38:36 localhost cloud-init[941]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 1 01:38:36 localhost cloud-init[941]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 1 01:38:36 localhost cloud-init[941]: The key fingerprint is:
Feb 1 01:38:36 localhost cloud-init[941]: SHA256:OHdw6WnYobc3jmhymlu2C7C1CyWMkMOe9IN8ZAaBf94 root@np0005604212.novalocal
Feb 1 01:38:36 localhost cloud-init[941]: The key's randomart image is:
Feb 1 01:38:36 localhost cloud-init[941]: +--[ED25519 256]--+
Feb 1 01:38:36 localhost cloud-init[941]: |.o. |
Feb 1 01:38:36 localhost cloud-init[941]: |o.. . |
Feb 1 01:38:36 localhost cloud-init[941]: |++ + . + |
Feb 1 01:38:36 localhost cloud-init[941]: |+oOo. . B o |
Feb 1 01:38:36 localhost cloud-init[941]: | =.*+.= S B |
Feb 1 01:38:36 localhost cloud-init[941]: | . o*E+ + . |
Feb 1 01:38:36 localhost cloud-init[941]: | o o o . o |
Feb 1 01:38:36 localhost cloud-init[941]: | ..*oo + . |
Feb 1 01:38:36 localhost cloud-init[941]: | =*+.. . |
Feb 1 01:38:36 localhost cloud-init[941]: +----[SHA256]-----+
Feb 1 01:38:36 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Feb 1 01:38:36 localhost systemd[1]: Reached target Cloud-config availability.
Feb 1 01:38:36 localhost systemd[1]: Reached target Network is Online.
Feb 1 01:38:36 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Feb 1 01:38:36 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Feb 1 01:38:36 localhost systemd[1]: Starting Crash recovery kernel arming...
Feb 1 01:38:36 localhost systemd[1]: Starting Notify NFS peers of a restart...
Feb 1 01:38:36 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 1 01:38:36 localhost systemd[1]: Starting Permit User Sessions...
Feb 1 01:38:36 localhost sm-notify[1131]: Version 2.5.4 starting
Feb 1 01:38:36 localhost systemd[1]: Started Notify NFS peers of a restart.
Feb 1 01:38:36 localhost systemd[1]: Finished Permit User Sessions.
Feb 1 01:38:36 localhost systemd[1]: Started Command Scheduler.
Feb 1 01:38:36 localhost sshd[1132]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 01:38:36 localhost systemd[1]: Started Getty on tty1.
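[annotation] cloud-init regenerates all three host key pairs on first boot; the fingerprints it prints here reappear in the console banner below and can be recomputed from the public key files. A minimal sketch:

    for k in /etc/ssh/ssh_host_*_key.pub; do ssh-keygen -lf "$k"; done
    ssh-keygen -lvf /etc/ssh/ssh_host_ed25519_key.pub   # -v adds the randomart image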
Feb 1 01:38:36 localhost systemd[1]: Started Serial Getty on ttyS0.
Feb 1 01:38:36 localhost systemd[1]: Reached target Login Prompts.
Feb 1 01:38:36 localhost systemd[1]: Started OpenSSH server daemon.
Feb 1 01:38:36 localhost systemd[1]: Reached target Multi-User System.
Feb 1 01:38:36 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 1 01:38:36 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 1 01:38:36 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 1 01:38:36 localhost kdumpctl[1135]: kdump: No kdump initial ramdisk found.
Feb 1 01:38:36 localhost kdumpctl[1135]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Feb 1 01:38:36 localhost cloud-init[1257]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sun, 01 Feb 2026 06:38:36 +0000. Up 10.54 seconds.
Feb 1 01:38:36 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Feb 1 01:38:36 localhost systemd[1]: Starting Execute cloud user/final scripts...
Feb 1 01:38:36 localhost dracut[1417]: dracut-057-21.git20230214.el9
Feb 1 01:38:36 localhost cloud-init[1435]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sun, 01 Feb 2026 06:38:36 +0000. Up 10.94 seconds.
Feb 1 01:38:36 localhost dracut[1419]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Feb 1 01:38:36 localhost cloud-init[1450]: #############################################################
Feb 1 01:38:36 localhost cloud-init[1455]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 1 01:38:36 localhost cloud-init[1461]: 256 SHA256:bC9buMj9YjlyGtdde4Wp9sItDHmaZrrfVXosSxCuIsM root@np0005604212.novalocal (ECDSA)
Feb 1 01:38:36 localhost cloud-init[1466]: 256 SHA256:OHdw6WnYobc3jmhymlu2C7C1CyWMkMOe9IN8ZAaBf94 root@np0005604212.novalocal (ED25519)
Feb 1 01:38:36 localhost cloud-init[1474]: 3072 SHA256:8q7PzfCor/V0+THEST4k+YOSomcyzVF//LeT/WqTnhE root@np0005604212.novalocal (RSA)
Feb 1 01:38:36 localhost cloud-init[1478]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 1 01:38:36 localhost cloud-init[1483]: #############################################################
Feb 1 01:38:36 localhost cloud-init[1435]: Cloud-init v. 22.1-9.el9 finished at Sun, 01 Feb 2026 06:38:36 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 11.16 seconds
Feb 1 01:38:37 localhost systemd[1]: Reloading Network Manager...
Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
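[annotation] kdump builds its own minimal initramfs with dracut in strict hostonly mode, pinning the root filesystem via --mount and dropping modules the crash kernel cannot use (-o "plymouth resume ifcfg earlykdump"). A minimal sketch for checking the result and forcing the same rebuild by hand (RHEL 9 kdumpctl; confirm the subcommands on your version):

    sudo kdumpctl status                    # is the crash kernel loaded?
    sudo kdumpctl showmem                   # size of the crashkernel reservation
    sudo touch /etc/kdump.conf && sudo systemctl restart kdump   # forces a rebuild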
Feb 1 01:38:37 localhost NetworkManager[788]: [1769927917.0466] audit: op="reload" arg="0" pid=1553 uid=0 result="success" Feb 1 01:38:37 localhost NetworkManager[788]: [1769927917.0474] config: signal: SIGHUP (no changes from disk) Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found! Feb 1 01:38:37 localhost systemd[1]: Reloaded Network Manager. Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found! Feb 1 01:38:37 localhost systemd[1]: Finished Execute cloud user/final scripts. Feb 1 01:38:37 localhost systemd[1]: Reached target Cloud-init target. Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'connman' will not be installed, because command 'connmand' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found! Feb 1 01:38:37 localhost dracut[1419]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found! 
Feb 1 01:38:37 localhost dracut[1419]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found! Feb 1 01:38:37 localhost dracut[1419]: memstrack is not available Feb 1 01:38:37 localhost dracut[1419]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'connman' will not be installed, because command 'connmand' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found! Feb 1 01:38:37 localhost dracut[1419]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found! 
Feb 1 01:38:37 localhost dracut[1419]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found! Feb 1 01:38:37 localhost chronyd[765]: Selected source 138.197.164.54 (2.rhel.pool.ntp.org) Feb 1 01:38:37 localhost chronyd[765]: System clock TAI offset set to 37 seconds Feb 1 01:38:37 localhost dracut[1419]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found! Feb 1 01:38:37 localhost dracut[1419]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found! Feb 1 01:38:37 localhost dracut[1419]: memstrack is not available Feb 1 01:38:37 localhost dracut[1419]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng Feb 1 01:38:38 localhost dracut[1419]: *** Including module: systemd *** Feb 1 01:38:38 localhost dracut[1419]: *** Including module: systemd-initrd *** Feb 1 01:38:38 localhost dracut[1419]: *** Including module: i18n *** Feb 1 01:38:38 localhost dracut[1419]: No KEYMAP configured. Feb 1 01:38:38 localhost dracut[1419]: *** Including module: drm *** Feb 1 01:38:38 localhost dracut[1419]: *** Including module: prefixdevname *** Feb 1 01:38:38 localhost dracut[1419]: *** Including module: kernel-modules *** Feb 1 01:38:39 localhost dracut[1419]: *** Including module: kernel-modules-extra *** Feb 1 01:38:39 localhost dracut[1419]: *** Including module: qemu *** Feb 1 01:38:39 localhost dracut[1419]: *** Including module: fstab-sys *** Feb 1 01:38:39 localhost dracut[1419]: *** Including module: rootfs-block *** Feb 1 01:38:39 localhost dracut[1419]: *** Including module: terminfo *** Feb 1 01:38:39 localhost dracut[1419]: *** Including module: udev-rules *** Feb 1 01:38:40 localhost dracut[1419]: Skipping udev rule: 91-permissions.rules Feb 1 01:38:40 localhost dracut[1419]: Skipping udev rule: 80-drivers-modprobe.rules Feb 1 01:38:40 localhost dracut[1419]: *** Including module: virtiofs *** Feb 1 01:38:40 localhost dracut[1419]: *** Including module: dracut-systemd *** Feb 1 01:38:40 localhost dracut[1419]: *** Including module: usrmount *** Feb 1 01:38:40 localhost dracut[1419]: *** Including module: base *** Feb 1 01:38:40 localhost dracut[1419]: *** Including module: fs-lib *** Feb 1 01:38:40 localhost dracut[1419]: *** Including module: kdumpbase *** Feb 1 01:38:40 localhost dracut[1419]: *** Including module: microcode_ctl-fw_dir_override *** Feb 1 01:38:40 localhost dracut[1419]: microcode_ctl module: mangling fw_dir Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"... 
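[annotation] Every module dracut reports as "will not be installed" simply failed its binary dependency probe; the "Including module" lines that follow are the ones that passed. A minimal sketch for listing what this dracut build could include at all:

    dracut --list-modules | head    # all modules known to this dracut installation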
Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-2d-07" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-4e-03" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-4f-01" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-55-04" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-5e-03" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-8c-01" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"... Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored Feb 1 01:38:41 localhost dracut[1419]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware" Feb 1 01:38:41 localhost dracut[1419]: *** Including module: shutdown *** Feb 1 01:38:41 localhost dracut[1419]: *** Including module: squash *** Feb 1 01:38:41 localhost dracut[1419]: *** Including modules done *** Feb 1 01:38:41 localhost dracut[1419]: *** Installing kernel module dependencies *** Feb 1 01:38:42 localhost dracut[1419]: *** Installing kernel module dependencies done *** Feb 1 01:38:42 localhost dracut[1419]: *** Resolving executable dependencies *** Feb 1 01:38:43 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Feb 1 01:38:43 localhost dracut[1419]: *** Resolving executable dependencies done *** Feb 1 01:38:43 localhost dracut[1419]: *** Hardlinking files *** Feb 1 01:38:43 localhost dracut[1419]: Mode: real Feb 1 01:38:43 localhost dracut[1419]: Files: 1099 Feb 1 01:38:43 localhost dracut[1419]: Linked: 3 files Feb 1 01:38:43 localhost dracut[1419]: Compared: 0 xattrs Feb 1 01:38:43 localhost dracut[1419]: Compared: 373 files Feb 1 01:38:43 localhost dracut[1419]: Saved: 61.04 KiB Feb 1 01:38:43 localhost dracut[1419]: Duration: 0.048657 seconds Feb 1 01:38:43 localhost dracut[1419]: *** Hardlinking files done *** Feb 1 01:38:43 localhost dracut[1419]: Could not find 'strip'. Not stripping the initramfs. 
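[annotation] Once dracut writes the image it can be inspected without booting into it. A minimal sketch using lsinitrd from the dracut package (image path copied from the log):

    lsinitrd /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img | head -n 25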
Feb 1 01:38:43 localhost dracut[1419]: *** Generating early-microcode cpio image *** Feb 1 01:38:43 localhost dracut[1419]: *** Constructing AuthenticAMD.bin *** Feb 1 01:38:43 localhost dracut[1419]: *** Store current command line parameters *** Feb 1 01:38:43 localhost dracut[1419]: Stored kernel commandline: Feb 1 01:38:43 localhost dracut[1419]: No dracut internal kernel commandline stored in the initramfs Feb 1 01:38:44 localhost dracut[1419]: *** Install squash loader *** Feb 1 01:38:44 localhost sshd[3956]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3958]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3960]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3962]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3964]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3966]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3968]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3970]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost sshd[3972]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:38:44 localhost dracut[1419]: *** Squashing the files inside the initramfs *** Feb 1 01:38:45 localhost dracut[1419]: *** Squashing the files inside the initramfs done *** Feb 1 01:38:45 localhost dracut[1419]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' *** Feb 1 01:38:45 localhost dracut[1419]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done *** Feb 1 01:38:46 localhost kdumpctl[1135]: kdump: kexec: loaded kdump kernel Feb 1 01:38:46 localhost kdumpctl[1135]: kdump: Starting kdump: [OK] Feb 1 01:38:46 localhost systemd[1]: Finished Crash recovery kernel arming. Feb 1 01:38:46 localhost systemd[1]: Startup finished in 1.213s (kernel) + 2.215s (initrd) + 17.175s (userspace) = 20.603s. Feb 1 01:39:02 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Feb 1 01:39:07 localhost sshd[4175]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:39:07 localhost systemd[1]: Created slice User Slice of UID 1000. Feb 1 01:39:07 localhost systemd[1]: Starting User Runtime Directory /run/user/1000... Feb 1 01:39:07 localhost systemd-logind[759]: New session 1 of user zuul. Feb 1 01:39:07 localhost systemd[1]: Finished User Runtime Directory /run/user/1000. Feb 1 01:39:07 localhost systemd[1]: Starting User Manager for UID 1000... Feb 1 01:39:07 localhost systemd[4179]: Queued start job for default target Main User Target. Feb 1 01:39:07 localhost systemd[4179]: Created slice User Application Slice. Feb 1 01:39:07 localhost systemd[4179]: Started Mark boot as successful after the user session has run 2 minutes. Feb 1 01:39:07 localhost systemd[4179]: Started Daily Cleanup of User's Temporary Directories. Feb 1 01:39:07 localhost systemd[4179]: Reached target Paths. Feb 1 01:39:07 localhost systemd[4179]: Reached target Timers. Feb 1 01:39:07 localhost systemd[4179]: Starting D-Bus User Message Bus Socket... Feb 1 01:39:07 localhost systemd[4179]: Starting Create User's Volatile Files and Directories... Feb 1 01:39:07 localhost systemd[4179]: Finished Create User's Volatile Files and Directories. Feb 1 01:39:07 localhost systemd[4179]: Listening on D-Bus User Message Bus Socket. Feb 1 01:39:07 localhost systemd[4179]: Reached target Sockets. Feb 1 01:39:07 localhost systemd[4179]: Reached target Basic System. 
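[annotation] The "Startup finished in 1.213s (kernel) + 2.215s (initrd) + 17.175s (userspace)" line above is the same accounting systemd-analyze exposes after boot, broken down per unit. A minimal sketch:

    systemd-analyze                 # kernel/initrd/userspace split
    systemd-analyze blame           # units sorted by start-up time
    systemd-analyze critical-chain multi-user.target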
Feb 1 01:39:07 localhost systemd[4179]: Reached target Main User Target. Feb 1 01:39:07 localhost systemd[4179]: Startup finished in 120ms. Feb 1 01:39:07 localhost systemd[1]: Started User Manager for UID 1000. Feb 1 01:39:07 localhost systemd[1]: Started Session 1 of User zuul. Feb 1 01:39:07 localhost python3[4231]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 01:39:18 localhost python3[4250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 01:39:27 localhost python3[4303]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 01:39:28 localhost python3[4333]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present Feb 1 01:39:31 localhost python3[4349]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:39:32 localhost python3[4363]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:33 localhost python3[4422]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:39:34 localhost python3[4463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769927973.42375-389-227305056364326/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa follow=False checksum=1450e921e2d17379ea725f99be2eea1fb6e75a52 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:35 localhost python3[4536]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:39:35 localhost python3[4577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769927975.0838637-488-155488517244478/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa.pub follow=False checksum=ad19e951a009809a91d74da158b058ce7df88458 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:37 localhost python3[4605]: ansible-ping Invoked with data=pong Feb 1 01:39:39 localhost python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 01:39:43 localhost python3[4672]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None Feb 1 01:39:43 localhost chronyd[765]: Selected source 209.227.173.244 (2.rhel.pool.ntp.org) Feb 1 01:39:45 localhost python3[4694]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:46 localhost python3[4708]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:46 localhost python3[4722]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:47 localhost python3[4736]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:47 localhost python3[4750]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:47 localhost python3[4764]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:50 localhost python3[4780]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:51 localhost python3[4829]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:39:52 localhost python3[4872]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769927991.6301312-99-110700840243962/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:39:59 localhost python3[4900]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:39:59 localhost python3[4914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:00 localhost python3[4928]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:00 localhost python3[4942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:00 localhost python3[4956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:01 localhost python3[4970]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:01 localhost python3[4984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:01 localhost python3[4998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:01 localhost python3[5012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:02 localhost python3[5026]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:02 localhost python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:02 localhost python3[5054]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:02 localhost python3[5068]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW 
rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:03 localhost python3[5082]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:03 localhost python3[5096]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:03 localhost python3[5110]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:03 localhost python3[5124]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:04 localhost python3[5138]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:04 localhost python3[5152]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:04 localhost python3[5166]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:05 localhost python3[5180]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW 
amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:05 localhost python3[5194]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:05 localhost python3[5208]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:05 localhost python3[5222]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:06 localhost python3[5236]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:06 localhost python3[5250]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:40:07 localhost python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Feb 1 01:40:07 localhost systemd[1]: Starting Time & Date Service... Feb 1 01:40:07 localhost systemd[1]: Started Time & Date Service. Feb 1 01:40:07 localhost systemd-timedated[5268]: Changed time zone to 'UTC' (UTC). 
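The long run of ansible-authorized_key calls above installs one public key per team member into the zuul user's authorized_keys, changing nothing when a key is already present. A minimal sketch of that idempotent append, assuming a conventional ~/.ssh layout; the helper name and usage are illustrative, not the module's actual implementation:

    from pathlib import Path

    def ensure_authorized_key(home: Path, key_line: str) -> bool:
        """Append one public key to <home>/.ssh/authorized_keys unless it
        is already present; return True only when the file changed."""
        ssh_dir = home / ".ssh"
        ssh_dir.mkdir(mode=0o700, exist_ok=True)
        auth = ssh_dir / "authorized_keys"
        existing = auth.read_text().splitlines() if auth.exists() else []
        if key_line.strip() in (line.strip() for line in existing):
            return False
        with auth.open("a") as fh:
            fh.write(key_line.rstrip("\n") + "\n")
        auth.chmod(0o600)
        return True

    # Hypothetical usage with a placeholder key:
    # ensure_authorized_key(Path("/home/zuul"), "ssh-ed25519 AAAA... user@host")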
Feb 1 01:40:08 localhost python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:40:10 localhost python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:40:10 localhost python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769928009.8819413-490-61673796399365/source _original_basename=tmpj0frdk1g follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:40:11 localhost python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:40:11 localhost python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769928011.3192186-579-194702495423275/source _original_basename=tmpg0fd3hri follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:40:13 localhost python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:40:13 localhost python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769928013.3922174-726-197013970521627/source _original_basename=tmpyz11plri follow=False checksum=4533a6af5c84c28dd874a752186ca59a7a5dd951 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:40:15 localhost python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:40:15 localhost python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:40:16 localhost python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True 
get_attributes=True Feb 1 01:40:16 localhost python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769928016.3768752-852-219156061955583/source _original_basename=tmpna2dlvj7 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:40:18 localhost python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3a26-ae0f-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:40:19 localhost python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-3a26-ae0f-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None Feb 1 01:40:21 localhost python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:40:37 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Feb 1 01:40:39 localhost python3[5803]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:41:19 localhost systemd[4179]: Starting Mark boot as successful... Feb 1 01:41:19 localhost systemd[4179]: Finished Mark boot as successful. Feb 1 01:41:39 localhost systemd-logind[759]: Session 1 logged out. Waiting for processes to exit. Feb 1 01:42:32 localhost systemd[1]: Unmounting EFI System Partition Automount... Feb 1 01:42:32 localhost systemd[1]: efi.mount: Deactivated successfully. Feb 1 01:42:32 localhost systemd[1]: Unmounted EFI System Partition Automount. 
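The file modes these modules log are decimal renderings of octal permissions: mode=288 is 0o440 (the usual sudoers permission), 420 is 0o644, 493 is 0o755, and 448 is 0o700. The copy of /etc/sudoers.d/zuul-sudo-grep followed by /usr/sbin/visudo -c above is the standard validate-before-install pattern; a sketch under those assumptions, with an illustrative rule body that is not taken from the log:

    import os
    import subprocess
    import tempfile

    CONTENT = "zuul ALL=(ALL) NOPASSWD:ALL\n"  # illustrative rule, not from the log

    def install_sudoers_dropin(dest: str, content: str) -> None:
        # Write to a temp file in the target directory, validate it with
        # `visudo -c -f`, then atomically swap it into place.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest))
        try:
            with os.fdopen(fd, "w") as fh:
                fh.write(content)
            os.chmod(tmp, 0o440)  # matches mode=288 in the log
            subprocess.run(["visudo", "-c", "-f", tmp], check=True)
            os.replace(tmp, dest)
        except Exception:
            os.unlink(tmp)
            raise

    # install_sudoers_dropin("/etc/sudoers.d/zuul-sudo-grep", CONTENT)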
Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff] Feb 1 01:43:52 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f] Feb 1 01:43:52 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003) Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.7807] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3) Feb 1 01:43:52 localhost systemd-udevd[5810]: Network interface NamePolicy= disabled on kernel command line. Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.7957] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external') Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.7998] settings: (eth1): created default wired connection 'Wired connection 1' Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8004] device (eth1): carrier: link connected Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8007] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed') Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8013] policy: auto-activating connection 'Wired connection 1' (be887e2a-bb90-3fda-8809-c6afed1ca254) Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8021] device (eth1): Activation: starting connection 'Wired connection 1' (be887e2a-bb90-3fda-8809-c6afed1ca254) Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8022] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed') Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8026] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed') Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8031] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed') Feb 1 01:43:52 localhost NetworkManager[788]: [1769928232.8035] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Feb 1 01:43:53 localhost sshd[5812]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:43:53 localhost systemd-logind[759]: New session 3 of user zuul. Feb 1 01:43:53 localhost systemd[1]: Started Session 3 of User zuul. 
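The hot-plugged virtio NIC above comes up as eth1, NetworkManager auto-activates a default wired profile for it, and the playbook immediately inspects the links with ip -j link, whose -j flag emits JSON. A sketch of consuming that output, assuming iproute2 is installed; the helper name is made up:

    import json
    import subprocess

    def link_states() -> dict:
        """Map interface names to operational state via `ip -j link`."""
        out = subprocess.run(["ip", "-j", "link"], capture_output=True,
                             check=True, text=True).stdout
        return {link["ifname"]: link.get("operstate", "UNKNOWN")
                for link in json.loads(out)}

    # Example output on a host like this one might be:
    # {'lo': 'UNKNOWN', 'eth0': 'UP', 'eth1': 'UP'}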
Feb 1 01:43:53 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready Feb 1 01:43:53 localhost python3[5829]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-9afb-5883-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:44:07 localhost python3[5880]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:44:07 localhost python3[5923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769928246.7481887-435-119796072997018/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=18c06c140c733f4f21d51ba0d7d6851722a31885 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:44:08 localhost python3[5953]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 01:44:09 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully. Feb 1 01:44:09 localhost systemd[1]: Stopped Network Manager Wait Online. Feb 1 01:44:09 localhost systemd[1]: Stopping Network Manager Wait Online... Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0473] caught SIGTERM, shutting down normally. Feb 1 01:44:09 localhost systemd[1]: Stopping Network Manager... Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0680] dhcp4 (eth0): canceled DHCP transaction Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0680] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0680] dhcp4 (eth0): state changed no lease Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0684] manager: NetworkManager state is now CONNECTING Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0750] dhcp4 (eth1): canceled DHCP transaction Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0751] dhcp4 (eth1): state changed no lease Feb 1 01:44:09 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Feb 1 01:44:09 localhost NetworkManager[788]: [1769928249.0824] exiting (success) Feb 1 01:44:09 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Feb 1 01:44:09 localhost systemd[1]: NetworkManager.service: Deactivated successfully. Feb 1 01:44:09 localhost systemd[1]: Stopped Network Manager. Feb 1 01:44:09 localhost systemd[1]: NetworkManager.service: Consumed 2.706s CPU time. Feb 1 01:44:09 localhost systemd[1]: Starting Network Manager... Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.1357] NetworkManager (version 1.42.2-1.el9) is starting... 
(after a restart, boot:762ec315-c166-4d10-86ed-775359ad616f) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.1363] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf) Feb 1 01:44:09 localhost systemd[1]: Started Network Manager. Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.1390] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager" Feb 1 01:44:09 localhost systemd[1]: Starting Network Manager Wait Online... Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.1443] manager[0x55c423151090]: monitoring kernel firmware directory '/lib/firmware'. Feb 1 01:44:09 localhost systemd[1]: Starting Hostname Service... Feb 1 01:44:09 localhost systemd[1]: Started Hostname Service. Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2428] hostname: hostname: using hostnamed Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2429] hostname: static hostname changed from (none) to "np0005604212.novalocal" Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2441] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2451] manager[0x55c423151090]: rfkill: Wi-Fi hardware radio set enabled Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2451] manager[0x55c423151090]: rfkill: WWAN hardware radio set enabled Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2498] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2500] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2502] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2504] manager: Networking is enabled by state file Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2514] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so") Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2514] settings: Loaded settings plugin: keyfile (internal) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2584] dhcp: init: Using DHCP client 'internal' Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2590] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2601] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2610] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2623] device (lo): Activation: starting connection 'lo' (c241cac5-6927-4960-bbd3-33bcbd73a371) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2633] device (eth0): carrier: link connected Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2640] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2648] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2649] device (eth0): state change: 
unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2660] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2671] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2678] device (eth1): carrier: link connected Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2684] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2692] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (be887e2a-bb90-3fda-8809-c6afed1ca254) (indicated) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2693] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2701] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2711] device (eth1): Activation: starting connection 'Wired connection 1' (be887e2a-bb90-3fda-8809-c6afed1ca254) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2744] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2748] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2752] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2754] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2758] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2761] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2764] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2767] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2774] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2778] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2788] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2792] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2831] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2842] device (lo): state change: secondaries -> activated (reason 'none', 
sys-iface-state: 'external') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2848] device (lo): Activation: successful, device activated. Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2858] dhcp4 (eth0): state changed new lease, address=38.102.83.179 Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2866] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.2990] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.3030] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.3032] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.3036] manager: NetworkManager state is now CONNECTED_SITE Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.3041] device (eth0): Activation: successful, device activated. Feb 1 01:44:09 localhost NetworkManager[5964]: [1769928249.3046] manager: NetworkManager state is now CONNECTED_GLOBAL Feb 1 01:44:09 localhost python3[6022]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-9afb-5883-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:44:19 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Feb 1 01:44:19 localhost systemd[4179]: Created slice User Background Tasks Slice. Feb 1 01:44:19 localhost systemd[4179]: Starting Cleanup of User's Temporary Files and Directories... Feb 1 01:44:19 localhost systemd[4179]: Finished Cleanup of User's Temporary Files and Directories. Feb 1 01:44:39 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Feb 1 01:44:54 localhost NetworkManager[5964]: [1769928294.7814] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:54 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Feb 1 01:44:54 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Feb 1 01:44:54 localhost NetworkManager[5964]: [1769928294.8019] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:54 localhost NetworkManager[5964]: [1769928294.8022] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume') Feb 1 01:44:54 localhost NetworkManager[5964]: [1769928294.8028] device (eth1): Activation: successful, device activated. Feb 1 01:44:54 localhost NetworkManager[5964]: [1769928294.8034] manager: startup complete Feb 1 01:44:54 localhost systemd[1]: Finished Network Manager Wait Online. Feb 1 01:45:04 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Feb 1 01:45:09 localhost systemd[1]: session-3.scope: Deactivated successfully. Feb 1 01:45:09 localhost systemd[1]: session-3.scope: Consumed 1.552s CPU time. Feb 1 01:45:09 localhost systemd-logind[759]: Session 3 logged out. Waiting for processes to exit. Feb 1 01:45:09 localhost systemd-logind[759]: Removed session 3. 
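The ci-private-network profile copied above is a NetworkManager keyfile, an INI-style format; restarting NetworkManager then re-assumes lo, eth0, and eth1 through the state transitions that follow. A minimal sketch of rendering such a keyfile, where every value is a placeholder rather than the real profile contents:

    import configparser
    import io
    import uuid

    def render_nmconnection(iface: str, conn_id: str) -> str:
        cp = configparser.ConfigParser()
        cp["connection"] = {"id": conn_id, "uuid": str(uuid.uuid4()),
                            "type": "ethernet", "interface-name": iface}
        cp["ipv4"] = {"method": "auto"}
        cp["ipv6"] = {"method": "ignore"}
        buf = io.StringIO()
        cp.write(buf, space_around_delimiters=False)
        return buf.getvalue()

    print(render_nmconnection("eth1", "ci-private-network"))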
Feb 1 01:46:23 localhost sshd[6057]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:46:23 localhost systemd-logind[759]: New session 4 of user zuul. Feb 1 01:46:23 localhost systemd[1]: Started Session 4 of User zuul. Feb 1 01:46:24 localhost python3[6108]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:46:24 localhost python3[6151]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769928383.8253381-628-170211637031227/source _original_basename=tmpbvpoe5f3 follow=False checksum=b662c6ad0fdede3f6b8f2737681b36760d23a74b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:46:29 localhost systemd[1]: session-4.scope: Deactivated successfully. Feb 1 01:46:29 localhost systemd-logind[759]: Session 4 logged out. Waiting for processes to exit. Feb 1 01:46:29 localhost systemd-logind[759]: Removed session 4. Feb 1 01:54:05 localhost sshd[6169]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:54:05 localhost systemd[1]: Starting Cleanup of Temporary Directories... Feb 1 01:54:05 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. Feb 1 01:54:05 localhost systemd[1]: Finished Cleanup of Temporary Directories. Feb 1 01:54:05 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully. Feb 1 01:54:05 localhost systemd-logind[759]: New session 5 of user zuul. Feb 1 01:54:05 localhost systemd[1]: Started Session 5 of User zuul. 
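Each ansible.legacy.stat / copy pair above is a checksum gate: the destination is rewritten only when its SHA-1 differs from the source's (and force=False additionally skips files that already exist). A condensed sketch of that comparison with illustrative paths:

    import hashlib
    from pathlib import Path

    def needs_copy(src: Path, dest: Path) -> bool:
        """True when dest is absent or its SHA-1 differs from src's."""
        def sha1(p: Path) -> str:
            return hashlib.sha1(p.read_bytes()).hexdigest()
        return not dest.exists() or sha1(src) != sha1(dest)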
Feb 1 01:54:06 localhost python3[6191]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-ef6d-83b0-0000000021a5-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:54:08 localhost python3[6210]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:54:08 localhost python3[6226]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:54:08 localhost python3[6242]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:54:08 localhost python3[6258]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:54:09 localhost python3[6274]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:54:10 localhost python3[6322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:54:11 localhost python3[6365]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769928850.6132953-662-47372208488758/source _original_basename=tmp4cfszsl9 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:54:12 localhost python3[6395]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False 
name=None state=None enabled=None force=None masked=None Feb 1 01:54:12 localhost systemd[1]: Reloading. Feb 1 01:54:12 localhost systemd-rc-local-generator[6413]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 01:54:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 01:54:14 localhost python3[6442]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None Feb 1 01:54:15 localhost python3[6458]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:54:16 localhost python3[6476]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:54:16 localhost python3[6494]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:54:16 localhost python3[6512]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:54:17 localhost python3[6529]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-ef6d-83b0-0000000021ac-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:54:18 localhost python3[6549]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 01:54:21 localhost systemd[1]: session-5.scope: Deactivated successfully. Feb 1 01:54:21 localhost systemd[1]: session-5.scope: Consumed 3.965s CPU time. Feb 1 01:54:21 localhost systemd-logind[759]: Session 5 logged out. Waiting for processes to exit. Feb 1 01:54:21 localhost systemd-logind[759]: Removed session 5. 
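The cgroup-v2 step above resolves the root disk's MAJ:MIN pair with lsblk (252:0 here, a virtio disk) and writes identical io.max throttles into each top-level slice, after waiting for system.slice/io.max to appear. The same sequence as a sketch; the limits are copied from the log, and root privileges plus a unified cgroup hierarchy are assumed:

    import subprocess
    from pathlib import Path

    LIMITS = "riops=18000 wiops=18000 rbps=262144000 wbps=262144000"
    SLICES = ["init.scope", "machine.slice", "system.slice", "user.slice"]

    def throttle(device: str = "/dev/vda") -> None:
        # lsblk -nd -o MAJ:MIN prints e.g. "252:0" for the whole disk.
        majmin = subprocess.run(["lsblk", "-nd", "-o", "MAJ:MIN", device],
                                capture_output=True, check=True,
                                text=True).stdout.strip()
        for slice_ in SLICES:
            Path("/sys/fs/cgroup", slice_, "io.max").write_text(
                f"{majmin} {LIMITS}\n")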
Feb 1 01:54:48 localhost chronyd[765]: Selected source 138.197.164.54 (2.rhel.pool.ntp.org) Feb 1 01:55:17 localhost sshd[6555]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:55:18 localhost systemd-logind[759]: New session 6 of user zuul. Feb 1 01:55:18 localhost systemd[1]: Started Session 6 of User zuul. Feb 1 01:55:18 localhost systemd[1]: Starting RHSM dbus service... Feb 1 01:55:19 localhost systemd[1]: Started RHSM dbus service. Feb 1 01:55:19 localhost rhsm-service[6579]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Feb 1 01:55:19 localhost rhsm-service[6579]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Feb 1 01:55:19 localhost rhsm-service[6579]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Feb 1 01:55:19 localhost rhsm-service[6579]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Feb 1 01:55:21 localhost rhsm-service[6579]: INFO [subscription_manager.managerlib:90] Consumer created: np0005604212.novalocal (7a89b532-8c8f-4d02-bd1e-9a3ae674e86b) Feb 1 01:55:21 localhost subscription-manager[6579]: Registered system with identity: 7a89b532-8c8f-4d02-bd1e-9a3ae674e86b Feb 1 01:55:21 localhost rhsm-service[6579]: INFO [subscription_manager.entcertlib:131] certs updated: Feb 1 01:55:21 localhost rhsm-service[6579]: Total updates: 1 Feb 1 01:55:21 localhost rhsm-service[6579]: Found (local) serial# [] Feb 1 01:55:21 localhost rhsm-service[6579]: Expected (UEP) serial# [8213686551828012518] Feb 1 01:55:21 localhost rhsm-service[6579]: Added (new) Feb 1 01:55:21 localhost rhsm-service[6579]: [sn:8213686551828012518 ( Content Access,) @ /etc/pki/entitlement/8213686551828012518.pem] Feb 1 01:55:21 localhost rhsm-service[6579]: Deleted (rogue): Feb 1 01:55:21 localhost rhsm-service[6579]: Feb 1 01:55:21 localhost subscription-manager[6579]: Added subscription for 'Content Access' contract 'None' Feb 1 01:55:21 localhost subscription-manager[6579]: Added subscription for product ' Content Access' Feb 1 01:55:23 localhost rhsm-service[6579]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm' Feb 1 01:55:23 localhost rhsm-service[6579]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm' Feb 1 01:55:23 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 01:55:23 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 01:55:23 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 01:55:23 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 01:55:23 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
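The registration above yields consumer UUID 7a89b532-8c8f-4d02-bd1e-9a3ae674e86b; the same identity can be read back from the CLI. A sketch that shells out to subscription-manager identity (root required); the "system identity:" label parsed here is an assumption about that tool's output format:

    import subprocess

    def consumer_identity() -> str | None:
        """Return the registered consumer UUID, or None if unavailable."""
        try:
            out = subprocess.run(["subscription-manager", "identity"],
                                 capture_output=True, check=True,
                                 text=True).stdout
        except (OSError, subprocess.CalledProcessError):
            return None
        for line in out.splitlines():
            if line.lower().startswith("system identity:"):
                return line.split(":", 1)[1].strip()
        return None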
Feb 1 01:55:31 localhost python3[6670]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1e29-84da-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 01:55:32 localhost python3[6689]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 01:56:03 localhost setsebool[6764]: The virt_use_nfs policy boolean was changed to 1 by root Feb 1 01:56:03 localhost setsebool[6764]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root Feb 1 01:56:11 localhost kernel: SELinux: Converting 406 SID table entries... Feb 1 01:56:12 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 01:56:12 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 01:56:12 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 01:56:12 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 01:56:12 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 01:56:12 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 01:56:12 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 01:56:24 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=3 res=1 Feb 1 01:56:24 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 01:56:24 localhost systemd[1]: Starting man-db-cache-update.service... Feb 1 01:56:24 localhost systemd[1]: Reloading. Feb 1 01:56:24 localhost systemd-rc-local-generator[7582]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 01:56:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 01:56:25 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 01:56:26 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 01:56:26 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 01:56:33 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 01:56:33 localhost systemd[1]: Finished man-db-cache-update.service. Feb 1 01:56:33 localhost systemd[1]: man-db-cache-update.service: Consumed 9.869s CPU time. Feb 1 01:56:33 localhost systemd[1]: run-rb18c6e75737443d38e3bb5c78f602154.service: Deactivated successfully. Feb 1 01:57:17 localhost systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck2570924586-merged.mount: Deactivated successfully. Feb 1 01:57:17 localhost podman[18358]: 2026-02-01 06:57:17.160407613 +0000 UTC m=+0.099992099 system refresh Feb 1 01:57:17 localhost systemd[4179]: Starting D-Bus User Message Bus... 
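Installing podman above flips two SELinux booleans persistently, which triggers the policy conversion and reload logged right after. The equivalent flips as a sketch; setsebool accepts name=value pairs, -P makes the change survive reboots, and root is required:

    import subprocess

    def set_selinux_booleans(**booleans: bool) -> None:
        """Persistently set SELinux booleans via setsebool -P."""
        args = [f"{name}={'1' if on else '0'}" for name, on in booleans.items()]
        subprocess.run(["setsebool", "-P", *args], check=True)

    # set_selinux_booleans(virt_use_nfs=True, virt_sandbox_use_all_caps=True)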
Feb 1 01:57:17 localhost dbus-broker-launch[18418]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored Feb 1 01:57:17 localhost dbus-broker-launch[18418]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored Feb 1 01:57:17 localhost systemd[4179]: Started D-Bus User Message Bus. Feb 1 01:57:17 localhost journal[18418]: Ready Feb 1 01:57:17 localhost systemd[4179]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1 Feb 1 01:57:17 localhost systemd[4179]: Created slice Slice /user. Feb 1 01:57:17 localhost systemd[4179]: podman-18400.scope: unit configures an IP firewall, but not running as root. Feb 1 01:57:17 localhost systemd[4179]: (This warning is only shown for the first unit using IP firewalling.) Feb 1 01:57:17 localhost systemd[4179]: Started podman-18400.scope. Feb 1 01:57:18 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 01:57:18 localhost systemd[4179]: Started podman-pause-e15e395e.scope. Feb 1 01:57:20 localhost systemd[1]: session-6.scope: Deactivated successfully. Feb 1 01:57:20 localhost systemd[1]: session-6.scope: Consumed 51.625s CPU time. Feb 1 01:57:20 localhost systemd-logind[759]: Session 6 logged out. Waiting for processes to exit. Feb 1 01:57:20 localhost systemd-logind[759]: Removed session 6. Feb 1 01:57:34 localhost sshd[18420]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:57:34 localhost sshd[18421]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:57:34 localhost sshd[18422]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:57:34 localhost sshd[18423]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:57:34 localhost sshd[18424]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:57:39 localhost sshd[18430]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:57:39 localhost systemd-logind[759]: New session 7 of user zuul. Feb 1 01:57:39 localhost systemd[1]: Started Session 7 of User zuul. Feb 1 01:57:39 localhost python3[18447]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGBAgohDMlstWoPOrziVyT3cq7c4YoWvTNp64hcksvV2VrQsWD6YrTZBaXHL0twL/A8QbTt5cQ7NNpUOjUCI5d4= zuul@np0005604206.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:57:40 localhost python3[18463]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGBAgohDMlstWoPOrziVyT3cq7c4YoWvTNp64hcksvV2VrQsWD6YrTZBaXHL0twL/A8QbTt5cQ7NNpUOjUCI5d4= zuul@np0005604206.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:57:42 localhost systemd[1]: session-7.scope: Deactivated successfully. Feb 1 01:57:42 localhost systemd-logind[759]: Session 7 logged out. Waiting for processes to exit. Feb 1 01:57:42 localhost systemd-logind[759]: Removed session 7. Feb 1 01:59:00 localhost sshd[18465]: main: sshd: ssh-rsa algorithm is disabled Feb 1 01:59:00 localhost systemd-logind[759]: New session 8 of user zuul. Feb 1 01:59:00 localhost systemd[1]: Started Session 8 of User zuul. 
Feb 1 01:59:00 localhost python3[18484]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 01:59:02 localhost python3[18500]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604212.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Feb 1 01:59:04 localhost python3[18550]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:59:04 localhost python3[18593]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769929144.0146856-131-38381072629354/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa follow=False checksum=1450e921e2d17379ea725f99be2eea1fb6e75a52 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:59:05 localhost python3[18655]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:59:06 localhost python3[18698]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769929145.6829877-221-139478315084567/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa.pub follow=False checksum=ad19e951a009809a91d74da158b058ce7df88458 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:59:08 localhost python3[18728]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:59:09 localhost python3[18774]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes 
follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:59:09 localhost python3[18790]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmp65afkmvo recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:59:10 localhost python3[18850]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:59:10 localhost python3[18866]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpf2md5ucj recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:59:12 localhost python3[18926]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 01:59:12 localhost python3[18942]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpi2e4s_mk recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 01:59:13 localhost systemd[1]: session-8.scope: Deactivated successfully. Feb 1 01:59:13 localhost systemd[1]: session-8.scope: Consumed 3.396s CPU time. Feb 1 01:59:13 localhost systemd-logind[759]: Session 8 logged out. Waiting for processes to exit. Feb 1 01:59:13 localhost systemd-logind[759]: Removed session 8. Feb 1 02:01:13 localhost sshd[18973]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:01:13 localhost systemd-logind[759]: New session 9 of user zuul. Feb 1 02:01:13 localhost systemd[1]: Started Session 9 of User zuul. Feb 1 02:01:14 localhost python3[19019]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:02:07 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:02:07 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:06:13 localhost systemd[1]: session-9.scope: Deactivated successfully. Feb 1 02:06:13 localhost systemd-logind[759]: Session 9 logged out. Waiting for processes to exit. Feb 1 02:06:13 localhost systemd-logind[759]: Removed session 9. Feb 1 02:12:49 localhost sshd[19146]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:12:49 localhost systemd-logind[759]: New session 10 of user zuul. Feb 1 02:12:49 localhost systemd[1]: Started Session 10 of User zuul. 
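Two details in the tasks above are easy to misread: the copy tasks pass modes in decimal, so mode=384 is octal 0600 (the private key) and mode=420 is octal 0644 (the public key), and /etc/nodepool is deliberately created world-writable (0777) as a staging directory for the sub_nodes metadata files. An equivalent in plain shell; the source paths are placeholders for the Ansible temp files named in the log:

    # Decimal 384 = octal 0600, decimal 420 = octal 0644.
    install -m 0600 -o root /path/to/source_id_rsa     /root/.ssh/id_rsa
    install -m 0644 -o root /path/to/source_id_rsa.pub /root/.ssh/id_rsa.pub

    # World-writable staging directory for the nodepool metadata files.
    install -d -m 0777 /etc/nodepool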
Feb 1 02:12:50 localhost python3[19163]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:12:51 localhost python3[19183]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:12:56 localhost python3[19203]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False Feb 1 02:12:59 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:13:52 localhost python3[19418]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False Feb 1 02:13:55 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:14:02 localhost python3[19560]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False Feb 1 02:14:05 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:14:05 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:14:09 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:14:09 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:14:33 localhost python3[19836]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False Feb 1 02:14:36 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:14:40 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:15:01 localhost python3[20229]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False Feb 1 02:15:04 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:15:04 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:15:09 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:15:09 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
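The five community.general.rhsm_repository tasks above enable one repository each, and every call re-reads the entitlement data, hence the repeated cert_sorter warnings. With subscription-manager directly this collapses to a single invocation; the repository IDs are copied verbatim from the log:

    subscription-manager repos \
        --enable rhel-9-for-x86_64-baseos-eus-rpms \
        --enable rhel-9-for-x86_64-appstream-eus-rpms \
        --enable rhel-9-for-x86_64-highavailability-eus-rpms \
        --enable fast-datapath-for-rhel-9-x86_64-rpms \
        --enable openstack-17.1-for-rhel-9-x86_64-rpms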
Feb 1 02:15:32 localhost python3[20508]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:15:37 localhost python3[20527]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:15:51 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC. Feb 1 02:16:00 localhost kernel: SELinux: Converting 499 SID table entries... Feb 1 02:16:00 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 02:16:00 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 02:16:00 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 02:16:00 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 02:16:00 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 02:16:00 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 02:16:00 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 02:16:03 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=4 res=1 Feb 1 02:16:03 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 02:16:03 localhost systemd[1]: Starting man-db-cache-update.service... Feb 1 02:16:03 localhost systemd[1]: Reloading. Feb 1 02:16:03 localhost systemd-rc-local-generator[21169]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:16:03 localhost systemd-sysv-generator[21174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:16:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:16:03 localhost systemd[1]: Starting dnf makecache... Feb 1 02:16:03 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 02:16:04 localhost dnf[21314]: Updating Subscription Management repositories. Feb 1 02:16:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 02:16:04 localhost systemd[1]: Finished man-db-cache-update.service. Feb 1 02:16:04 localhost systemd[1]: run-r45a27f16d0a145ae8c504dcf7060e8e8.service: Deactivated successfully. Feb 1 02:16:05 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 02:16:05 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. 
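With the repositories in place, the play verifies them and installs the networking toolchain; update_cache=True in the dnf task corresponds to refreshing metadata first, which is also what the dnf-makecache unit does moments later. As plain commands:

    yum repolist --enabled                 # sanity check, as in the logged task
    dnf makecache                          # update_cache=True in the dnf task
    dnf -y install openvswitch os-net-config ansible-core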
Feb 1 02:16:05 localhost dnf[21314]: Failed determining last makecache time. Feb 1 02:16:05 localhost dnf[21314]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 50 kB/s | 4.1 kB 00:00 Feb 1 02:16:06 localhost dnf[21314]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 54 kB/s | 4.5 kB 00:00 Feb 1 02:16:06 localhost dnf[21314]: Red Hat Enterprise Linux 9 for x86_64 - High Av 52 kB/s | 4.0 kB 00:00 Feb 1 02:16:06 localhost dnf[21314]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 45 kB/s | 4.1 kB 00:00 Feb 1 02:16:06 localhost dnf[21314]: Fast Datapath for RHEL 9 x86_64 (RPMs) 45 kB/s | 4.0 kB 00:00 Feb 1 02:16:06 localhost dnf[21314]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_ 41 kB/s | 4.0 kB 00:00 Feb 1 02:16:06 localhost dnf[21314]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 50 kB/s | 4.5 kB 00:00 Feb 1 02:16:07 localhost dnf[21314]: Metadata cache created. Feb 1 02:16:07 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Feb 1 02:16:07 localhost systemd[1]: Finished dnf makecache. Feb 1 02:16:07 localhost systemd[1]: dnf-makecache.service: Consumed 2.746s CPU time. Feb 1 02:16:32 localhost python3[21861]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:16:55 localhost python3[21881]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:16:56 localhost python3[21929]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:16:56 localhost python3[21972]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769930216.1065233-291-64304114970558/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:16:58 localhost python3[22002]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None 
ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 1 02:16:58 localhost systemd-journald[618]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation. Feb 1 02:16:58 localhost systemd-journald[618]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 02:16:58 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 02:16:58 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 02:16:58 localhost python3[22023]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 1 02:16:58 localhost python3[22043]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None 
updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 1 02:16:59 localhost python3[22063]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 1 02:16:59 localhost python3[22083]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None Feb 1 02:17:01 localhost python3[22103]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None 
force=None masked=None Feb 1 02:17:01 localhost systemd[1]: Starting LSB: Bring up/down networking... Feb 1 02:17:01 localhost network[22106]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 02:17:01 localhost network[22117]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 02:17:01 localhost network[22106]: WARN : [network] 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:01 localhost network[22118]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:01 localhost network[22106]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management. Feb 1 02:17:01 localhost network[22119]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 02:17:02 localhost NetworkManager[5964]: [1769930222.0670] audit: op="connections-reload" pid=22147 uid=0 result="success" Feb 1 02:17:02 localhost network[22106]: Bringing up loopback interface: [ OK ] Feb 1 02:17:02 localhost NetworkManager[5964]: [1769930222.2655] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22235 uid=0 result="success" Feb 1 02:17:02 localhost network[22106]: Bringing up interface eth0: [ OK ] Feb 1 02:17:02 localhost systemd[1]: Started LSB: Bring up/down networking. Feb 1 02:17:02 localhost python3[22276]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 02:17:02 localhost systemd[1]: Starting Open vSwitch Database Unit... Feb 1 02:17:02 localhost chown[22280]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory Feb 1 02:17:02 localhost ovs-ctl[22285]: /etc/openvswitch/conf.db does not exist ... (warning). Feb 1 02:17:02 localhost ovs-ctl[22285]: Creating empty database /etc/openvswitch/conf.db [ OK ] Feb 1 02:17:02 localhost ovs-ctl[22285]: Starting ovsdb-server [ OK ] Feb 1 02:17:02 localhost ovs-vsctl[22334]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1 Feb 1 02:17:03 localhost ovs-vsctl[22354]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"e1d14e36-ae9d-43b6-8933-f137b54529ff\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\"" Feb 1 02:17:03 localhost ovs-ctl[22285]: Configuring Open vSwitch system IDs [ OK ] Feb 1 02:17:03 localhost ovs-ctl[22285]: Enabling remote OVSDB managers [ OK ] Feb 1 02:17:03 localhost ovs-vsctl[22360]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005604212.novalocal Feb 1 02:17:03 localhost systemd[1]: Started Open vSwitch Database Unit. Feb 1 02:17:03 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports... Feb 1 02:17:03 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports. Feb 1 02:17:03 localhost systemd[1]: Starting Open vSwitch Forwarding Unit... Feb 1 02:17:03 localhost kernel: openvswitch: Open vSwitch switching datapath Feb 1 02:17:03 localhost ovs-ctl[22404]: Inserting openvswitch module [ OK ] Feb 1 02:17:03 localhost ovs-ctl[22373]: Starting ovs-vswitchd [ OK ] Feb 1 02:17:03 localhost ovs-vsctl[22423]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . 
external-ids hostname=np0005604212.novalocal Feb 1 02:17:03 localhost ovs-ctl[22373]: Enabling remote OVSDB managers [ OK ] Feb 1 02:17:03 localhost systemd[1]: Started Open vSwitch Forwarding Unit. Feb 1 02:17:03 localhost systemd[1]: Starting Open vSwitch... Feb 1 02:17:03 localhost systemd[1]: Finished Open vSwitch. Feb 1 02:17:06 localhost python3[22441]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:17:07 localhost NetworkManager[5964]: [1769930227.0406] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22637 uid=0 result="success" Feb 1 02:17:07 localhost ifup[22638]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:07 localhost ifup[22639]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:07 localhost ifup[22640]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:07 localhost NetworkManager[5964]: [1769930227.0753] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22646 uid=0 result="success" Feb 1 02:17:07 localhost ovs-vsctl[22648]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:4a:fd:db -- set bridge br-ex fail_mode=standalone -- del-controller br-ex Feb 1 02:17:07 localhost kernel: device ovs-system entered promiscuous mode Feb 1 02:17:07 localhost NetworkManager[5964]: [1769930227.1049] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4) Feb 1 02:17:07 localhost kernel: Timeout policy base is empty Feb 1 02:17:07 localhost kernel: Failed to associated timeout policy `ovs_test_tp' Feb 1 02:17:07 localhost systemd-udevd[22650]: Network interface NamePolicy= disabled on kernel command line. 
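Before os-net-config applies /etc/os-net-config/tripleo_config.yaml, the play deletes any leftover ci-private-network* profiles (the long community.general.nmcli invocations above with state=absent), starts the deprecated network service plus Open vSwitch, and then the bridge is built. The ovs-vsctl command below is copied from the log; the nmcli loop is a compact equivalent of the five state=absent tasks:

    # Drop stale CI connection profiles (equivalent of the nmcli tasks).
    for c in ci-private-network ci-private-network-{20..23}; do
        nmcli connection delete "$c" 2>/dev/null || true
    done

    systemctl start network openvswitch    # the two ansible.builtin.systemd tasks

    # Bridge creation exactly as logged by ovs-vsctl[22648]:
    ovs-vsctl -t 10 -- --may-exist add-br br-ex \
        -- set bridge br-ex other-config:mac-table-size=50000 \
        -- set bridge br-ex other-config:hwaddr=fa:16:3e:4a:fd:db \
        -- set bridge br-ex fail_mode=standalone \
        -- del-controller br-ex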
Feb 1 02:17:07 localhost kernel: device br-ex entered promiscuous mode Feb 1 02:17:07 localhost NetworkManager[5964]: [1769930227.1501] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5) Feb 1 02:17:07 localhost NetworkManager[5964]: [1769930227.1774] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22676 uid=0 result="success" Feb 1 02:17:07 localhost NetworkManager[5964]: [1769930227.1993] device (br-ex): carrier: link connected Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.2522] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22705 uid=0 result="success" Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.2935] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22720 uid=0 result="success" Feb 1 02:17:10 localhost NET[22745]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.3767] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed') Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.3969] dhcp4 (eth1): canceled DHCP transaction Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.3970] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.3970] dhcp4 (eth1): state changed no lease Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.4006] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22754 uid=0 result="success" Feb 1 02:17:10 localhost ifup[22755]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:10 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Feb 1 02:17:10 localhost ifup[22756]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:10 localhost ifup[22758]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:10 localhost systemd[1]: Started Network Manager Script Dispatcher Service. 
Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.4348] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22770 uid=0 result="success" Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.4822] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22782 uid=0 result="success" Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.4898] device (eth1): carrier: link connected Feb 1 02:17:10 localhost NetworkManager[5964]: [1769930230.5126] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22791 uid=0 result="success" Feb 1 02:17:10 localhost ipv6_wait_tentative[22803]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Feb 1 02:17:11 localhost ipv6_wait_tentative[22808]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.5747] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22817 uid=0 result="success" Feb 1 02:17:12 localhost ovs-vsctl[22832]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1 Feb 1 02:17:12 localhost kernel: device eth1 entered promiscuous mode Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.6770] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22840 uid=0 result="success" Feb 1 02:17:12 localhost ifup[22841]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:12 localhost ifup[22842]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:12 localhost ifup[22843]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.7079] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22849 uid=0 result="success" Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.7484] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22859 uid=0 result="success" Feb 1 02:17:12 localhost ifup[22860]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:12 localhost ifup[22861]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:12 localhost ifup[22862]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.7799] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22868 uid=0 result="success" Feb 1 02:17:12 localhost ovs-vsctl[22871]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Feb 1 02:17:12 localhost kernel: device vlan21 entered promiscuous mode Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.8261] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6) Feb 1 02:17:12 localhost systemd-udevd[22873]: Network interface NamePolicy= disabled on kernel command line. 
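os-net-config then takes eth1 away from NetworkManager (the "activated -> unmanaged" transition above), enslaves it to br-ex, and adds one internal port per VLAN; the same add-port pattern repeats below for tags 20, 22, 23 and 44. The eth1 command is copied from the log, and the loop compacts the per-VLAN invocations:

    # Physical uplink, as logged by ovs-vsctl[22832]:
    ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1

    # Internal VLAN access ports, one per tag seen in this log:
    for tag in 20 21 22 23 44; do
        ovs-vsctl -t 10 -- --if-exists del-port br-ex "vlan$tag" \
            -- add-port br-ex "vlan$tag" tag="$tag" \
            -- set Interface "vlan$tag" type=internal
    done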
Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.8468] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22882 uid=0 result="success" Feb 1 02:17:12 localhost NetworkManager[5964]: [1769930232.8684] device (vlan21): carrier: link connected Feb 1 02:17:15 localhost NetworkManager[5964]: [1769930235.9228] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22911 uid=0 result="success" Feb 1 02:17:15 localhost NetworkManager[5964]: [1769930235.9676] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22926 uid=0 result="success" Feb 1 02:17:16 localhost NetworkManager[5964]: [1769930236.0104] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22947 uid=0 result="success" Feb 1 02:17:16 localhost ifup[22948]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:16 localhost ifup[22949]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:16 localhost ifup[22950]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:16 localhost NetworkManager[5964]: [1769930236.0301] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22956 uid=0 result="success" Feb 1 02:17:16 localhost ovs-vsctl[22959]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Feb 1 02:17:16 localhost systemd-udevd[22961]: Network interface NamePolicy= disabled on kernel command line. Feb 1 02:17:16 localhost kernel: device vlan20 entered promiscuous mode Feb 1 02:17:16 localhost NetworkManager[5964]: [1769930236.0712] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/7) Feb 1 02:17:16 localhost NetworkManager[5964]: [1769930236.0900] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22971 uid=0 result="success" Feb 1 02:17:16 localhost NetworkManager[5964]: [1769930236.1049] device (vlan20): carrier: link connected Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.1570] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23001 uid=0 result="success" Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.2071] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23016 uid=0 result="success" Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.2708] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23037 uid=0 result="success" Feb 1 02:17:19 localhost ifup[23038]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:19 localhost ifup[23039]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:19 localhost ifup[23040]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.3051] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23046 uid=0 result="success" Feb 1 02:17:19 localhost ovs-vsctl[23049]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal Feb 1 02:17:19 localhost kernel: device vlan22 entered promiscuous mode Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.3401] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8) Feb 1 02:17:19 localhost systemd-udevd[23051]: Network interface NamePolicy= disabled on kernel command line. Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.3703] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23061 uid=0 result="success" Feb 1 02:17:19 localhost NetworkManager[5964]: [1769930239.3927] device (vlan22): carrier: link connected Feb 1 02:17:20 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.4547] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23091 uid=0 result="success" Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.5043] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23106 uid=0 result="success" Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.5647] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23127 uid=0 result="success" Feb 1 02:17:22 localhost ifup[23128]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:22 localhost ifup[23129]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:22 localhost ifup[23130]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.5966] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23136 uid=0 result="success" Feb 1 02:17:22 localhost ovs-vsctl[23139]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Feb 1 02:17:22 localhost kernel: device vlan23 entered promiscuous mode Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.6408] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/9) Feb 1 02:17:22 localhost systemd-udevd[23141]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.6674] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23151 uid=0 result="success" Feb 1 02:17:22 localhost NetworkManager[5964]: [1769930242.6912] device (vlan23): carrier: link connected Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.7499] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23181 uid=0 result="success" Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.7996] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23196 uid=0 result="success" Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.8599] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23217 uid=0 result="success" Feb 1 02:17:25 localhost ifup[23218]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:25 localhost ifup[23219]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:25 localhost ifup[23220]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.8945] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23226 uid=0 result="success" Feb 1 02:17:25 localhost ovs-vsctl[23229]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Feb 1 02:17:25 localhost kernel: device vlan44 entered promiscuous mode Feb 1 02:17:25 localhost systemd-udevd[23231]: Network interface NamePolicy= disabled on kernel command line. Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.9360] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10) Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.9638] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23241 uid=0 result="success" Feb 1 02:17:25 localhost NetworkManager[5964]: [1769930245.9865] device (vlan44): carrier: link connected Feb 1 02:17:29 localhost NetworkManager[5964]: [1769930249.0444] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23271 uid=0 result="success" Feb 1 02:17:29 localhost NetworkManager[5964]: [1769930249.0919] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23286 uid=0 result="success" Feb 1 02:17:29 localhost NetworkManager[5964]: [1769930249.1510] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23307 uid=0 result="success" Feb 1 02:17:29 localhost ifup[23308]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:29 localhost ifup[23309]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:29 localhost ifup[23310]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Feb 1 02:17:29 localhost NetworkManager[5964]: [1769930249.1822] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23316 uid=0 result="success" Feb 1 02:17:29 localhost ovs-vsctl[23319]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Feb 1 02:17:29 localhost NetworkManager[5964]: [1769930249.2402] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23326 uid=0 result="success" Feb 1 02:17:30 localhost NetworkManager[5964]: [1769930250.3021] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23353 uid=0 result="success" Feb 1 02:17:30 localhost NetworkManager[5964]: [1769930250.3544] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23368 uid=0 result="success" Feb 1 02:17:30 localhost NetworkManager[5964]: [1769930250.4191] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23389 uid=0 result="success" Feb 1 02:17:30 localhost ifup[23390]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:30 localhost ifup[23391]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:30 localhost ifup[23392]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:30 localhost NetworkManager[5964]: [1769930250.4535] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23398 uid=0 result="success" Feb 1 02:17:30 localhost ovs-vsctl[23401]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Feb 1 02:17:30 localhost NetworkManager[5964]: [1769930250.5142] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23408 uid=0 result="success" Feb 1 02:17:31 localhost NetworkManager[5964]: [1769930251.5794] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23436 uid=0 result="success" Feb 1 02:17:31 localhost NetworkManager[5964]: [1769930251.6299] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23451 uid=0 result="success" Feb 1 02:17:31 localhost NetworkManager[5964]: [1769930251.6935] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23472 uid=0 result="success" Feb 1 02:17:31 localhost ifup[23473]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:31 localhost ifup[23474]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:31 localhost ifup[23475]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Feb 1 02:17:31 localhost NetworkManager[5964]: [1769930251.7259] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23481 uid=0 result="success" Feb 1 02:17:31 localhost ovs-vsctl[23484]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Feb 1 02:17:31 localhost NetworkManager[5964]: [1769930251.7853] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23491 uid=0 result="success" Feb 1 02:17:32 localhost NetworkManager[5964]: [1769930252.8546] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23519 uid=0 result="success" Feb 1 02:17:32 localhost NetworkManager[5964]: [1769930252.9030] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23534 uid=0 result="success" Feb 1 02:17:32 localhost NetworkManager[5964]: [1769930252.9710] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23555 uid=0 result="success" Feb 1 02:17:32 localhost ifup[23556]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:32 localhost ifup[23557]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:32 localhost ifup[23558]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Feb 1 02:17:33 localhost NetworkManager[5964]: [1769930253.0047] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23564 uid=0 result="success" Feb 1 02:17:33 localhost ovs-vsctl[23567]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Feb 1 02:17:33 localhost NetworkManager[5964]: [1769930253.0670] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23574 uid=0 result="success" Feb 1 02:17:34 localhost NetworkManager[5964]: [1769930254.1302] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23602 uid=0 result="success" Feb 1 02:17:34 localhost NetworkManager[5964]: [1769930254.1789] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23617 uid=0 result="success" Feb 1 02:17:34 localhost NetworkManager[5964]: [1769930254.2450] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23638 uid=0 result="success" Feb 1 02:17:34 localhost ifup[23639]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Feb 1 02:17:34 localhost ifup[23640]: 'network-scripts' will be removed from distribution in near future. Feb 1 02:17:34 localhost ifup[23641]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Feb 1 02:17:34 localhost NetworkManager[5964]: [1769930254.2791] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23647 uid=0 result="success" Feb 1 02:17:34 localhost ovs-vsctl[23650]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal Feb 1 02:17:34 localhost NetworkManager[5964]: [1769930254.3394] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23657 uid=0 result="success" Feb 1 02:17:35 localhost NetworkManager[5964]: [1769930255.4065] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23685 uid=0 result="success" Feb 1 02:17:35 localhost NetworkManager[5964]: [1769930255.4595] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23700 uid=0 result="success" Feb 1 02:18:28 localhost python3[23732]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:18:34 localhost python3[23751]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 02:18:34 localhost python3[23767]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 02:18:36 localhost python3[23781]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 02:18:36 localhost python3[23797]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 1 02:18:37 localhost python3[23811]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname Feb 1 02:18:38 localhost python3[23826]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005604212.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:18:39 localhost python3[23846]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:18:39 localhost systemd[1]: Starting Hostname Service... Feb 1 02:18:39 localhost systemd[1]: Started Hostname Service. Feb 1 02:18:39 localhost systemd-hostnamed[23850]: Hostname set to <np0005604212.localdomain> (static) Feb 1 02:18:39 localhost NetworkManager[5964]: [1769930319.1699] hostname: static hostname changed from "np0005604212.novalocal" to "np0005604212.localdomain" Feb 1 02:18:39 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Feb 1 02:18:39 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Feb 1 02:18:40 localhost systemd[1]: session-10.scope: Deactivated successfully. Feb 1 02:18:40 localhost systemd[1]: session-10.scope: Consumed 1min 45.154s CPU time. Feb 1 02:18:40 localhost systemd-logind[759]: Session 10 logged out. Waiting for processes to exit. Feb 1 02:18:40 localhost systemd-logind[759]: Removed session 10.
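The #012 sequences in the two ansible.legacy.command tasks above are the syslog escape for embedded newlines; decoded, the scripts strip the domain from the node's FQDN, stash the short name for later plays, and re-register the host under .localdomain, which is exactly the rename systemd-hostnamed and NetworkManager then report:

    # Task one: keep only the first dotted component of the FQDN.
    hostname="np0005604212.novalocal"
    hostname_str_array=(${hostname//./ })            # split on dots
    echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname

    # Task two: re-set the hostname under .localdomain.
    hostname=$(cat /home/zuul/ansible_hostname)
    hostnamectl hostname "$hostname.localdomain"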
Feb 1 02:18:43 localhost sshd[23861]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:18:43 localhost systemd-logind[759]: New session 11 of user zuul. Feb 1 02:18:43 localhost systemd[1]: Started Session 11 of User zuul. Feb 1 02:18:43 localhost python3[23878]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Feb 1 02:18:45 localhost systemd[1]: session-11.scope: Deactivated successfully. Feb 1 02:18:45 localhost systemd-logind[759]: Session 11 logged out. Waiting for processes to exit. Feb 1 02:18:45 localhost systemd-logind[759]: Removed session 11. Feb 1 02:18:49 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Feb 1 02:19:09 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Feb 1 02:19:24 localhost sshd[23882]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:19:25 localhost systemd-logind[759]: New session 12 of user zuul. Feb 1 02:19:25 localhost systemd[1]: Started Session 12 of User zuul. Feb 1 02:19:25 localhost python3[23901]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:19:28 localhost systemd[1]: Reloading. Feb 1 02:19:29 localhost systemd-sysv-generator[23944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:19:29 localhost systemd-rc-local-generator[23941]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:19:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:19:29 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs. Feb 1 02:19:29 localhost systemd[1]: Reloading. Feb 1 02:19:29 localhost systemd-rc-local-generator[23980]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:19:29 localhost systemd-sysv-generator[23984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:19:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:19:29 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling... Feb 1 02:19:29 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling. Feb 1 02:19:29 localhost systemd[1]: Reloading. Feb 1 02:19:29 localhost systemd-rc-local-generator[24025]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 1 02:19:29 localhost systemd-sysv-generator[24029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:19:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:19:29 localhost systemd[1]: Listening on LVM2 poll daemon socket. Feb 1 02:19:30 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 02:19:30 localhost systemd[1]: Starting man-db-cache-update.service... Feb 1 02:19:30 localhost systemd[1]: Reloading. Feb 1 02:19:30 localhost systemd-sysv-generator[24093]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:19:30 localhost systemd-rc-local-generator[24090]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:19:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:19:30 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 02:19:30 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 02:19:30 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 02:19:30 localhost systemd[1]: Finished man-db-cache-update.service. Feb 1 02:19:30 localhost systemd[1]: run-r422dc17c40074136825be3ab9654f55c.service: Deactivated successfully. Feb 1 02:19:30 localhost systemd[1]: run-r3c1f7c71ff3f45e48c984ff3d6bc9f29.service: Deactivated successfully. Feb 1 02:20:31 localhost systemd[1]: session-12.scope: Deactivated successfully. Feb 1 02:20:31 localhost systemd[1]: session-12.scope: Consumed 4.793s CPU time. Feb 1 02:20:31 localhost systemd-logind[759]: Session 12 logged out. Waiting for processes to exit. Feb 1 02:20:31 localhost systemd-logind[759]: Removed session 12. Feb 1 02:21:36 localhost sshd[24673]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:23:55 localhost sshd[24675]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:24:52 localhost sshd[24676]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:24:52 localhost sshd[24677]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:27:19 localhost sshd[24679]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:30:09 localhost sshd[24681]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:31:09 localhost sshd[24684]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:32:53 localhost sshd[24687]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:35:34 localhost sshd[24691]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:36:01 localhost sshd[24694]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:36:01 localhost systemd-logind[759]: New session 13 of user zuul. Feb 1 02:36:01 localhost systemd[1]: Started Session 13 of User zuul. 
Feb 1 02:36:01 localhost python3[24742]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 02:36:03 localhost python3[24829]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:36:06 localhost python3[24846]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:36:07 localhost python3[24862]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:07 localhost kernel: loop: module loaded Feb 1 02:36:07 localhost kernel: loop3: detected capacity change from 0 to 14680064 Feb 1 02:36:07 localhost python3[24887]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:07 localhost lvm[24890]: PV /dev/loop3 not used. Feb 1 02:36:08 localhost lvm[24899]: PV /dev/loop3 online, VG ceph_vg0 is complete. Feb 1 02:36:08 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0. Feb 1 02:36:08 localhost lvm[24901]: 1 logical volume(s) in volume group "ceph_vg0" now active Feb 1 02:36:08 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully. Feb 1 02:36:08 localhost python3[24949]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:36:09 localhost python3[24992]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931368.4287496-54353-280117662720029/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:10 localhost python3[25022]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:36:10 localhost systemd[1]: Reloading. Feb 1 02:36:10 localhost systemd-rc-local-generator[25048]: /etc/rc.d/rc.local is not marked executable, skipping. 
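The pair of command tasks above builds the loop-backed disk for the first OSD: dd with count=0 seek=7G allocates a 7 GiB backing file without writing any data (the file is sparse, which is why the kernel reports the capacity change to 14680064 sectors instantly), losetup exposes it as /dev/loop3, and pvcreate/vgcreate/lvcreate hand the whole device to a single LV. A hedged Python equivalent, assuming root and the lvm2/util-linux tools installed earlier; device and VG/LV names are copied from the log:

    # Sketch of the loop-device OSD backing store created above (run as root).
    # os.truncate matches `dd if=/dev/zero of=IMG bs=1 count=0 seek=7G`:
    # both extend the file sparsely without writing data.
    import os
    import subprocess

    img, loopdev = "/var/lib/ceph-osd-0.img", "/dev/loop3"

    open(img, "a").close()                     # create the file if missing
    os.truncate(img, 7 * 1024**3)              # sparse-extend to 7 GiB

    for cmd in (
        ["losetup", loopdev, img],                                      # attach
        ["pvcreate", loopdev],                                          # LVM PV
        ["vgcreate", "ceph_vg0", loopdev],                              # VG
        ["lvcreate", "-n", "ceph_lv0", "-l", "+100%FREE", "ceph_vg0"],  # LV
    ):
        subprocess.run(cmd, check=True)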
Feb 1 02:36:10 localhost systemd-sysv-generator[25052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:36:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:36:10 localhost systemd[1]: Starting Ceph OSD losetup... Feb 1 02:36:10 localhost bash[25063]: /dev/loop3: [64516]:8401550 (/var/lib/ceph-osd-0.img) Feb 1 02:36:10 localhost systemd[1]: Finished Ceph OSD losetup. Feb 1 02:36:10 localhost lvm[25064]: PV /dev/loop3 online, VG ceph_vg0 is complete. Feb 1 02:36:10 localhost lvm[25064]: VG ceph_vg0 finished Feb 1 02:36:10 localhost python3[25080]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:36:13 localhost python3[25097]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:36:14 localhost python3[25113]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:14 localhost kernel: loop4: detected capacity change from 0 to 14680064 Feb 1 02:36:14 localhost python3[25135]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:14 localhost lvm[25138]: PV /dev/loop4 not used. Feb 1 02:36:15 localhost lvm[25140]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 1 02:36:15 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1. Feb 1 02:36:15 localhost lvm[25151]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 1 02:36:15 localhost lvm[25151]: VG ceph_vg1 finished Feb 1 02:36:15 localhost lvm[25149]: 1 logical volume(s) in volume group "ceph_vg1" now active Feb 1 02:36:15 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully. 
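The ceph-osd-losetup-0.service unit copied and started above exists because loop attachments do not survive a reboot. The log shows only the unit's effect, not its contents: a bash ExecStart whose losetup output ("/dev/loop3: [64516]:8401550 (/var/lib/ceph-osd-0.img)") is captured when the unit runs. The unit body below is therefore a hypothetical reconstruction, installed the way the ansible systemd task does it (daemon-reload, enable, start):

    # HYPOTHETICAL unit body: the log shows the unit being copied, enabled
    # and started, but never prints what is inside it.
    import subprocess

    UNIT = """\
    [Unit]
    Description=Ceph OSD losetup
    After=local-fs.target

    [Service]
    Type=oneshot
    # Assumed command: print status if already attached, else re-attach.
    ExecStart=/bin/bash -c 'losetup /dev/loop3 || losetup /dev/loop3 /var/lib/ceph-osd-0.img'
    RemainAfterExit=yes

    [Install]
    WantedBy=multi-user.target
    """

    with open("/etc/systemd/system/ceph-osd-losetup-0.service", "w") as f:
        f.write(UNIT)
    subprocess.run(["systemctl", "daemon-reload"], check=True)
    subprocess.run(["systemctl", "enable", "--now", "ceph-osd-losetup-0.service"], check=True)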
Feb 1 02:36:15 localhost python3[25199]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:36:16 localhost python3[25242]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931375.4117432-54527-17991884312864/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:16 localhost python3[25272]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:36:16 localhost systemd[1]: Reloading. Feb 1 02:36:16 localhost systemd-rc-local-generator[25298]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:36:16 localhost systemd-sysv-generator[25301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:36:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:36:17 localhost systemd[1]: Starting Ceph OSD losetup... Feb 1 02:36:17 localhost bash[25314]: /dev/loop4: [64516]:8400144 (/var/lib/ceph-osd-1.img) Feb 1 02:36:17 localhost systemd[1]: Finished Ceph OSD losetup. Feb 1 02:36:17 localhost lvm[25315]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 1 02:36:17 localhost lvm[25315]: VG ceph_vg1 finished Feb 1 02:36:26 localhost python3[25361]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Feb 1 02:36:27 localhost python3[25381]: ansible-hostname Invoked with name=np0005604212.localdomain use=None Feb 1 02:36:27 localhost systemd[1]: Starting Hostname Service... Feb 1 02:36:27 localhost systemd[1]: Started Hostname Service. Feb 1 02:36:29 localhost python3[25404]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None Feb 1 02:36:30 localhost python3[25452]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.rv6ht4zgtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:30 localhost python3[25482]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.rv6ht4zgtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! 
marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:31 localhost python3[25498]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.rv6ht4zgtmphosts insertbefore=BOF block=192.168.122.106 np0005604212.localdomain np0005604212#012192.168.122.106 np0005604212.ctlplane.localdomain np0005604212.ctlplane#012192.168.122.107 np0005604213.localdomain np0005604213#012192.168.122.107 np0005604213.ctlplane.localdomain np0005604213.ctlplane#012192.168.122.108 np0005604215.localdomain np0005604215#012192.168.122.108 np0005604215.ctlplane.localdomain np0005604215.ctlplane#012192.168.122.103 np0005604209.localdomain np0005604209#012192.168.122.103 np0005604209.ctlplane.localdomain np0005604209.ctlplane#012192.168.122.104 np0005604210.localdomain np0005604210#012192.168.122.104 np0005604210.ctlplane.localdomain np0005604210.ctlplane#012192.168.122.105 np0005604211.localdomain np0005604211#012192.168.122.105 np0005604211.ctlplane.localdomain np0005604211.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:32 localhost python3[25514]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.rv6ht4zgtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:32 localhost python3[25531]: ansible-file Invoked with path=/tmp/ansible.rv6ht4zgtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:34 localhost python3[25547]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:35 localhost python3[25565]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:36:39 localhost python3[25614]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:36:40 localhost python3[25659]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931399.4636567-55381-265808654711231/source dest=/etc/chrony.conf owner=root group=root 
mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:41 localhost python3[25689]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:36:43 localhost python3[25707]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 02:36:43 localhost systemd[1]: Stopping NTP client/server... Feb 1 02:36:43 localhost chronyd[765]: chronyd exiting Feb 1 02:36:43 localhost systemd[1]: chronyd.service: Deactivated successfully. Feb 1 02:36:43 localhost systemd[1]: Stopped NTP client/server. Feb 1 02:36:43 localhost systemd[1]: chronyd.service: Consumed 125ms CPU time, read 1.9M from disk, written 0B to disk. Feb 1 02:36:43 localhost systemd[1]: Starting NTP client/server... Feb 1 02:36:43 localhost chronyd[25715]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 1 02:36:43 localhost chronyd[25715]: Frequency -30.118 +/- 0.089 ppm read from /var/lib/chrony/drift Feb 1 02:36:43 localhost chronyd[25715]: Loaded seccomp filter (level 2) Feb 1 02:36:43 localhost systemd[1]: Started NTP client/server. Feb 1 02:36:44 localhost python3[25764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:36:45 localhost python3[25807]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931404.7088075-55612-64737551397015/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:36:46 localhost python3[25837]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:36:46 localhost systemd[1]: Reloading. Feb 1 02:36:46 localhost systemd-rc-local-generator[25862]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:36:46 localhost systemd-sysv-generator[25867]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:36:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:36:46 localhost systemd[1]: Reloading. Feb 1 02:36:46 localhost systemd-sysv-generator[25905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:36:46 localhost systemd-rc-local-generator[25899]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:36:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:36:46 localhost systemd[1]: Starting chronyd online sources service... Feb 1 02:36:46 localhost chronyc[25913]: 200 OK Feb 1 02:36:46 localhost systemd[1]: chrony-online.service: Deactivated successfully. Feb 1 02:36:46 localhost systemd[1]: Finished chronyd online sources service. Feb 1 02:36:47 localhost python3[25930]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:47 localhost chronyd[25715]: System clock was stepped by 0.000000 seconds Feb 1 02:36:47 localhost chronyd[25715]: Selected source 51.222.111.13 (pool.ntp.org) Feb 1 02:36:47 localhost python3[25947]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:36:57 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully. Feb 1 02:36:58 localhost python3[25966]: ansible-timezone Invoked with name=UTC hwclock=None Feb 1 02:36:58 localhost systemd[1]: Starting Time & Date Service... Feb 1 02:36:58 localhost systemd[1]: Started Time & Date Service. Feb 1 02:36:59 localhost python3[25986]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 02:36:59 localhost chronyd[25715]: chronyd exiting Feb 1 02:36:59 localhost systemd[1]: Stopping NTP client/server... Feb 1 02:36:59 localhost systemd[1]: chronyd.service: Deactivated successfully. Feb 1 02:36:59 localhost systemd[1]: Stopped NTP client/server. Feb 1 02:36:59 localhost systemd[1]: Starting NTP client/server... Feb 1 02:36:59 localhost chronyd[25994]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 1 02:36:59 localhost chronyd[25994]: Frequency -30.118 +/- 0.093 ppm read from /var/lib/chrony/drift Feb 1 02:36:59 localhost chronyd[25994]: Loaded seccomp filter (level 2) Feb 1 02:36:59 localhost systemd[1]: Started NTP client/server. Feb 1 02:37:03 localhost chronyd[25994]: Selected source 51.222.111.13 (pool.ntp.org) Feb 1 02:37:28 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Feb 1 02:38:12 localhost sshd[26192]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:56 localhost sshd[26194]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:57 localhost systemd[1]: Created slice User Slice of UID 1002. Feb 1 02:38:57 localhost systemd[1]: Starting User Runtime Directory /run/user/1002... Feb 1 02:38:57 localhost systemd-logind[759]: New session 14 of user ceph-admin. Feb 1 02:38:57 localhost systemd[1]: Finished User Runtime Directory /run/user/1002. Feb 1 02:38:57 localhost systemd[1]: Starting User Manager for UID 1002... Feb 1 02:38:57 localhost systemd[26198]: Queued start job for default target Main User Target. 
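Two steps in the run above are worth unpacking. First, /etc/hosts is never edited in place: blockinfile operates on a temp copy, deleting anything left between the old HEAT_HOSTS_START/HEAT_HOSTS_END markers, inserting the new stack entries between its own marker comments at the top of the file (insertbefore=BOF), and only then does a plain cp move the result over /etc/hosts. A minimal sketch of that marker logic (ours, not Ansible's implementation; marker strings and the temp path are from the log):

    # blockinfile-style editing: drop any existing BEGIN..END block, then
    # insert the new block, wrapped in its markers, at the top of the file.
    def replace_marker_block(text: str, begin: str, end: str, block: str) -> str:
        kept, skipping = [], False
        for line in text.splitlines():
            if line.strip() == begin:
                skipping = True
            elif line.strip() == end:
                skipping = False
            elif not skipping:
                kept.append(line)
        return "\n".join([begin, *block.splitlines(), end, *kept]) + "\n"

    new_hosts = replace_marker_block(
        open("/tmp/ansible.rv6ht4zgtmphosts").read(),
        "# START_HOST_ENTRIES_FOR_STACK: overcloud",
        "# END_HOST_ENTRIES_FOR_STACK: overcloud",
        "192.168.122.106 np0005604212.localdomain np0005604212",
    )

Second, after chronyd is restarted the play forces an immediate clock step and then blocks until synchronization, which is exactly what the two chronyc commands above do ("System clock was stepped by 0.000000 seconds" just means the clock was already correct). The same two calls from Python:

    # Step the clock now, then wait until chronyd reports it is in sync;
    # waitsync gives up after 30 checks (about ten seconds apart by default).
    import subprocess

    subprocess.run(["chronyc", "makestep"], check=True)
    subprocess.run(["chronyc", "waitsync", "30"], check=True)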
Feb 1 02:38:57 localhost systemd[26198]: Created slice User Application Slice. Feb 1 02:38:57 localhost systemd[26198]: Started Mark boot as successful after the user session has run 2 minutes. Feb 1 02:38:57 localhost systemd[26198]: Started Daily Cleanup of User's Temporary Directories. Feb 1 02:38:57 localhost systemd[26198]: Reached target Paths. Feb 1 02:38:57 localhost systemd[26198]: Reached target Timers. Feb 1 02:38:57 localhost systemd[26198]: Starting D-Bus User Message Bus Socket... Feb 1 02:38:57 localhost systemd[26198]: Starting Create User's Volatile Files and Directories... Feb 1 02:38:57 localhost sshd[26211]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:57 localhost systemd[26198]: Finished Create User's Volatile Files and Directories. Feb 1 02:38:57 localhost systemd[26198]: Listening on D-Bus User Message Bus Socket. Feb 1 02:38:57 localhost systemd[26198]: Reached target Sockets. Feb 1 02:38:57 localhost systemd[26198]: Reached target Basic System. Feb 1 02:38:57 localhost systemd[26198]: Reached target Main User Target. Feb 1 02:38:57 localhost systemd[26198]: Startup finished in 92ms. Feb 1 02:38:57 localhost systemd[1]: Started User Manager for UID 1002. Feb 1 02:38:57 localhost systemd[1]: Started Session 14 of User ceph-admin. Feb 1 02:38:57 localhost systemd-logind[759]: New session 16 of user ceph-admin. Feb 1 02:38:57 localhost systemd[1]: Started Session 16 of User ceph-admin. Feb 1 02:38:57 localhost sshd[26233]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:57 localhost systemd-logind[759]: New session 17 of user ceph-admin. Feb 1 02:38:57 localhost systemd[1]: Started Session 17 of User ceph-admin. Feb 1 02:38:57 localhost sshd[26252]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:58 localhost systemd-logind[759]: New session 18 of user ceph-admin. Feb 1 02:38:58 localhost systemd[1]: Started Session 18 of User ceph-admin. Feb 1 02:38:58 localhost sshd[26271]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:58 localhost systemd-logind[759]: New session 19 of user ceph-admin. Feb 1 02:38:58 localhost systemd[1]: Started Session 19 of User ceph-admin. Feb 1 02:38:58 localhost sshd[26290]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:58 localhost systemd-logind[759]: New session 20 of user ceph-admin. Feb 1 02:38:58 localhost systemd[1]: Started Session 20 of User ceph-admin. Feb 1 02:38:59 localhost sshd[26309]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:59 localhost systemd-logind[759]: New session 21 of user ceph-admin. Feb 1 02:38:59 localhost systemd[1]: Started Session 21 of User ceph-admin. Feb 1 02:38:59 localhost sshd[26328]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:38:59 localhost systemd-logind[759]: New session 22 of user ceph-admin. Feb 1 02:38:59 localhost systemd[1]: Started Session 22 of User ceph-admin. Feb 1 02:38:59 localhost sshd[26347]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:39:00 localhost systemd-logind[759]: New session 23 of user ceph-admin. Feb 1 02:39:00 localhost systemd[1]: Started Session 23 of User ceph-admin. Feb 1 02:39:00 localhost sshd[26366]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:39:00 localhost systemd-logind[759]: New session 24 of user ceph-admin. Feb 1 02:39:00 localhost systemd[1]: Started Session 24 of User ceph-admin. Feb 1 02:39:00 localhost sshd[26383]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:39:01 localhost systemd-logind[759]: New session 25 of user ceph-admin. 
Feb 1 02:39:01 localhost systemd[1]: Started Session 25 of User ceph-admin. Feb 1 02:39:01 localhost sshd[26402]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:39:01 localhost systemd-logind[759]: New session 26 of user ceph-admin. Feb 1 02:39:01 localhost systemd[1]: Started Session 26 of User ceph-admin. Feb 1 02:39:01 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:26 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:26 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:27 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:27 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:27 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26619 (sysctl) Feb 1 02:39:27 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System... Feb 1 02:39:27 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System. Feb 1 02:39:28 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:28 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:28 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:31 localhost kernel: VFS: idmapped mount is not enabled. Feb 1 02:39:52 localhost podman[26763]: 2026-02-01 07:39:28.838921012 +0000 UTC m=+0.040110673 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:39:52 localhost podman[26763]: Feb 1 02:39:52 localhost podman[26763]: 2026-02-01 07:39:52.03989066 +0000 UTC m=+23.241080301 container create 9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_dubinsky, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, build-date=2025-12-08T17:28:53Z, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=) Feb 1 02:39:52 localhost systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3569238455-merged.mount: Deactivated successfully. Feb 1 02:39:52 localhost systemd[1]: Created slice Slice /machine. Feb 1 02:39:52 localhost systemd[1]: Started libpod-conmon-9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63.scope. Feb 1 02:39:52 localhost systemd[1]: Started libcrun container. 
Feb 1 02:39:52 localhost podman[26763]: 2026-02-01 07:39:52.19159582 +0000 UTC m=+23.392785471 container init 9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_dubinsky, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , RELEASE=main, io.buildah.version=1.41.4, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1764794109, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 02:39:52 localhost systemd[1]: tmp-crun.ZiZMJi.mount: Deactivated successfully. Feb 1 02:39:52 localhost podman[26763]: 2026-02-01 07:39:52.202193463 +0000 UTC m=+23.403383104 container start 9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_dubinsky, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, name=rhceph, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4) Feb 1 02:39:52 localhost podman[26763]: 2026-02-01 07:39:52.202440941 +0000 UTC m=+23.403630632 container attach 9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_dubinsky, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, 
com.redhat.component=rhceph-container, vcs-type=git, name=rhceph, GIT_CLEAN=True, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux ) Feb 1 02:39:52 localhost recursing_dubinsky[26864]: 167 167 Feb 1 02:39:52 localhost systemd[1]: libpod-9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63.scope: Deactivated successfully. Feb 1 02:39:52 localhost podman[26763]: 2026-02-01 07:39:52.207036891 +0000 UTC m=+23.408226582 container died 9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_dubinsky, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, version=7, release=1764794109, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 02:39:52 localhost podman[26869]: 2026-02-01 07:39:52.292920436 +0000 UTC m=+0.077521451 container remove 9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_dubinsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, release=1764794109, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, version=7) Feb 1 02:39:52 localhost systemd[1]: libpod-conmon-9dd0f720c1fdaf1f7036ad35bc870ad28b7fac3b3b71571322ba93cba7469a63.scope: Deactivated successfully. Feb 1 02:39:52 localhost podman[26891]: Feb 1 02:39:52 localhost podman[26891]: 2026-02-01 07:39:52.496001062 +0000 UTC m=+0.044080634 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:39:53 localhost systemd[1]: var-lib-containers-storage-overlay-3a996f53387187daa3e4a1db80976a44df1b79e05b89ec4f4b164deef32e4e0b-merged.mount: Deactivated successfully. Feb 1 02:39:55 localhost podman[26891]: 2026-02-01 07:39:55.302242857 +0000 UTC m=+2.850322419 container create 131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_lederberg, ceph=True, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 1 02:39:55 localhost systemd[1]: Started libpod-conmon-131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458.scope. Feb 1 02:39:55 localhost systemd[1]: Started libcrun container. 
Feb 1 02:39:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23cdbb1fdc29abbefbd37e3b946562e49695b142a53ed0613d65574096bedfcb/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 02:39:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23cdbb1fdc29abbefbd37e3b946562e49695b142a53ed0613d65574096bedfcb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 02:39:55 localhost podman[26891]: 2026-02-01 07:39:55.572275231 +0000 UTC m=+3.120354783 container init 131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_lederberg, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, version=7, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 02:39:55 localhost systemd[1]: tmp-crun.MJF28r.mount: Deactivated successfully. Feb 1 02:39:55 localhost podman[26891]: 2026-02-01 07:39:55.583715959 +0000 UTC m=+3.131795511 container start 131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_lederberg, ceph=True, RELEASE=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc.) 
Feb 1 02:39:55 localhost podman[26891]: 2026-02-01 07:39:55.584317087 +0000 UTC m=+3.132396839 container attach 131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_lederberg, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, ceph=True, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Feb 1 02:39:56 localhost strange_lederberg[27073]: [
Feb 1 02:39:56 localhost strange_lederberg[27073]: {
Feb 1 02:39:56 localhost strange_lederberg[27073]: "available": false,
Feb 1 02:39:56 localhost strange_lederberg[27073]: "ceph_device": false,
Feb 1 02:39:56 localhost strange_lederberg[27073]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "lsm_data": {},
Feb 1 02:39:56 localhost strange_lederberg[27073]: "lvs": [],
Feb 1 02:39:56 localhost strange_lederberg[27073]: "path": "/dev/sr0",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "rejected_reasons": [
Feb 1 02:39:56 localhost strange_lederberg[27073]: "Insufficient space (<5GB)",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "Has a FileSystem"
Feb 1 02:39:56 localhost strange_lederberg[27073]: ],
Feb 1 02:39:56 localhost strange_lederberg[27073]: "sys_api": {
Feb 1 02:39:56 localhost strange_lederberg[27073]: "actuators": null,
Feb 1 02:39:56 localhost strange_lederberg[27073]: "device_nodes": "sr0",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "human_readable_size": "482.00 KB",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "id_bus": "ata",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "model": "QEMU DVD-ROM",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "nr_requests": "2",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "partitions": {},
Feb 1 02:39:56 localhost strange_lederberg[27073]: "path": "/dev/sr0",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "removable": "1",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "rev": "2.5+",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "ro": "0",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "rotational": "1",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "sas_address": "",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "sas_device_handle": "",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "scheduler_mode": "mq-deadline",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "sectors": 0,
Feb 1 02:39:56 localhost strange_lederberg[27073]: "sectorsize": "2048",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "size": 493568.0,
Feb 1 02:39:56 localhost strange_lederberg[27073]: "support_discard": "0",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "type": "disk",
Feb 1 02:39:56 localhost strange_lederberg[27073]: "vendor": "QEMU"
Feb 1 02:39:56 localhost strange_lederberg[27073]: }
Feb 1 02:39:56 localhost strange_lederberg[27073]: }
Feb 1 02:39:56 localhost strange_lederberg[27073]: ]
Feb 1 02:39:56 localhost systemd[1]: libpod-131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458.scope: Deactivated successfully.
Feb 1 02:39:56 localhost podman[26891]: 2026-02-01 07:39:56.383850741 +0000 UTC m=+3.931930293 container died 131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_lederberg, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, version=7, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public)
Feb 1 02:39:56 localhost systemd[1]: var-lib-containers-storage-overlay-23cdbb1fdc29abbefbd37e3b946562e49695b142a53ed0613d65574096bedfcb-merged.mount: Deactivated successfully.
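The JSON array printed by the strange_lederberg container is ceph-volume inventory output, which cephadm uses to decide which block devices can become OSDs; here /dev/sr0 is rejected for having under 5 GB and an existing filesystem. A sketch of filtering such a report, assuming the JSON has been captured to a file (the field names "available", "path" and "rejected_reasons" match the log; the filename is ours):

    # Split a ceph-volume inventory report into usable devices and rejects.
    import json

    with open("inventory.json") as f:
        devices = json.load(f)

    usable = [d["path"] for d in devices if d.get("available")]
    for d in devices:
        if not d.get("available"):
            print(f'{d["path"]} rejected: {"; ".join(d["rejected_reasons"])}')
    print("candidate OSD devices:", usable or "none")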
Feb 1 02:39:56 localhost podman[28203]: 2026-02-01 07:39:56.461521756 +0000 UTC m=+0.066975230 container remove 131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_lederberg, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public) Feb 1 02:39:56 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:39:56 localhost systemd[1]: libpod-conmon-131a942655b0ec4b129666359ca543e49bd9190e7e3976f873e82c0112ec2458.scope: Deactivated successfully. Feb 1 02:39:57 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully. Feb 1 02:39:57 localhost systemd[1]: Closed Process Core Dump Socket. Feb 1 02:39:57 localhost systemd[1]: Stopping Process Core Dump Socket... Feb 1 02:39:57 localhost systemd[1]: Listening on Process Core Dump Socket. Feb 1 02:39:57 localhost systemd[1]: Reloading. Feb 1 02:39:57 localhost systemd-rc-local-generator[28282]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:39:57 localhost systemd-sysv-generator[28288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:39:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:39:57 localhost systemd[1]: Reloading. Feb 1 02:39:57 localhost systemd-rc-local-generator[28318]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:39:57 localhost systemd-sysv-generator[28322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:39:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:40:17 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 1 02:40:18 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
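The create, init, start, attach, died, remove bursts around recursing_dubinsky and strange_lederberg are the normal lifecycle of the short-lived containers cephadm launches for one-off commands against the rhceph image; the remove events coming from separate podman PIDs suggest an explicit podman rm rather than --rm. A hedged sketch of one such throwaway run (the entrypoint and flags here are assumptions; cephadm's real invocation is longer and is not shown in the log):

    # One-shot container run approximating the logged lifecycle.
    import subprocess

    report = subprocess.run(
        ["podman", "run", "--rm", "--privileged",   # device access for inventory
         "--entrypoint", "/usr/sbin/ceph-volume",   # assumed entrypoint
         "registry.redhat.io/rhceph/rhceph-7-rhel9:latest",
         "inventory", "--format", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(report)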
Feb 1 02:40:18 localhost podman[28406]: Feb 1 02:40:18 localhost podman[28406]: 2026-02-01 07:40:18.118533017 +0000 UTC m=+0.044313251 container create bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_yonath, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 02:40:18 localhost systemd[1]: Started libpod-conmon-bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2.scope. Feb 1 02:40:18 localhost systemd[1]: Started libcrun container. Feb 1 02:40:18 localhost podman[28406]: 2026-02-01 07:40:18.17672977 +0000 UTC m=+0.102510024 container init bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_yonath, release=1764794109, name=rhceph, distribution-scope=public, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7) Feb 1 02:40:18 localhost podman[28406]: 2026-02-01 07:40:18.186000332 +0000 UTC m=+0.111780596 container start bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_yonath, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1764794109, com.redhat.component=rhceph-container, 
version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z) Feb 1 02:40:18 localhost podman[28406]: 2026-02-01 07:40:18.187837808 +0000 UTC m=+0.113618082 container attach bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_yonath, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 02:40:18 localhost hardcore_yonath[28422]: 167 167 Feb 1 02:40:18 localhost systemd[1]: libpod-bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2.scope: Deactivated successfully. 
Feb 1 02:40:18 localhost podman[28406]: 2026-02-01 07:40:18.19021205 +0000 UTC m=+0.115992304 container died bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_yonath, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4)
Feb 1 02:40:18 localhost podman[28406]: 2026-02-01 07:40:18.104804449 +0000 UTC m=+0.030584693 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:18 localhost podman[28427]: 2026-02-01 07:40:18.266541035 +0000 UTC m=+0.071103656 container remove bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_yonath, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=7, build-date=2025-12-08T17:28:53Z)
Feb 1 02:40:18 localhost systemd[1]: libpod-conmon-bab68aeb1515a3f6f197da8da9d2860359dfed0ea495e8d810ead59d1461daf2.scope: Deactivated successfully.
Feb 1 02:40:18 localhost systemd[1]: Reloading.
Feb 1 02:40:18 localhost systemd-rc-local-generator[28466]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:18 localhost systemd-sysv-generator[28470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
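The hardcore_yonath container above is created, prints "167 167", and is torn down within the same second. That is the pattern of cephadm's UID/GID probe, which starts a throwaway container from the Ceph image to discover the numeric ceph user and group (167:167 in RHCS images) before deploying daemons. A minimal sketch of an equivalent probe; the stat target is an assumption, not taken from this log:

    # Hypothetical reproduction of the probe: print the owner uid/gid of /var/lib/ceph inside the image.
    podman run --rm registry.redhat.io/rhceph/rhceph-7-rhel9:latest stat -c '%u %g' /var/lib/ceph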
Feb 1 02:40:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:18 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 1 02:40:18 localhost systemd[1]: Reloading.
Feb 1 02:40:18 localhost systemd-rc-local-generator[28505]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:18 localhost systemd-sysv-generator[28508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:40:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:18 localhost systemd[1]: Reached target All Ceph clusters and services.
Feb 1 02:40:18 localhost systemd[1]: Reloading.
Feb 1 02:40:18 localhost systemd-rc-local-generator[28547]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:18 localhost systemd-sysv-generator[28550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:40:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:19 localhost systemd[1]: Reached target Ceph cluster 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 1 02:40:19 localhost systemd[1]: Reloading.
Feb 1 02:40:19 localhost systemd-rc-local-generator[28584]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:19 localhost systemd-sysv-generator[28587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:40:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:19 localhost systemd[1]: Reloading.
Feb 1 02:40:19 localhost systemd-sysv-generator[28626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:40:19 localhost systemd-rc-local-generator[28622]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:19 localhost systemd[1]: Created slice Slice /system/ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 1 02:40:19 localhost systemd[1]: Reached target System Time Set.
Feb 1 02:40:19 localhost systemd[1]: Reached target System Time Synchronized.
Feb 1 02:40:19 localhost systemd[1]: Starting Ceph crash.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 1 02:40:19 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
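Two warnings recur on every systemd reload above: insights-client-boot.service still sets the deprecated MemoryLimit= directive, and /etc/rc.d/rc.local is present but not marked executable. The usual fix for the first, without editing the packaged unit file, is a drop-in override; a sketch, with the MemoryMax= value assumed rather than read from the unit:

    # Hypothetical drop-in: clear the deprecated directive, then set its replacement.
    mkdir -p /etc/systemd/system/insights-client-boot.service.d
    cat > /etc/systemd/system/insights-client-boot.service.d/memorymax.conf <<'EOF'
    [Service]
    MemoryLimit=
    MemoryMax=1G
    EOF
    systemctl daemon-reload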
Feb 1 02:40:19 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 1 02:40:19 localhost podman[28688]:
Feb 1 02:40:19 localhost podman[28688]: 2026-02-01 07:40:19.796818565 +0000 UTC m=+0.064085833 container create f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1764794109, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_CLEAN=True)
Feb 1 02:40:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dede51c0dc27eec6dc40e9e2b6ff5fd84827c29ff1642cf38b3d7051c4f090c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dede51c0dc27eec6dc40e9e2b6ff5fd84827c29ff1642cf38b3d7051c4f090c/merged/etc/ceph/ceph.client.crash.np0005604212.keyring supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dede51c0dc27eec6dc40e9e2b6ff5fd84827c29ff1642cf38b3d7051c4f090c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:19 localhost podman[28688]: 2026-02-01 07:40:19.872021456 +0000 UTC m=+0.139288954 container init f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, maintainer=Guillaume Abrioux , release=1764794109, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 02:40:19 localhost podman[28688]: 2026-02-01 07:40:19.774882147 +0000 UTC m=+0.042149435 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:19 localhost podman[28688]: 2026-02-01 07:40:19.881360611 +0000 UTC m=+0.148627899 container start f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Feb 1 02:40:19 localhost bash[28688]: f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4
Feb 1 02:40:19 localhost systemd[1]: Started Ceph crash.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 1 02:40:19 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: INFO:ceph-crash:pinging cluster to exercise our key
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.059+0000 7f0e88711640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.059+0000 7f0e88711640 -1 AuthRegistry(0x7f0e80068980) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.060+0000 7f0e88711640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.060+0000 7f0e88711640 -1 AuthRegistry(0x7f0e88710000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.067+0000 7f0e86c87640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.068+0000 7f0e85c85640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.070+0000 7f0e86486640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: 2026-02-01T07:40:20.071+0000 7f0e88711640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: [errno 13] RADOS permission denied (error connecting to the cluster)
Feb 1 02:40:20 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212[28703]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Feb 1 02:40:29 localhost podman[28788]:
Feb 1 02:40:29 localhost podman[28788]: 2026-02-01 07:40:29.580656024 +0000 UTC m=+0.057516852 container create 0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wescoff, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container)
Feb 1 02:40:29 localhost systemd[1]: Started libpod-conmon-0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c.scope.
Feb 1 02:40:29 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:29 localhost podman[28788]: 2026-02-01 07:40:29.649210634 +0000 UTC m=+0.126071442 container init 0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wescoff, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=)
Feb 1 02:40:29 localhost podman[28788]: 2026-02-01 07:40:29.553032462 +0000 UTC m=+0.029893310 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:29 localhost podman[28788]: 2026-02-01 07:40:29.66008097 +0000 UTC m=+0.136941818 container start 0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wescoff, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, release=1764794109, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 1 02:40:29 localhost podman[28788]: 2026-02-01 07:40:29.660379838 +0000 UTC m=+0.137240656 container attach 0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wescoff, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, release=1764794109, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-12-08T17:28:53Z)
Feb 1 02:40:29 localhost awesome_wescoff[28804]: 167 167
Feb 1 02:40:29 localhost systemd[1]: libpod-0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c.scope: Deactivated successfully.
Feb 1 02:40:29 localhost podman[28788]: 2026-02-01 07:40:29.665096478 +0000 UTC m=+0.141957296 container died 0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wescoff, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 02:40:29 localhost systemd[1]: var-lib-containers-storage-overlay-e65736131975e98b61881daed8dac7fa2b0573e1b9d677c394117eb6854a1e98-merged.mount: Deactivated successfully.
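The ceph-crash startup above fails its initial ping: with no keyring in the default /etc/ceph search path it disables cephx and falls back to auth method 1 (none), while the monitors only allow method 2 (cephx), hence "handle_auth_bad_method server allowed_methods [2] but i only support [1]" and the errno 13. The daemon stays up and retries on its 600 s scan interval. A hedged way to check the crash client's credentials from an admin node; the entity name follows the hostname in the log, and the keyring path assumes the usual cephadm layout rather than anything shown here:

    # Does the cluster's key database know this crash client?
    ceph auth get client.crash.np0005604212
    # Expected keyring on the host under the cluster fsid (assumed path):
    cat /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/crash.np0005604212/keyring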
Feb 1 02:40:29 localhost podman[28809]: 2026-02-01 07:40:29.744545535 +0000 UTC m=+0.071017244 container remove 0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wescoff, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, build-date=2025-12-08T17:28:53Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1764794109, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 02:40:29 localhost systemd[1]: libpod-conmon-0589f807c0cb84a1570bc8b78b1b095478d2cf3d302481f14877a181a5602b4c.scope: Deactivated successfully.
Feb 1 02:40:29 localhost podman[28829]:
Feb 1 02:40:29 localhost podman[28829]: 2026-02-01 07:40:29.93848459 +0000 UTC m=+0.056413254 container create 81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_feynman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, architecture=x86_64, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 02:40:29 localhost systemd[1]: Started libpod-conmon-81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2.scope.
Feb 1 02:40:30 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:30 localhost podman[28829]: 2026-02-01 07:40:29.909139494 +0000 UTC m=+0.027068168 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9769681ad3fecc22b506ff58ced1fb17c46b7c29c175665ffb5564b36530479f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9769681ad3fecc22b506ff58ced1fb17c46b7c29c175665ffb5564b36530479f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9769681ad3fecc22b506ff58ced1fb17c46b7c29c175665ffb5564b36530479f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9769681ad3fecc22b506ff58ced1fb17c46b7c29c175665ffb5564b36530479f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9769681ad3fecc22b506ff58ced1fb17c46b7c29c175665ffb5564b36530479f/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:30 localhost podman[28829]: 2026-02-01 07:40:30.073999931 +0000 UTC m=+0.191928595 container init 81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_feynman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64)
Feb 1 02:40:30 localhost podman[28829]: 2026-02-01 07:40:30.085087332 +0000 UTC m=+0.203015996 container start 81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_feynman, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, release=1764794109, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph)
Feb 1 02:40:30 localhost podman[28829]: 2026-02-01 07:40:30.085358379 +0000 UTC m=+0.203287043 container attach 81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_feynman, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z)
Feb 1 02:40:30 localhost beautiful_feynman[28844]: --> passed data devices: 0 physical, 2 LVM
Feb 1 02:40:30 localhost beautiful_feynman[28844]: --> relative data size: 1.0
Feb 1 02:40:30 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 1 02:40:30 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new a13401ab-1442-4562-869f-232d5c267bec
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 1 02:40:31 localhost lvm[28898]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 1 02:40:31 localhost lvm[28898]: VG ceph_vg0 finished
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Feb 1 02:40:31 localhost beautiful_feynman[28844]: stderr: got monmap epoch 3
Feb 1 02:40:31 localhost beautiful_feynman[28844]: --> Creating keyring file for osd.1
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Feb 1 02:40:31 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid a13401ab-1442-4562-869f-232d5c267bec --setuser ceph --setgroup ceph
Feb 1 02:40:34 localhost beautiful_feynman[28844]: stderr: 2026-02-01T07:40:31.683+0000 7f3ae3901a80 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 1 02:40:34 localhost beautiful_feynman[28844]: stderr: 2026-02-01T07:40:31.683+0000 7f3ae3901a80 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Feb 1 02:40:34 localhost beautiful_feynman[28844]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Feb 1 02:40:34 localhost beautiful_feynman[28844]: --> ceph-volume lvm activate successful for osd ID: 1
Feb 1 02:40:34 localhost beautiful_feynman[28844]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 1dcce331-c114-4a5e-9a5b-585d78664562
Feb 1 02:40:34 localhost lvm[29843]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 1 02:40:34 localhost lvm[29843]: VG ceph_vg1 finished
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-4
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Feb 1 02:40:34 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-4/activate.monmap
Feb 1 02:40:35 localhost beautiful_feynman[28844]: stderr: got monmap epoch 3
Feb 1 02:40:35 localhost beautiful_feynman[28844]: --> Creating keyring file for osd.4
Feb 1 02:40:35 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/keyring
Feb 1 02:40:35 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4/
Feb 1 02:40:35 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 4 --monmap /var/lib/ceph/osd/ceph-4/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-4/ --osd-uuid 1dcce331-c114-4a5e-9a5b-585d78664562 --setuser ceph --setgroup ceph
Feb 1 02:40:37 localhost beautiful_feynman[28844]: stderr: 2026-02-01T07:40:35.455+0000 7f54f8f4fa80 -1 bluestore(/var/lib/ceph/osd/ceph-4//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 1 02:40:37 localhost beautiful_feynman[28844]: stderr: 2026-02-01T07:40:35.455+0000 7f54f8f4fa80 -1 bluestore(/var/lib/ceph/osd/ceph-4/) _read_fsid unparsable uuid
Feb 1 02:40:37 localhost beautiful_feynman[28844]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Feb 1 02:40:37 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Feb 1 02:40:37 localhost beautiful_feynman[28844]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-4 --no-mon-config
Feb 1 02:40:38 localhost beautiful_feynman[28844]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-4/block
Feb 1 02:40:38 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block
Feb 1 02:40:38 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 1 02:40:38 localhost beautiful_feynman[28844]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4
Feb 1 02:40:38 localhost beautiful_feynman[28844]: --> ceph-volume lvm activate successful for osd ID: 4
Feb 1 02:40:38 localhost beautiful_feynman[28844]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Feb 1 02:40:38 localhost systemd[1]: libpod-81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2.scope: Deactivated successfully.
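The beautiful_feynman run above is ceph-volume preparing and activating two BlueStore OSDs from pre-built logical volumes: osd.1 on ceph_vg0/ceph_lv0 and osd.4 on ceph_vg1/ceph_lv1. The "_read_bdev_label ... Malformed input" and "_read_fsid unparsable uuid" stderr lines are expected when --mkfs probes a still-blank device, and both runs end with "lvm create successful". A sketch of the equivalent direct invocation, inferred from the logged commands rather than quoted from them:

    # One BlueStore OSD per logical volume, matching the log.
    ceph-volume lvm create --bluestore --data ceph_vg0/ceph_lv0
    ceph-volume lvm create --bluestore --data ceph_vg1/ceph_lv1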
Feb 1 02:40:38 localhost systemd[1]: libpod-81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2.scope: Consumed 3.694s CPU time.
Feb 1 02:40:38 localhost podman[30757]: 2026-02-01 07:40:38.156751447 +0000 UTC m=+0.040598482 container died 81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_feynman, vendor=Red Hat, Inc., RELEASE=main, version=7, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, release=1764794109, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4)
Feb 1 02:40:38 localhost systemd[1]: var-lib-containers-storage-overlay-9769681ad3fecc22b506ff58ced1fb17c46b7c29c175665ffb5564b36530479f-merged.mount: Deactivated successfully.
Feb 1 02:40:38 localhost podman[30757]: 2026-02-01 07:40:38.190196986 +0000 UTC m=+0.074044001 container remove 81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_feynman, RELEASE=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, description=Red Hat Ceph Storage 7, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, version=7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 1 02:40:38 localhost systemd[1]: libpod-conmon-81aa3ef0b73e0712ede9c01a5cf047849f6484a06073eda38c25086d3915a0c2.scope: Deactivated successfully.
Feb 1 02:40:38 localhost podman[30838]:
Feb 1 02:40:39 localhost podman[30838]: 2026-02-01 07:40:39.003022155 +0000 UTC m=+0.078763571 container create 63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_raman, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, release=1764794109, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 02:40:39 localhost systemd[1]: Started libpod-conmon-63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624.scope.
Feb 1 02:40:39 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:39 localhost podman[30838]: 2026-02-01 07:40:39.061199412 +0000 UTC m=+0.136940858 container init 63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_raman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., architecture=x86_64, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=)
Feb 1 02:40:39 localhost podman[30838]: 2026-02-01 07:40:39.069337159 +0000 UTC m=+0.145078575 container start 63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_raman, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7)
Feb 1 02:40:39 localhost podman[30838]: 2026-02-01 07:40:38.97012399 +0000 UTC m=+0.045865446 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:39 localhost podman[30838]: 2026-02-01 07:40:39.069611826 +0000 UTC m=+0.145353282 container attach 63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_raman, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, ceph=True, RELEASE=main, GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 02:40:39 localhost serene_raman[30851]: 167 167
Feb 1 02:40:39 localhost systemd[1]: libpod-63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624.scope: Deactivated successfully.
Feb 1 02:40:39 localhost podman[30838]: 2026-02-01 07:40:39.075193038 +0000 UTC m=+0.150934464 container died 63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_raman, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 02:40:39 localhost podman[30858]: 2026-02-01 07:40:39.165797378 +0000 UTC m=+0.076028581 container remove 63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_raman, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, release=1764794109, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 1 02:40:39 localhost systemd[1]: var-lib-containers-storage-overlay-5e087f9f45712b8182d3597f74aaf74b70269684f154b460bf2f21a9fce93810-merged.mount: Deactivated successfully.
Feb 1 02:40:39 localhost systemd[1]: libpod-conmon-63a5f677c212ba12b216b05fd307ef3f26d3a6b06769d42911b7c0fb402fb624.scope: Deactivated successfully.
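The next container (stoic_villani, below) dumps a JSON inventory of the logical volumes that were just tagged for the new OSDs; keys such as ceph.osd_id and ceph.osd_fsid come from the LV tags written during prepare. The same report can be produced directly with ceph-volume:

    # Inventory of Ceph-tagged logical volumes, as JSON.
    ceph-volume lvm list --format json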
Feb 1 02:40:39 localhost podman[30879]:
Feb 1 02:40:39 localhost podman[30879]: 2026-02-01 07:40:39.379802553 +0000 UTC m=+0.076064853 container create 303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, GIT_CLEAN=True, version=7, architecture=x86_64, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Feb 1 02:40:39 localhost systemd[1]: Started libpod-conmon-303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815.scope.
Feb 1 02:40:39 localhost podman[30879]: 2026-02-01 07:40:39.351201157 +0000 UTC m=+0.047463477 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:39 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcfac403a390230ff1452bbbd142d261c01e53fc4d04463f73003a4df1e76c52/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcfac403a390230ff1452bbbd142d261c01e53fc4d04463f73003a4df1e76c52/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcfac403a390230ff1452bbbd142d261c01e53fc4d04463f73003a4df1e76c52/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:39 localhost podman[30879]: 2026-02-01 07:40:39.502513449 +0000 UTC m=+0.198775749 container init 303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public)
Feb 1 02:40:39 localhost podman[30879]: 2026-02-01 07:40:39.513209439 +0000 UTC m=+0.209471739 container start 303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1764794109)
Feb 1 02:40:39 localhost podman[30879]: 2026-02-01 07:40:39.513480587 +0000 UTC m=+0.209742937 container attach 303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container)
Feb 1 02:40:39 localhost stoic_villani[30894]: {
Feb 1 02:40:39 localhost stoic_villani[30894]: "1": [
Feb 1 02:40:39 localhost stoic_villani[30894]: {
Feb 1 02:40:39 localhost stoic_villani[30894]: "devices": [
Feb 1 02:40:39 localhost stoic_villani[30894]: "/dev/loop3"
Feb 1 02:40:39 localhost stoic_villani[30894]: ],
Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_name": "ceph_lv0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_path": "/dev/ceph_vg0/ceph_lv0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_size": "7511998464",
Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=p5jz4R-7HYA-VQNc-u2Up-sTzM-oKP7-5I1KkT,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=33fac0b9-80c7-560f-918a-c92d3021ca1e,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=a13401ab-1442-4562-869f-232d5c267bec,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_uuid": "p5jz4R-7HYA-VQNc-u2Up-sTzM-oKP7-5I1KkT",
Feb 1 02:40:39 localhost stoic_villani[30894]: "name": "ceph_lv0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "path": "/dev/ceph_vg0/ceph_lv0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "tags": {
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.block_uuid": "p5jz4R-7HYA-VQNc-u2Up-sTzM-oKP7-5I1KkT",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.cephx_lockbox_secret": "",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.cluster_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.cluster_name": "ceph",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.crush_device_class": "",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.encrypted": "0",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.osd_fsid": "a13401ab-1442-4562-869f-232d5c267bec",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.osd_id": "1",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.osdspec_affinity": "default_drive_group",
Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.type": "block",
Feb
1 02:40:39 localhost stoic_villani[30894]: "ceph.vdo": "0" Feb 1 02:40:39 localhost stoic_villani[30894]: }, Feb 1 02:40:39 localhost stoic_villani[30894]: "type": "block", Feb 1 02:40:39 localhost stoic_villani[30894]: "vg_name": "ceph_vg0" Feb 1 02:40:39 localhost stoic_villani[30894]: } Feb 1 02:40:39 localhost stoic_villani[30894]: ], Feb 1 02:40:39 localhost stoic_villani[30894]: "4": [ Feb 1 02:40:39 localhost stoic_villani[30894]: { Feb 1 02:40:39 localhost stoic_villani[30894]: "devices": [ Feb 1 02:40:39 localhost stoic_villani[30894]: "/dev/loop4" Feb 1 02:40:39 localhost stoic_villani[30894]: ], Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_name": "ceph_lv1", Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_path": "/dev/ceph_vg1/ceph_lv1", Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_size": "7511998464", Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=FvxiXn-oJ9G-T2of-RkGH-EBOi-u270-56tZl7,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=33fac0b9-80c7-560f-918a-c92d3021ca1e,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=1dcce331-c114-4a5e-9a5b-585d78664562,ceph.osd_id=4,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Feb 1 02:40:39 localhost stoic_villani[30894]: "lv_uuid": "FvxiXn-oJ9G-T2of-RkGH-EBOi-u270-56tZl7", Feb 1 02:40:39 localhost stoic_villani[30894]: "name": "ceph_lv1", Feb 1 02:40:39 localhost stoic_villani[30894]: "path": "/dev/ceph_vg1/ceph_lv1", Feb 1 02:40:39 localhost stoic_villani[30894]: "tags": { Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.block_uuid": "FvxiXn-oJ9G-T2of-RkGH-EBOi-u270-56tZl7", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.cephx_lockbox_secret": "", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.cluster_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.cluster_name": "ceph", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.crush_device_class": "", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.encrypted": "0", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.osd_fsid": "1dcce331-c114-4a5e-9a5b-585d78664562", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.osd_id": "4", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.osdspec_affinity": "default_drive_group", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.type": "block", Feb 1 02:40:39 localhost stoic_villani[30894]: "ceph.vdo": "0" Feb 1 02:40:39 localhost stoic_villani[30894]: }, Feb 1 02:40:39 localhost stoic_villani[30894]: "type": "block", Feb 1 02:40:39 localhost stoic_villani[30894]: "vg_name": "ceph_vg1" Feb 1 02:40:39 localhost stoic_villani[30894]: } Feb 1 02:40:39 localhost stoic_villani[30894]: ] Feb 1 02:40:39 localhost stoic_villani[30894]: } Feb 1 02:40:39 localhost systemd[1]: libpod-303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815.scope: Deactivated successfully. 
Feb 1 02:40:39 localhost podman[30879]: 2026-02-01 07:40:39.867137287 +0000 UTC m=+0.563399597 container died 303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.openshift.expose-services=, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public)
Feb 1 02:40:39 localhost podman[30903]: 2026-02-01 07:40:39.97163931 +0000 UTC m=+0.087980814 container remove 303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4)
Feb 1 02:40:39 localhost systemd[1]: libpod-conmon-303752d96dd853cb59de4892008ed039802e64a5dabfb59244f556083bd60815.scope: Deactivated successfully.
Feb 1 02:40:40 localhost systemd[1]: tmp-crun.VRIHUR.mount: Deactivated successfully.
Feb 1 02:40:40 localhost systemd[1]: var-lib-containers-storage-overlay-dcfac403a390230ff1452bbbd142d261c01e53fc4d04463f73003a4df1e76c52-merged.mount: Deactivated successfully.
Feb 1 02:40:40 localhost podman[30987]:
Feb 1 02:40:40 localhost podman[30987]: 2026-02-01 07:40:40.754628262 +0000 UTC m=+0.067028863 container create 49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_jang, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_BRANCH=main, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7)
Feb 1 02:40:40 localhost systemd[1]: Started libpod-conmon-49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392.scope.
Feb 1 02:40:40 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:40 localhost podman[30987]: 2026-02-01 07:40:40.820785071 +0000 UTC m=+0.133185682 container init 49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_jang, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7)
Feb 1 02:40:40 localhost podman[30987]: 2026-02-01 07:40:40.72622112 +0000 UTC m=+0.038621761 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:40 localhost podman[30987]: 2026-02-01 07:40:40.83014746 +0000 UTC m=+0.142548061 container start 49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_jang, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, version=7, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Feb 1 02:40:40 localhost podman[30987]: 2026-02-01 07:40:40.830437097 +0000 UTC m=+0.142837748 container attach 49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_jang, name=rhceph, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=1764794109, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, distribution-scope=public)
Feb 1 02:40:40 localhost elated_jang[31002]: 167 167
Feb 1 02:40:40 localhost systemd[1]: libpod-49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392.scope: Deactivated successfully.
Feb 1 02:40:40 localhost podman[30987]: 2026-02-01 07:40:40.83290664 +0000 UTC m=+0.145307281 container died 49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_jang, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, vcs-type=git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Feb 1 02:40:40 localhost podman[31007]: 2026-02-01 07:40:40.92270687 +0000 UTC m=+0.076439033 container remove 49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_jang, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 02:40:40 localhost systemd[1]: libpod-conmon-49cb4804a2c51b21dad642fe6d11c088822ee72ff4112e82828e097d81f71392.scope: Deactivated successfully.
Feb 1 02:40:41 localhost systemd[1]: var-lib-containers-storage-overlay-f322a203264fc68a7ee1e95f876bd66e26d82ab4348d8cb82bd598b04d23c9f5-merged.mount: Deactivated successfully.
Feb 1 02:40:41 localhost podman[31036]:
Feb 1 02:40:41 localhost podman[31036]: 2026-02-01 07:40:41.251021046 +0000 UTC m=+0.066659154 container create c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test, io.openshift.expose-services=, ceph=True, release=1764794109, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 02:40:41 localhost systemd[1]: Started libpod-conmon-c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4.scope.
Feb 1 02:40:41 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:41 localhost podman[31036]: 2026-02-01 07:40:41.225751705 +0000 UTC m=+0.041389813 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98eadab8befe93d231531e020556caaf786fbb422851022d899eaee01816f7c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98eadab8befe93d231531e020556caaf786fbb422851022d899eaee01816f7c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98eadab8befe93d231531e020556caaf786fbb422851022d899eaee01816f7c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98eadab8befe93d231531e020556caaf786fbb422851022d899eaee01816f7c5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98eadab8befe93d231531e020556caaf786fbb422851022d899eaee01816f7c5/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:41 localhost podman[31036]: 2026-02-01 07:40:41.386339622 +0000 UTC m=+0.201977730 container init c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, distribution-scope=public, GIT_CLEAN=True)
Feb 1 02:40:41 localhost podman[31036]: 2026-02-01 07:40:41.396543041 +0000 UTC m=+0.212181139 container start c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.41.4, release=1764794109, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git)
Feb 1 02:40:41 localhost podman[31036]: 2026-02-01 07:40:41.396776627 +0000 UTC m=+0.212414775 container attach c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test, vcs-type=git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, GIT_BRANCH=main)
Feb 1 02:40:41 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test[31051]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 1 02:40:41 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test[31051]: [--no-systemd] [--no-tmpfs]
Feb 1 02:40:41 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test[31051]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 1 02:40:41 localhost systemd[1]: libpod-c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4.scope: Deactivated successfully.
Feb 1 02:40:41 localhost podman[31036]: 2026-02-01 07:40:41.642404384 +0000 UTC m=+0.458042522 container died c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1764794109, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, version=7, vcs-type=git, architecture=x86_64, ceph=True)
Feb 1 02:40:41 localhost podman[31056]: 2026-02-01 07:40:41.738028673 +0000 UTC m=+0.082639170 container remove c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1764794109, GIT_BRANCH=main, vcs-type=git, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 02:40:41 localhost systemd-journald[618]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 1 02:40:41 localhost systemd-journald[618]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 1 02:40:41 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 1 02:40:41 localhost systemd[1]: libpod-conmon-c917351a2c5d96c78d548e810f048e4616881fe83a26e7e9453aa108ec06d6c4.scope: Deactivated successfully.
Feb 1 02:40:41 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 1 02:40:42 localhost systemd[1]: Reloading.
Feb 1 02:40:42 localhost systemd-rc-local-generator[31113]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:42 localhost systemd-sysv-generator[31118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:40:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:42 localhost systemd[1]: var-lib-containers-storage-overlay-98eadab8befe93d231531e020556caaf786fbb422851022d899eaee01816f7c5-merged.mount: Deactivated successfully.
Feb 1 02:40:42 localhost systemd[1]: Reloading.
Feb 1 02:40:42 localhost systemd-rc-local-generator[31158]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:40:42 localhost systemd-sysv-generator[31162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:40:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:40:42 localhost systemd[1]: Starting Ceph osd.1 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 1 02:40:42 localhost podman[31217]:
Feb 1 02:40:42 localhost podman[31217]: 2026-02-01 07:40:42.916793763 +0000 UTC m=+0.077879378 container create aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., release=1764794109, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4)
Feb 1 02:40:42 localhost systemd[1]: tmp-crun.w9k4xK.mount: Deactivated successfully.
Feb 1 02:40:42 localhost systemd[1]: Started libcrun container.
Feb 1 02:40:42 localhost podman[31217]: 2026-02-01 07:40:42.88477837 +0000 UTC m=+0.045864015 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/845a00765aff3539090260b6acbc908d4476f6e8d334c085f672c4fad324cea4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/845a00765aff3539090260b6acbc908d4476f6e8d334c085f672c4fad324cea4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/845a00765aff3539090260b6acbc908d4476f6e8d334c085f672c4fad324cea4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/845a00765aff3539090260b6acbc908d4476f6e8d334c085f672c4fad324cea4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/845a00765aff3539090260b6acbc908d4476f6e8d334c085f672c4fad324cea4/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:43 localhost podman[31217]: 2026-02-01 07:40:43.049220786 +0000 UTC m=+0.210306411 container init aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph)
Feb 1 02:40:43 localhost podman[31217]: 2026-02-01 07:40:43.060704767 +0000 UTC m=+0.221790412 container start aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, release=1764794109, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 1 02:40:43 localhost podman[31217]: 2026-02-01 07:40:43.061120438 +0000 UTC m=+0.222206063 container attach aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Feb 1 02:40:43 localhost bash[31217]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 1 02:40:43 localhost bash[31217]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-1 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 1 02:40:43 localhost bash[31217]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 1 02:40:43 localhost bash[31217]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Feb 1 02:40:43 localhost bash[31217]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-1/block
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Feb 1 02:40:43 localhost bash[31217]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Feb 1 02:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate[31232]: --> ceph-volume raw activate successful for osd ID: 1
Feb 1 02:40:43 localhost bash[31217]: --> ceph-volume raw activate successful for osd ID: 1
Feb 1 02:40:43 localhost systemd[1]: libpod-aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474.scope: Deactivated successfully.
Feb 1 02:40:43 localhost podman[31217]: 2026-02-01 07:40:43.790701533 +0000 UTC m=+0.951787158 container died aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, build-date=2025-12-08T17:28:53Z, release=1764794109, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 02:40:43 localhost systemd[1]: tmp-crun.9nWcWv.mount: Deactivated successfully.
Feb 1 02:40:43 localhost systemd[1]: var-lib-containers-storage-overlay-845a00765aff3539090260b6acbc908d4476f6e8d334c085f672c4fad324cea4-merged.mount: Deactivated successfully.
Feb 1 02:40:43 localhost podman[31351]: 2026-02-01 07:40:43.880110473 +0000 UTC m=+0.079497839 container remove aac526544f37439642a50dcf5ebeb451343d266dd03d3d6e0dc5ed9a1096f474 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1-activate, build-date=2025-12-08T17:28:53Z, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=7, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.buildah.version=1.41.4)
Feb 1 02:40:44 localhost podman[31412]:
Feb 1 02:40:44 localhost podman[31412]: 2026-02-01 07:40:44.224040016 +0000 UTC m=+0.085604536 container create f04dc1b91f62e6580edf84afe897f274e11c3f31cf835118e096b110324668f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1, architecture=x86_64, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git)
Feb 1 02:40:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2385024733940209a2621364d3a1f61321fbf90250a08c2829c01f4998cfff/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:44 localhost podman[31412]: 2026-02-01 07:40:44.18992237 +0000 UTC m=+0.051486909 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2385024733940209a2621364d3a1f61321fbf90250a08c2829c01f4998cfff/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2385024733940209a2621364d3a1f61321fbf90250a08c2829c01f4998cfff/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2385024733940209a2621364d3a1f61321fbf90250a08c2829c01f4998cfff/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a2385024733940209a2621364d3a1f61321fbf90250a08c2829c01f4998cfff/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:44 localhost podman[31412]: 2026-02-01 07:40:44.343399608 +0000 UTC m=+0.204964127 container init f04dc1b91f62e6580edf84afe897f274e11c3f31cf835118e096b110324668f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 02:40:44 localhost podman[31412]: 2026-02-01 07:40:44.354729565 +0000 UTC m=+0.216294084 container start f04dc1b91f62e6580edf84afe897f274e11c3f31cf835118e096b110324668f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.buildah.version=1.41.4)
Feb 1 02:40:44 localhost bash[31412]: f04dc1b91f62e6580edf84afe897f274e11c3f31cf835118e096b110324668f8
Feb 1 02:40:44 localhost systemd[1]: Started Ceph osd.1 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 1 02:40:44 localhost ceph-osd[31431]: set uid:gid to 167:167 (ceph:ceph) Feb 1 02:40:44 localhost ceph-osd[31431]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Feb 1 02:40:44 localhost ceph-osd[31431]: pidfile_write: ignore empty --pid-file Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:44 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:44 localhost ceph-osd[31431]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) close Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) close Feb 1 02:40:44 localhost ceph-osd[31431]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal Feb 1 02:40:44 localhost ceph-osd[31431]: load: jerasure load: lrc Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:44 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:44 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) close Feb 1 02:40:45 localhost podman[31522]: Feb 1 02:40:45 localhost podman[31522]: 2026-02-01 07:40:45.187000928 +0000 UTC m=+0.080769332 container create 3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_chebyshev, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.buildah.version=1.41.4, architecture=x86_64, version=7, build-date=2025-12-08T17:28:53Z, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vendor=Red Hat, 
Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7) Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) close Feb 1 02:40:45 localhost systemd[1]: Started libpod-conmon-3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26.scope. Feb 1 02:40:45 localhost systemd[1]: Started libcrun container. Feb 1 02:40:45 localhost podman[31522]: 2026-02-01 07:40:45.15401359 +0000 UTC m=+0.047782024 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:45 localhost podman[31522]: 2026-02-01 07:40:45.2603353 +0000 UTC m=+0.154103704 container init 3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_chebyshev, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1764794109, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 02:40:45 localhost podman[31522]: 2026-02-01 07:40:45.270163359 +0000 UTC m=+0.163931763 container start 3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_chebyshev, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, 
org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, version=7, RELEASE=main) Feb 1 02:40:45 localhost podman[31522]: 2026-02-01 07:40:45.27057783 +0000 UTC m=+0.164346284 container attach 3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_chebyshev, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, vendor=Red Hat, Inc., release=1764794109, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7) Feb 1 02:40:45 localhost fervent_chebyshev[31541]: 167 167 Feb 1 02:40:45 localhost systemd[1]: libpod-3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26.scope: Deactivated successfully. 
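[Note] The repeated "bdev ... ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument" entries above are BlueStore probing for per-file write-lifetime hints each time it opens its block device. On this kernel the write-hint fcntl is not supported for a raw block device, so the call returns EINVAL; BlueStore logs the failure and carries on, so these lines are noise rather than a fault. The sketch below reproduces the same probe; it is not Ceph's actual code, the constants are copied from <linux/fcntl.h>, and the default path merely reuses the OSD device path from the log.

```c
/* Minimal sketch (not Ceph's code): issue the same per-fd write-lifetime
 * hint that BlueStore attempts on open. Constants mirror <linux/fcntl.h>;
 * on devices/kernels without write-hint support the call fails with
 * EINVAL, i.e. the "(22) Invalid argument" seen in the log above.
 * Needs read-write access to the target path. */
#include <errno.h>
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#ifndef F_SET_FILE_RW_HINT
#define F_SET_FILE_RW_HINT (1024 + 14)   /* F_LINUX_SPECIFIC_BASE + 14 */
#endif
#ifndef RWH_WRITE_LIFE_SHORT
#define RWH_WRITE_LIFE_SHORT 2
#endif

int main(int argc, char **argv)
{
    /* Default path reused from the log purely for illustration. */
    const char *path = argc > 1 ? argv[1] : "/var/lib/ceph/osd/ceph-1/block";
    int fd = open(path, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    uint64_t hint = RWH_WRITE_LIFE_SHORT;
    if (fcntl(fd, F_SET_FILE_RW_HINT, &hint) < 0)
        fprintf(stderr, "F_SET_FILE_RW_HINT on %s failed: (%d) %s\n",
                path, errno, strerror(errno));
    return 0;
}
```

Run against a regular file on a filesystem with write-hint support the fcntl should succeed; run against the raw block device it should reproduce the "(22) Invalid argument" above.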
Feb 1 02:40:45 localhost podman[31522]: 2026-02-01 07:40:45.27451212 +0000 UTC m=+0.168280574 container died 3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_chebyshev, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, ceph=True, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64) Feb 1 02:40:45 localhost podman[31546]: 2026-02-01 07:40:45.362328369 +0000 UTC m=+0.078549045 container remove 3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_chebyshev, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64) Feb 1 02:40:45 localhost systemd[1]: libpod-conmon-3d86898e649abba4f70ea9a95054cb22b38bddb7f0e2c6a60aa766a2e5f8eb26.scope: Deactivated successfully. 
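[Note] The mClockScheduler entry that follows reports two derived values: osd_bandwidth_capacity_per_shard 157286400.00 bytes/second and osd_bandwidth_cost_per_io: 499321.90 bytes/io. Both are consistent with the Reef HDD defaults (the device is flagged as rotational earlier in the log): 150 MiB/s of assumed sequential bandwidth is exactly 157286400 bytes/s, and dividing that by the default osd_mclock_max_capacity_iops_hdd of 315 IOPS gives 157286400 / 315 = 499321.90 bytes per IO. The check below is a back-of-envelope sketch under those assumed defaults, not values read from this cluster's configuration.

```c
/* Back-of-envelope check (assumes Reef defaults, not this cluster's
 * actual settings): 150 MiB/s sequential bandwidth and 315 HDD IOPS
 * reproduce the two figures in the mClockScheduler line below. */
#include <stdio.h>

int main(void)
{
    double bw_per_shard = 150.0 * 1024 * 1024; /* 157286400.00 bytes/second */
    double iops_hdd     = 315.0;               /* osd_mclock_max_capacity_iops_hdd default */

    /* 157286400 / 315 = 499321.90 bytes/io, matching the log. */
    printf("osd_bandwidth_capacity_per_shard %.2f bytes/second\n", bw_per_shard);
    printf("osd_bandwidth_cost_per_io: %.2f bytes/io\n", bw_per_shard / iops_hdd);
    return 0;
}
```

If the OSD had been given a measured IOPS capacity instead (for example via ceph config set osd.1 osd_mclock_max_capacity_iops_hdd), the cost-per-io figure would shift accordingly.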
Feb 1 02:40:45 localhost ceph-osd[31431]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcce00 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs mount Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs mount shared_bdev_used = 0 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: RocksDB version: 7.9.2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Git sha 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Compile date 2025-09-23 00:00:00 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DB SUMMARY Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DB Session ID: 2D8G26JAG7CV1PNWRJ59 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: CURRENT file: CURRENT Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: IDENTITY file: IDENTITY Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.error_if_exists: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.create_if_missing: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.env: 0x561d73060c40 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.fs: LegacyFileSystem Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.info_log: 0x561d73d62860 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_file_opening_threads: 16 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.statistics: (nil) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_fsync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_log_file_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.log_file_time_to_roll: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.keep_log_file_num: 1000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.recycle_log_file_num: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_fallocate: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_mmap_reads: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_mmap_writes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_direct_reads: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.create_missing_column_families: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.db_log_dir: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_dir: db.wal Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_cache_numshardbits: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.advise_random_on_open: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.db_write_buffer_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_manager: 0x561d72db6140 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_adaptive_mutex: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.rate_limiter: (nil) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_recovery_mode: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_thread_tracking: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_pipelined_write: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.unordered_write: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.row_cache: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_filter: None Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_ingest_behind: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.two_write_queues: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.manual_wal_flush: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_compression: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.atomic_flush: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.persist_stats_to_disk: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.log_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.best_efforts_recovery: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_data_in_errors: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.db_host_id: __hostname__ Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enforce_single_del_contracts: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_background_jobs: 4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_background_compactions: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_subcompactions: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.delayed_write_rate : 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_total_wal_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.stats_dump_period_sec: 600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.stats_persist_period_sec: 600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_open_files: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bytes_per_sync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_background_flushes: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Compression algorithms supported: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kZSTD supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kXpressCompression supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kBZip2Compression supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kLZ4Compression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kZlibCompression supported: 1 Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kSnappyCompression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DMutex implementation: pthread_mutex_t Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 
32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: 
rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 
274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62a20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da4850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62c40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: 
rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62c40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, 
name: O-2) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62c40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 64e47f02-2b38-4d1a-8895-e3a810f50214 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645548272, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645548974, "job": 1, "event": "recovery_finished"} Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) 
_open_super_meta min_alloc_size 0x1000 Feb 1 02:40:45 localhost ceph-osd[31431]: freelist init Feb 1 02:40:45 localhost ceph-osd[31431]: freelist _read_cfg Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs umount Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) close Feb 1 02:40:45 localhost podman[31767]: Feb 1 02:40:45 localhost podman[31767]: 2026-02-01 07:40:45.686211344 +0000 UTC m=+0.074226815 container create cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_CLEAN=True, RELEASE=main, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 02:40:45 localhost systemd[1]: Started libpod-conmon-cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1.scope. Feb 1 02:40:45 localhost systemd[1]: Started libcrun container. 
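The `_init_alloc` entry above summarizes the BlueStore allocator state at mount: capacity 0x1bfc00000, free 0x1bfbfd000, block size 0x1000 (4 KiB), fragmentation 5.5e-07. Those hex figures can be cross-checked against the decimal size the same device reports when it is reopened a few entries later (`open size 7511998464 (0x1bfc00000, 7.0 GiB)`). A minimal sketch of that cross-check, assuming only the line format shown in this log (the regexes are illustrative, not part of any Ceph tooling):

```python
import re

# Sample fragments copied verbatim from the log above.
alloc_line = ("_init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, "
              "capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, "
              "fragmentation 5.5e-07")
open_line = ("open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) "
             "rotational device, discard supported")

# Illustrative regexes; field names follow the log text, not a documented schema.
cap, blk, free = (int(v, 16) for v in
                  re.search(r"capacity (0x[0-9a-f]+), block size (0x[0-9a-f]+), "
                            r"free (0x[0-9a-f]+)", alloc_line).groups())
size = int(re.search(r"open size (\d+)", open_line).group(1))

assert cap == size == 7511998464                    # hex and decimal views agree
print(f"capacity  {cap / 2**30:.1f} GiB")           # -> 7.0 GiB
print(f"allocated {(cap - free) / 1024:.0f} KiB")   # -> 12 KiB in use at mount
```

The same 0x1bfc00000 capacity appears again when bluefs adds the block device below, so the allocator, the bdev open path, and bluefs all agree on the device size.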
Feb 1 02:40:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cb8505aa568ca036823035944225207a4fbb4578a8033a46deb47dfc3b030d5/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:45 localhost podman[31767]: 2026-02-01 07:40:45.657378172 +0000 UTC m=+0.045393693 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cb8505aa568ca036823035944225207a4fbb4578a8033a46deb47dfc3b030d5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cb8505aa568ca036823035944225207a4fbb4578a8033a46deb47dfc3b030d5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cb8505aa568ca036823035944225207a4fbb4578a8033a46deb47dfc3b030d5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cb8505aa568ca036823035944225207a4fbb4578a8033a46deb47dfc3b030d5/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument Feb 1 02:40:45 localhost ceph-osd[31431]: bdev(0x561d72dcd180 /var/lib/ceph/osd/ceph-1/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 7.0 GiB Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs mount Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Feb 1 02:40:45 localhost ceph-osd[31431]: bluefs mount shared_bdev_used = 4718592 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: RocksDB version: 7.9.2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Git sha 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Compile date 2025-09-23 00:00:00 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DB SUMMARY Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DB Session ID: 2D8G26JAG7CV1PNWRJ58 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: CURRENT file: CURRENT Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: IDENTITY file: IDENTITY Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.error_if_exists: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.create_if_missing: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_checks: 1 Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.env: 0x561d73be1f80 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.fs: LegacyFileSystem Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.info_log: 0x561d73df2340 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_file_opening_threads: 16 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.statistics: (nil) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_fsync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_log_file_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.log_file_time_to_roll: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.keep_log_file_num: 1000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.recycle_log_file_num: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_fallocate: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_mmap_reads: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_mmap_writes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_direct_reads: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.create_missing_column_families: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.db_log_dir: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_dir: db.wal Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_cache_numshardbits: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.advise_random_on_open: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.db_write_buffer_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_manager: 0x561d72db6140 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.use_adaptive_mutex: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.rate_limiter: (nil) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_recovery_mode: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_thread_tracking: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_pipelined_write: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.unordered_write: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.enable_write_thread_adaptive_yield: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.row_cache: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_ingest_behind: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.two_write_queues: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.manual_wal_flush: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_compression: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.atomic_flush: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.persist_stats_to_disk: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.log_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.best_efforts_recovery: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.allow_data_in_errors: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.db_host_id: __hostname__ Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enforce_single_del_contracts: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_background_jobs: 4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_background_compactions: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_subcompactions: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.delayed_write_rate : 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_total_wal_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.stats_dump_period_sec: 600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.stats_persist_period_sec: 600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_open_files: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bytes_per_sync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_background_flushes: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Compression algorithms supported: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kZSTD supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
#011kXpressCompression supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kBZip2Compression supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kLZ4Compression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kZlibCompression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: #011kSnappyCompression supported: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DMutex implementation: pthread_mutex_t Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: 
rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost 
ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost podman[31767]: 2026-02-01 07:40:45.810777346 +0000 UTC m=+0.198792817 container init cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1764794109, architecture=x86_64, vendor=Red Hat, Inc.) 
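Each column family in the dumps above is configured with the same LSM shape, and if the leveled-compaction knobs are read the usual way, the logged values imply concrete sizing: 16 MiB memtables merged six at a time mean roughly 96 MiB accumulates before a flush, `max_write_buffer_number: 64` caps memtable memory at 1 GiB per column family, and with `level_compaction_dynamic_level_bytes: 0` the per-level targets grow from the 1 GiB base by the multiplier of 8. A back-of-the-envelope sketch using only numbers copied from the Options dump (the arithmetic is the standard RocksDB sizing rule, shown as an interpretation, not as Ceph- or RocksDB-authoritative tooling):

```python
# Values copied verbatim from the Options dump logged above.
write_buffer_size = 16777216                 # 16 MiB per memtable
min_write_buffer_number_to_merge = 6
max_write_buffer_number = 64
max_bytes_for_level_base = 1073741824        # 1 GiB target for L1
max_bytes_for_level_multiplier = 8.0         # addtl[i] are all 1, so this applies as-is
num_levels = 7

MiB, GiB = 2**20, 2**30

# Data buffered in memtables before a flush to L0 is triggered.
flush_batch = write_buffer_size * min_write_buffer_number_to_merge
print(f"flush batch:  {flush_batch / MiB:.0f} MiB")                   # 96 MiB

# Worst-case memtable memory for one column family.
memtable_cap = write_buffer_size * max_write_buffer_number
print(f"memtable cap: {memtable_cap / GiB:.0f} GiB")                  # 1 GiB

# Per-level targets under static leveling (level_compaction_dynamic_level_bytes: 0).
for level in range(1, num_levels):
    target = max_bytes_for_level_base * max_bytes_for_level_multiplier ** (level - 1)
    print(f"L{level} target: {target / GiB:g} GiB")                   # 1, 8, 64, ...
```

These targets dwarf the tiny OSD above (7.0 GiB device, one SST file), which is expected: the options are cluster-wide defaults, and on a device this small compaction never leaves the first level.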
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel
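Those compaction numbers describe the LSM geometry for this column family: with kCompactionStyleLevel, max_bytes_for_level_base 1073741824 (1 GiB), multiplier 8 and level_compaction_dynamic_level_bytes off, the static level targets are 1 GiB at L1, 8 GiB at L2, 64 GiB at L3 and so on up to num_levels 7, while L0 is driven by file count (compact at 8 files, throttle writes at 20, stop at 36). A quick check using only values from the dump:

    # Level-capacity targets implied by the logged options.
    base = 1_073_741_824          # Options.max_bytes_for_level_base (1 GiB)
    multiplier = 8                # Options.max_bytes_for_level_multiplier
    num_levels = 7                # Options.num_levels

    for level in range(1, num_levels):
        print(f"L{level} target: {base * multiplier ** (level - 1) / 2**30:g} GiB")
    # -> L1 1, L2 8, L3 64, L4 512, L5 4096, L6 32768 (GiB)
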
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
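The table_properties_collectors entry is worth decoding: CompactOnDeletionCollector marks an SST file for compaction as soon as any sliding window of 32768 consecutive entries contains at least 16384 deletes (the deletion-ratio check is disabled at 0), which keeps tombstone-heavy OSD workloads from degrading scans. Roughly, the trigger behaves like this illustrative sketch (not the RocksDB source):

    # Illustrative sliding-window deletion trigger (window=32768, trigger=16384).
    from collections import deque

    def needs_compaction(entries, window=32768, trigger=16384):
        """entries: iterable of booleans, True when the entry is a delete."""
        win = deque(maxlen=window)      # drops the oldest entry automatically
        deletes = 0
        for is_delete in entries:
            if len(win) == win.maxlen and win[0]:
                deletes -= 1            # the entry about to fall out was a delete
            win.append(is_delete)
            deletes += is_delete
            if deletes >= trigger:
                return True
        return False
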
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
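The family names cycling through m-0…m-2, p-0…p-2 and O-0…O-2 (each introduced by a "Failed to register data paths" line) come from BlueStore's sharded RocksDB layout: recent releases split the keyspace into column families per key prefix, here three shards each for what appear to be the per-pool omap (m), per-PG omap (p) and object/onode (O) prefixes, as governed by the bluestore_rocksdb_cfs setting. A throwaway helper to tally the shards present in a capture (log path hypothetical):

    # Count RocksDB column-family shards per BlueStore prefix in a log capture.
    import re
    from collections import Counter

    pattern = re.compile(r"Options for column family \[([A-Za-z]+)-(\d+)\]")
    shards = Counter()
    with open("osd-activate.log") as f:      # hypothetical capture file
        for line in f:
            for prefix, shard in pattern.findall(line):
                shards[prefix] = max(shards[prefix], int(shard) + 1)
    print(dict(shards))   # e.g. {'m': 3, 'p': 3, 'O': 3}
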
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
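One detail that is easy to miss in the repetition: every m-* and p-* family above reports the same block_cache pointer 0x561d72da42d0 (a BinnedLRUCache with capacity 483183820 bytes, about 461 MiB), while the O-* families below share a second cache at 0x561d72da5610 with capacity 536870912 (512 MiB), so the shards share one cache per prefix group rather than holding private ones. That can be confirmed mechanically (log path hypothetical):

    # Group column-family dumps by their block_cache pointer to confirm sharing.
    import re

    cf_re = re.compile(r"Options for column family \[([^\]]+)\]")
    cache_re = re.compile(r"block_cache: (0x[0-9a-f]+).*?capacity : (\d+)")
    caches, current = {}, None
    with open("osd-activate.log") as f:      # hypothetical capture file
        for line in f:
            if m := cf_re.search(line):
                current = m.group(1)
            if (m := cache_re.search(line)) and current:
                caches.setdefault((m.group(1), int(m.group(2))), []).append(current)
    for (ptr, cap), cfs in caches.items():
        print(f"{ptr} ({cap / 2**20:.0f} MiB): {', '.join(cfs)}")
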
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73df2560)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da42d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:45 localhost podman[31767]: 2026-02-01 07:40:45.821114199 +0000 UTC m=+0.209129660 container start cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, distribution-scope=public, release=1764794109, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7)
Feb 1 02:40:45 localhost podman[31767]: 2026-02-01 07:40:45.821742975 +0000 UTC m=+0.209758506 container attach cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , name=rhceph, version=7, architecture=x86_64, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1764794109, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
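Interleaved with the option dump, podman[31767] walks the ceph-33fac0b9-…-osd-4-activate-test container through init, start and attach within about 11 ms (compare the m=+0.198… and m=+0.209… offsets from process start). Note also that podman prints UTC (07:40:45) while the syslog prefix is local time five hours behind (02:40:45). A minimal parser for these event lines as they appear in this capture (illustrative):

    # Parse podman container-event lines as they appear in this capture.
    import re

    event_re = re.compile(
        r"podman\[(?P<pid>\d+)\]: (?P<utc>\S+ \S+ \S+ \S+) m=\+(?P<offset>\S+) "
        r"container (?P<event>\w+) (?P<cid>[0-9a-f]{64}) \(image=(?P<image>[^,]+),"
    )
    # Abridged from the "container start" entry above:
    line = ("podman[31767]: 2026-02-01 07:40:45.821114199 +0000 UTC m=+0.209129660 "
            "container start cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 "
            "(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, "
            "name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, ...)")
    if m := event_re.search(line):
        print(m.group("event"), m.group("cid")[:12], m.group("image"))
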
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62e20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da5610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
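Several of the raw values above read more naturally in human units: ttl 2592000 s is exactly 30 days, max_compaction_bytes 1677721600 is 25 × the 64 MiB target_file_size_base (the usual RocksDB derivation), and the soft/hard pending-compaction limits 68719476736 and 274877906944 are 64 GiB and 256 GiB. For instance:

    # Convert the logged raw values into human units to sanity-check them.
    opts = {
        "ttl_seconds": 2_592_000,
        "target_file_size_base": 67_108_864,
        "max_compaction_bytes": 1_677_721_600,
        "soft_pending_compaction_bytes_limit": 68_719_476_736,
        "hard_pending_compaction_bytes_limit": 274_877_906_944,
    }
    print(opts["ttl_seconds"] / 86_400, "days")                      # 30.0 days
    print(opts["max_compaction_bytes"] // opts["target_file_size_base"],
          "x target_file_size_base")                                 # 25
    print(opts["soft_pending_compaction_bytes_limit"] / 2**30, "GiB soft,",
          opts["hard_pending_compaction_bytes_limit"] / 2**30, "GiB hard")
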
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62e20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da5610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
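The memtable settings, identical in every family dump, bound memory per column family: write_buffer_size 16777216 (16 MiB) with min_write_buffer_number_to_merge 6 means flushes merge roughly 96 MiB of memtables at a time, and max_write_buffer_number 64 caps unflushed buffers at 1 GiB per family before writes stall. In numbers:

    # Memtable budget implied by the per-column-family options above.
    write_buffer_size = 16 * 2**20   # Options.write_buffer_size
    min_merge = 6                    # Options.min_write_buffer_number_to_merge
    max_buffers = 64                 # Options.max_write_buffer_number
    print(f"flush unit: ~{write_buffer_size * min_merge / 2**20:.0f} MiB")
    print(f"per-CF ceiling: {write_buffer_size * max_buffers / 2**30:.0f} GiB")
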
02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.merge_operator: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561d73d62e20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 
0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x561d72da5610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression: LZ4 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.num_levels: 7 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.compression_opts.parallel_threads: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: 
Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 1 02:40:45 
localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 64e47f02-2b38-4d1a-8895-e3a810f50214 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645816946, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645824033, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931645, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "64e47f02-2b38-4d1a-8895-e3a810f50214", "db_session_id": "2D8G26JAG7CV1PNWRJ58", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645829232, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, 
"raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931645, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "64e47f02-2b38-4d1a-8895-e3a810f50214", "db_session_id": "2D8G26JAG7CV1PNWRJ58", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645833433, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931645, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "64e47f02-2b38-4d1a-8895-e3a810f50214", "db_session_id": "2D8G26JAG7CV1PNWRJ58", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645838206, "job": 1, "event": "recovery_finished"} Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Feb 1 02:40:45 localhost systemd[1]: var-lib-containers-storage-overlay-ec1f283cf1f146054850c680c68fcb98f071027f314eb14939cc27279cd6cd0a-merged.mount: Deactivated successfully. 
Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x561d73dbe700 Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: DB pointer 0x561d73cbfa00 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4 Feb 1 02:40:45 localhost ceph-osd[31431]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 02:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 
stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) 
Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 460.80 MB usag Feb 1 02:40:45 localhost ceph-osd[31431]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Feb 1 02:40:45 localhost ceph-osd[31431]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Feb 1 02:40:45 localhost ceph-osd[31431]: _get_class not permitted to load lua Feb 1 02:40:45 localhost ceph-osd[31431]: _get_class not permitted to load sdk Feb 1 02:40:45 localhost ceph-osd[31431]: _get_class not permitted to load test_remote_reads Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 load_pgs Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 load_pgs opened 0 pgs Feb 1 02:40:45 localhost ceph-osd[31431]: osd.1 0 log_to_monitors true Feb 1 02:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1[31427]: 2026-02-01T07:40:45.879+0000 7f862ae87a80 -1 osd.1 0 log_to_monitors true Feb 1 02:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test[31782]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Feb 1 02:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test[31782]: [--no-systemd] [--no-tmpfs] Feb 1 02:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test[31782]: ceph-volume activate: error: unrecognized arguments: --bad-option Feb 1 02:40:46 localhost systemd[1]: libpod-cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1.scope: Deactivated successfully. 
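
[Editor's note] The multi-line RocksDB output above (the per-column-family options dumps and the DUMPING STATS blocks) is flattened by rsyslog, which escapes each embedded newline as its octal value, #012 (tabs appear as #011, e.g. "#011(skipping printing options)"). The dumps are kept verbatim here; a small sketch to restore the original line breaks when reading such a file, with the file name assumed:

    def unescape_syslog(text: str) -> str:
        """Expand rsyslog's control-character escapes (#012 = newline, #011 = tab)."""
        return text.replace("#012", "\n").replace("#011", "\t")

    with open("messages", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            print(unescape_syslog(line.rstrip("\n")))

Applied to the DUMPING STATS records, this recovers the original tabular compaction-stats layout that RocksDB wrote as a single log call.
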
Feb 1 02:40:46 localhost podman[31767]: 2026-02-01 07:40:46.050155475 +0000 UTC m=+0.438170996 container died cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, release=1764794109, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.) Feb 1 02:40:46 localhost systemd[1]: var-lib-containers-storage-overlay-0cb8505aa568ca036823035944225207a4fbb4578a8033a46deb47dfc3b030d5-merged.mount: Deactivated successfully. Feb 1 02:40:46 localhost podman[32002]: 2026-02-01 07:40:46.127416146 +0000 UTC m=+0.066789526 container remove cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, build-date=2025-12-08T17:28:53Z, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=7) Feb 1 02:40:46 localhost systemd[1]: libpod-conmon-cc94e2522f6d8f9295b340210f8dfbed63858838a634155a6c999dae6beb7fc1.scope: Deactivated successfully. Feb 1 02:40:46 localhost systemd[1]: Reloading. Feb 1 02:40:46 localhost systemd-sysv-generator[32058]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
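
[Editor's note] Each podman container event above repeats the full set of image labels in parentheses, which buries the actual lifecycle transitions (create, init, start, died, remove). A throwaway filter along these lines, assuming the label blob always begins with "(image=registry", makes the sequence easier to scan; the sample line is abbreviated from the log:

    import re

    # Drop the "(image=..., name=..., ...)" label blob that podman appends
    # to every container event line; keep everything before it.
    LABELS = re.compile(r"\s*\(image=registry[^)]*\)")

    def compact_podman_line(line: str) -> str:
        return LABELS.sub("", line)

    sample = ("Feb 1 02:40:46 localhost podman[31767]: ... container died "
              "cc94e2522f6d8f92... (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, "
              "name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate-test, ceph=True)")
    print(compact_podman_line(sample))

With the labels stripped, the remaining lines show the osd-4-activate-test container dying (after the ceph-volume usage error above) and being removed before the real activation unit starts.
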
Feb 1 02:40:46 localhost systemd-rc-local-generator[32054]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:40:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:40:46 localhost systemd[1]: Reloading. Feb 1 02:40:46 localhost systemd-rc-local-generator[32095]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:40:46 localhost systemd-sysv-generator[32101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:40:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:40:46 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Feb 1 02:40:46 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Feb 1 02:40:46 localhost systemd[1]: Starting Ceph osd.4 for 33fac0b9-80c7-560f-918a-c92d3021ca1e... Feb 1 02:40:47 localhost podman[32161]: Feb 1 02:40:47 localhost podman[32161]: 2026-02-01 07:40:47.263876063 +0000 UTC m=+0.077485467 container create f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container) Feb 1 02:40:47 localhost systemd[1]: Started libcrun container. 
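
[Editor's note] The repeated warning above comes from the packaged insights-client-boot.service unit file itself, which still uses the deprecated MemoryLimit= directive; MemoryMax= is its cgroup-v2 replacement with the same value syntax. Until the package ships a fixed unit, a full copy in /etc (which shadows the /usr/lib unit) with the directive renamed removes the warning. A hypothetical sketch, to be run as root and followed by systemctl daemon-reload; the paths are taken from the warning, the approach is an assumption about the simplest fix:

    from pathlib import Path

    src = Path("/usr/lib/systemd/system/insights-client-boot.service")
    dst = Path("/etc/systemd/system/insights-client-boot.service")

    # A unit file in /etc/systemd/system shadows the one in /usr/lib;
    # rename the deprecated directive in the copy.
    dst.write_text(src.read_text().replace("MemoryLimit=", "MemoryMax="))
    # Then: systemctl daemon-reload
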
Feb 1 02:40:47 localhost podman[32161]: 2026-02-01 07:40:47.231328278 +0000 UTC m=+0.044937692 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d0989fa4c51a1d5a17dacbd848bf7832dab20de8473a5516bcdcab7f77ffdc/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d0989fa4c51a1d5a17dacbd848bf7832dab20de8473a5516bcdcab7f77ffdc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d0989fa4c51a1d5a17dacbd848bf7832dab20de8473a5516bcdcab7f77ffdc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d0989fa4c51a1d5a17dacbd848bf7832dab20de8473a5516bcdcab7f77ffdc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45d0989fa4c51a1d5a17dacbd848bf7832dab20de8473a5516bcdcab7f77ffdc/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:47 localhost podman[32161]: 2026-02-01 07:40:47.364100289 +0000 UTC m=+0.177709683 container init f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 02:40:47 localhost podman[32161]: 2026-02-01 07:40:47.376159485 +0000 UTC m=+0.189768879 container start f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, maintainer=Guillaume Abrioux , vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main) Feb 1 02:40:47 localhost podman[32161]: 2026-02-01 07:40:47.376472852 +0000 UTC m=+0.190082246 container attach f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1764794109, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Feb 1 02:40:47 localhost bash[32161]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Feb 1 02:40:47 localhost bash[32161]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-4 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Feb 1 02:40:47 localhost bash[32161]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Feb 1 02:40:47 localhost bash[32161]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:47 localhost bash[32161]: Running command: /usr/bin/ln -s 
/dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Feb 1 02:40:47 localhost bash[32161]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 done with init, starting boot process Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 start_boot Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1 Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Feb 1 02:40:47 localhost ceph-osd[31431]: osd.1 0 bench count 12288000 bsize 4 KiB Feb 1 02:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate[32177]: --> ceph-volume raw activate successful for osd ID: 4 Feb 1 02:40:47 localhost bash[32161]: --> ceph-volume raw activate successful for osd ID: 4 Feb 1 02:40:48 localhost systemd[1]: libpod-f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9.scope: Deactivated successfully. Feb 1 02:40:48 localhost podman[32298]: 2026-02-01 07:40:48.074943458 +0000 UTC m=+0.054315390 container died f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate, io.buildah.version=1.41.4, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, ceph=True, release=1764794109, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7) Feb 1 02:40:48 localhost podman[32298]: 2026-02-01 07:40:48.111070586 +0000 UTC m=+0.090442478 container remove f38d560afc5a79a1b83b92553f84d922a9ce4a219d0996d0e3c02c44f2a96ca9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, release=1764794109, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 02:40:48 localhost systemd[1]: var-lib-containers-storage-overlay-45d0989fa4c51a1d5a17dacbd848bf7832dab20de8473a5516bcdcab7f77ffdc-merged.mount: Deactivated successfully. Feb 1 02:40:48 localhost podman[32357]: Feb 1 02:40:48 localhost podman[32357]: 2026-02-01 07:40:48.437925015 +0000 UTC m=+0.067894705 container create 07f4ebdc7e77e75f11fc40a73d679dd6bcf06b2f454fe89f5fb5ac29b0427349 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, io.buildah.version=1.41.4, ceph=True, distribution-scope=public, version=7, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main) Feb 1 02:40:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9de77ea99ea36432cec58fe99fc504c82eec472776eecdffda47355494b97f8/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:48 localhost podman[32357]: 2026-02-01 07:40:48.412847528 +0000 UTC m=+0.042817288 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9de77ea99ea36432cec58fe99fc504c82eec472776eecdffda47355494b97f8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9de77ea99ea36432cec58fe99fc504c82eec472776eecdffda47355494b97f8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9de77ea99ea36432cec58fe99fc504c82eec472776eecdffda47355494b97f8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:48 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/b9de77ea99ea36432cec58fe99fc504c82eec472776eecdffda47355494b97f8/merged/var/lib/ceph/osd/ceph-4 supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:48 localhost podman[32357]: 2026-02-01 07:40:48.555601263 +0000 UTC m=+0.185570983 container init 07f4ebdc7e77e75f11fc40a73d679dd6bcf06b2f454fe89f5fb5ac29b0427349 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux ) Feb 1 02:40:48 localhost podman[32357]: 2026-02-01 07:40:48.566864009 +0000 UTC m=+0.196833719 container start 07f4ebdc7e77e75f11fc40a73d679dd6bcf06b2f454fe89f5fb5ac29b0427349 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, name=rhceph, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True) Feb 1 02:40:48 localhost bash[32357]: 07f4ebdc7e77e75f11fc40a73d679dd6bcf06b2f454fe89f5fb5ac29b0427349 Feb 1 02:40:48 localhost systemd[1]: Started Ceph osd.4 for 33fac0b9-80c7-560f-918a-c92d3021ca1e. 
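
[Editor's note] At this point the activate container has laid out /var/lib/ceph/osd/ceph-4 (recursive chown to ceph:ceph, prime-osd-dir, and a block symlink to /dev/mapper/ceph_vg1-ceph_lv1) and systemd has started the long-running osd.4 container. A quick sanity check of that layout from the host, assuming the same paths as in the log; the ceph user maps to uid/gid 167, as the ceph-osd startup below confirms:

    import os

    osd_dir = "/var/lib/ceph/osd/ceph-4"

    # Created by the activate step with:
    #   ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-4/block
    print(os.readlink(os.path.join(osd_dir, "block")))

    # chown -R ceph:ceph => uid/gid 167:167 ("set uid:gid to 167:167" below)
    st = os.stat(osd_dir)
    print(st.st_uid, st.st_gid)
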
Feb 1 02:40:48 localhost ceph-osd[32376]: set uid:gid to 167:167 (ceph:ceph) Feb 1 02:40:48 localhost ceph-osd[32376]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Feb 1 02:40:48 localhost ceph-osd[32376]: pidfile_write: ignore empty --pid-file Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:48 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:48 localhost ceph-osd[32376]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) close Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) close Feb 1 02:40:48 localhost ceph-osd[32376]: starting osd.4 osd_data /var/lib/ceph/osd/ceph-4 /var/lib/ceph/osd/ceph-4/journal Feb 1 02:40:48 localhost ceph-osd[32376]: load: jerasure load: lrc Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:48 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:48 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) close Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) close Feb 1 02:40:49 
localhost podman[32470]: 2026-02-01 07:40:49.404350744 +0000 UTC m=+0.064440227 container create 7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_swirles, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_BRANCH=main, architecture=x86_64, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container) Feb 1 02:40:49 localhost ceph-osd[32376]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Feb 1 02:40:49 localhost systemd[1]: Started libpod-conmon-7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3.scope.
Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb58e00 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs mount Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs mount shared_bdev_used = 0 Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: RocksDB version: 7.9.2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Git sha 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Compile date 2025-09-23 00:00:00 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DB SUMMARY Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DB Session ID: B7D26NIY9QFJIJI45EFV Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: CURRENT file: CURRENT Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: IDENTITY file: IDENTITY Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.error_if_exists: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.create_if_missing: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.env: 0x55976bdeccb0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.fs: LegacyFileSystem Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.info_log: 0x55976cb04b80 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_file_opening_threads: 16 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.statistics: (nil) Feb 
1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_fsync: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_log_file_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.log_file_time_to_roll: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.keep_log_file_num: 1000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.recycle_log_file_num: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_fallocate: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_mmap_reads: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_mmap_writes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_direct_reads: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.create_missing_column_families: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.db_log_dir: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_dir: db.wal Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_cache_numshardbits: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.advise_random_on_open: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.db_write_buffer_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_manager: 0x55976bb42140 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_adaptive_mutex: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.rate_limiter: (nil) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_recovery_mode: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_thread_tracking: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_pipelined_write: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.unordered_write: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.row_cache: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_ingest_behind: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.two_write_queues: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.manual_wal_flush: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_compression: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.atomic_flush: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.persist_stats_to_disk: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.log_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.best_efforts_recovery: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_data_in_errors: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.db_host_id: __hostname__ Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enforce_single_del_contracts: true Feb 1 02:40:49 localhost podman[32470]: 2026-02-01 07:40:49.378831946 +0000 UTC m=+0.038921419 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_background_jobs: 4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_background_compactions: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_subcompactions: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.delayed_write_rate : 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_total_wal_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.stats_dump_period_sec: 600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.stats_persist_period_sec: 600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_open_files: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bytes_per_sync: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_background_flushes: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Compression algorithms supported: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kZSTD supported: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kXpressCompression supported: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kBZip2Compression supported: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kLZ4Compression supported: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kZlibCompression supported: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kSnappyCompression 
supported: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DMutex implementation: pthread_mutex_t Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 
localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 
68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 
0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: 
rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost systemd[1]: Started libcrun container. 
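A note on reading the column-family dumps above: rsyslog replaces embedded newlines and tabs with their octal escapes (#012 and #011), which is why the multi-line table_factory block arrives as one long record, and this capture additionally packs several records onto each physical line. Below is a minimal Python sketch, assuming only the record format visible above, for splitting a captured line back into individual records and collecting the Options.* pairs; the regexes and function names are illustrative and not part of any Ceph or RocksDB tooling.

    import re

    # A record starts like "Feb 1 02:40:49 localhost ceph-osd[32376]: ..."
    RECORD_RE = re.compile(r'(?=[A-Z][a-z]{2} +\d+ \d{2}:\d{2}:\d{2} \S+ )')
    OPTION_RE = re.compile(r'Options\.([A-Za-z0-9_.\[\]]+): (\S+)')

    def split_records(line):
        # Zero-width split before each timestamp (requires Python >= 3.7).
        return [r.strip() for r in RECORD_RE.split(line) if r.strip()]

    def unescape(msg):
        # Undo rsyslog's octal escaping of embedded newlines and tabs.
        return msg.replace('#012', '\n').replace('#011', '\t')

    def collect_options(records):
        # Value is taken up to the first space, so long values are truncated.
        opts = {}
        for rec in records:
            m = OPTION_RE.search(rec)
            if m:
                opts[m.group(1)] = m.group(2)
        return opts

Diffing collect_options() output across the dumps confirms that, in this capture, the p-* and O-* column families differ only in their block cache (capacity 483183820 vs 536870912 bytes) and factory pointers; every tuned value, down to min_write_buffer_number_to_merge: 6, is identical.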
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04d40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb30850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04f60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 
1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04f60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost podman[32470]: 2026-02-01 07:40:49.505696187 +0000 UTC m=+0.165785680 container init 7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_swirles, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976cb04f60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 
block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d7fe0686-554d-4b50-a60a-2d5cb79b5ec8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649487075, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649487353, "job": 1, "event": "recovery_finished"}
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old nid_max 1025
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta old blobid_max 10240
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _open_super_meta min_alloc_size 0x1000
Feb 1 02:40:49 localhost ceph-osd[32376]: freelist init
Feb 1 02:40:49 localhost ceph-osd[32376]: freelist _read_cfg
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs umount
Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) close
Feb 1 02:40:49 localhost podman[32470]: 2026-02-01 07:40:49.515815985 +0000 UTC m=+0.175905468 container start 7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_swirles, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, ceph=True, release=1764794109, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=7, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, name=rhceph)
Feb 1 02:40:49 localhost podman[32470]: 2026-02-01 07:40:49.516059431 +0000 UTC m=+0.176148914 container attach 7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_swirles, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Feb 1 02:40:49 localhost reverent_swirles[32494]: 167 167
Feb 1 02:40:49 localhost systemd[1]: libpod-7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3.scope: Deactivated successfully.
Feb 1 02:40:49 localhost podman[32470]: 2026-02-01 07:40:49.523849039 +0000 UTC m=+0.183938522 container died 7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_swirles, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., ceph=True, GIT_BRANCH=main)
Feb 1 02:40:49 localhost podman[32683]: 2026-02-01 07:40:49.634718913 +0000 UTC m=+0.103660813 container remove 7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_swirles, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-12-08T17:28:53Z, RELEASE=main, vendor=Red Hat, Inc., release=1764794109, architecture=x86_64, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux )
Feb 1 02:40:49 localhost systemd[1]: libpod-conmon-7c2ac1e01c805d22cee7b8f09bca234a59ff6260e6735ac78769856ecb1d58a3.scope: Deactivated successfully.
Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) open path /var/lib/ceph/osd/ceph-4/block
Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-4/block failed: (22) Invalid argument
Feb 1 02:40:49 localhost ceph-osd[32376]: bdev(0x55976bb59180 /var/lib/ceph/osd/ceph-4/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-4/block size 7.0 GiB
Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs mount
Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 1 02:40:49 localhost ceph-osd[32376]: bluefs mount shared_bdev_used = 4718592
Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: RocksDB version: 7.9.2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Git sha 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DB SUMMARY
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DB Session ID: B7D26NIY9QFJIJI45EFU
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: CURRENT file: CURRENT
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: IDENTITY file: IDENTITY
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.error_if_exists: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.create_if_missing: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_checks: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.env: 0x55976bdec8c0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.fs: LegacyFileSystem
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.info_log: 0x55976bbf8420
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_file_opening_threads: 16
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.statistics: (nil)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_fsync: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_log_file_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.log_file_time_to_roll: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.keep_log_file_num: 1000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.recycle_log_file_num: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_fallocate: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_mmap_reads: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_mmap_writes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_direct_reads: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.create_missing_column_families: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.db_log_dir:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_dir: db.wal
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_cache_numshardbits: 6
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.advise_random_on_open: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.db_write_buffer_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_manager: 0x55976bb42140
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.use_adaptive_mutex: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.rate_limiter: (nil)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_recovery_mode: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_thread_tracking: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_pipelined_write: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.unordered_write: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.row_cache: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_filter: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_ingest_behind: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.two_write_queues: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.manual_wal_flush: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_compression: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.atomic_flush: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.persist_stats_to_disk: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.log_readahead_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.file_checksum_gen_factory: Unknown
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.best_efforts_recovery: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.allow_data_in_errors: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.db_host_id: __hostname__
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enforce_single_del_contracts: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_background_jobs: 4
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_background_compactions: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_subcompactions: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.avoid_flush_during_shutdown: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.writable_file_max_buffer_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.delayed_write_rate : 16777216
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_total_wal_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.stats_dump_period_sec: 600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.stats_persist_period_sec: 600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.stats_history_buffer_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_open_files: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bytes_per_sync: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.wal_bytes_per_sync: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.strict_bytes_per_sync: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_readahead_size: 2097152
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_background_flushes: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Compression algorithms supported:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kZSTD supported: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kXpressCompression supported: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kBZip2Compression supported: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kLZ4Compression supported: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kZlibCompression supported: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kLZ4HCCompression supported: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: #011kSnappyCompression supported: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 
localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: 
SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: 
rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8640)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb302d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: 
kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: 
BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8880)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb31610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost 
ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55976bbf8880)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55976bb31610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 
02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.merge_operator: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_filter_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.sst_partitioner_factory: None Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory 
(0x55976bbf8880)
 cache_index_and_filter_blocks: 1
 cache_index_and_filter_blocks_with_high_priority: 0
 pin_l0_filter_and_index_blocks_in_cache: 0
 pin_top_level_index_and_filter: 1
 index_type: 0
 data_block_index_type: 0
 index_shortening: 1
 data_block_hash_table_util_ratio: 0.750000
 checksum: 4
 no_block_cache: 0
 block_cache: 0x55976bb31610
 block_cache_name: BinnedLRUCache
 block_cache_options:
 capacity : 536870912
 num_shard_bits : 4
 strict_capacity_limit : 0
 high_pri_pool_ratio: 0.000
 block_cache_compressed: (nil)
 persistent_cache: (nil)
 block_size: 4096
 block_size_deviation: 10
 block_restart_interval: 16
 index_block_restart_interval: 1
 metadata_block_size: 4096
 partition_filters: 0
 use_delta_encoding: 1
 filter_policy: bloomfilter
 whole_key_filtering: 1
 verify_compression: 0
 read_amp_bytes_per_bit: 0
 format_version: 5
 enable_index_compression: 1
 block_align: 0
 max_auto_readahead_size: 262144
 prepopulate_block_cache: 0
 initial_auto_readahead_size: 8192
 num_file_reads_for_auto_readahead: 2
Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.write_buffer_size: 16777216 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number: 64 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression: LZ4 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression: Disabled Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.num_levels: 7 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.level: 32767 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.compression_opts.use_zstd_dict_trainer: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.enabled: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.arena_block_size: 1048576 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: 
Options.compaction_options_fifo.allow_compaction: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_support: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.bloom_locality: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.max_successive_merges: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.force_consistency_checks: 1 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.ttl: 2592000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_files: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.min_blob_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_size: 268435456 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:635] 	(skipping printing options) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/column_family.cc:635] 	(skipping printing options) Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 1 02:40:49 localhost
ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d7fe0686-554d-4b50-a60a-2d5cb79b5ec8 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649771283, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649780327, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931649, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d7fe0686-554d-4b50-a60a-2d5cb79b5ec8", "db_session_id": "B7D26NIY9QFJIJI45EFU", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649787574, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, 
"filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931649, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d7fe0686-554d-4b50-a60a-2d5cb79b5ec8", "db_session_id": "B7D26NIY9QFJIJI45EFU", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649791142, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931649, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d7fe0686-554d-4b50-a60a-2d5cb79b5ec8", "db_session_id": "B7D26NIY9QFJIJI45EFU", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649797547, "job": 1, "event": "recovery_finished"} Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55976bbf2700 Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: DB pointer 0x55976ca61a00 Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super from 4, latest 4 Feb 1 02:40:49 localhost ceph-osd[32376]: bluestore(/var/lib/ceph/osd/ceph-4) _upgrade_super done Feb 1 02:40:49 localhost podman[32855]: Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 02:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 0.1 total, 0.1 interval
Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 0.1 total, 0.1 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Feb 1 02:40:49 localhost ceph-osd[32376]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Feb 1 02:40:49 localhost ceph-osd[32376]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Feb 1 02:40:49 localhost ceph-osd[32376]: _get_class not permitted to load lua Feb 1 02:40:49 localhost ceph-osd[32376]: _get_class not permitted to load sdk Feb 1 02:40:49 localhost ceph-osd[32376]: _get_class not permitted to load test_remote_reads Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for clients Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 crush map has features 288232575208783872, adjusting msgr requires for osds Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 load_pgs Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 load_pgs opened 0 pgs Feb 1 02:40:49 localhost ceph-osd[32376]: osd.4 0 log_to_monitors true Feb 1 02:40:49 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4[32372]: 2026-02-01T07:40:49.851+0000 7fcc5a372a80 -1 osd.4 0 log_to_monitors true Feb 1 02:40:49 localhost podman[32855]: 2026-02-01 07:40:49.855412637 +0000 UTC m=+0.077416916 container create faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_khayyam, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, summary=Provides the latest
Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Feb 1 02:40:49 localhost systemd[1]: Started libpod-conmon-faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a.scope. Feb 1 02:40:49 localhost systemd[1]: Started libcrun container. Feb 1 02:40:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b06b72ab94c5aaf125c3f1b11d61b9d04f7ecdf42226235879047fbf8ee6359/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b06b72ab94c5aaf125c3f1b11d61b9d04f7ecdf42226235879047fbf8ee6359/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b06b72ab94c5aaf125c3f1b11d61b9d04f7ecdf42226235879047fbf8ee6359/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 02:40:49 localhost podman[32855]: 2026-02-01 07:40:49.826319079 +0000 UTC m=+0.048323378 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:49 localhost podman[32855]: 2026-02-01 07:40:49.938310443 +0000 UTC m=+0.160314772 container init faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_khayyam, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 02:40:49 localhost podman[32855]: 2026-02-01 07:40:49.949165408 +0000 UTC m=+0.171169707 container start faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_khayyam, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=1764794109, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, 
CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_CLEAN=True, vcs-type=git, architecture=x86_64) Feb 1 02:40:49 localhost podman[32855]: 2026-02-01 07:40:49.949440155 +0000 UTC m=+0.171444534 container attach faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_khayyam, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, release=1764794109, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True) Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 33.184 iops: 8495.040 elapsed_sec: 0.353 Feb 1 02:40:50 localhost ceph-osd[31431]: log_channel(cluster) log [WRN] : OSD bench result of 8495.039550 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. 
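The mclock warning above can be checked by hand: the "bench count 12288000 bsize 4 KiB" line (logged for osd.4 further down; the same default bench applies here) means the benchmark issues count/bsize = 3000 writes, so IOPS is just 3000 divided by the elapsed time. A minimal sketch of that arithmetic, using only the numbers quoted in the log (bench_stats is a hypothetical helper name):

    # Rough re-derivation of the OSD bench numbers quoted above.
    # count and bsize come from the "bench count 12288000 bsize 4 KiB" line;
    # elapsed_sec from "osd bench result - ... elapsed_sec: 0.353".
    def bench_stats(count_bytes: int, bsize_bytes: int, elapsed_sec: float):
        ios = count_bytes // bsize_bytes            # 3000 individual writes
        iops = ios / elapsed_sec
        bandwidth_mib_s = count_bytes / elapsed_sec / (1 << 20)
        return iops, bandwidth_mib_s

    iops, bw = bench_stats(12_288_000, 4096, 0.353)
    print(f"{iops:.0f} IOPS, {bw:.3f} MiB/s")
    # ~8499 IOPS and ~33.2 MiB/s; the log's 8495.040 reflects the
    # unrounded elapsed time. Either way the result is far outside the
    # 50-500 IOPS plausibility window, so the OSD keeps the default
    # 315 IOPS capacity, as the warning states.

As the warning itself recommends, the fix would be to measure the device with a tool such as fio and then apply the override named in the message, e.g. via ceph config set on osd_mclock_max_capacity_iops_hdd (or _ssd, depending on the device class).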
Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 0 waiting for initial osdmap Feb 1 02:40:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1[31427]: 2026-02-01T07:40:50.123+0000 7f862761b640 -1 osd.1 0 waiting for initial osdmap Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 crush map has features 288514050185494528, adjusting msgr requires for clients Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 crush map has features 3314932999778484224, adjusting msgr requires for osds Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 check_osdmap_features require_osd_release unknown -> reef Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 set_numa_affinity not setting numa affinity Feb 1 02:40:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-1[31427]: 2026-02-01T07:40:50.139+0000 7f8622430640 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Feb 1 02:40:50 localhost ceph-osd[31431]: osd.1 12 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial Feb 1 02:40:50 localhost systemd[1]: tmp-crun.5zwkYN.mount: Deactivated successfully. Feb 1 02:40:50 localhost systemd[1]: var-lib-containers-storage-overlay-220139508cdb36dd65401c6a8fe3b7e7fabe45ebffebc92d21bd98f574c2cfba-merged.mount: Deactivated successfully. Feb 1 02:40:50 localhost sharp_khayyam[32933]: { Feb 1 02:40:50 localhost sharp_khayyam[32933]: "1dcce331-c114-4a5e-9a5b-585d78664562": { Feb 1 02:40:50 localhost sharp_khayyam[32933]: "ceph_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e", Feb 1 02:40:50 localhost sharp_khayyam[32933]: "device": "/dev/mapper/ceph_vg1-ceph_lv1", Feb 1 02:40:50 localhost sharp_khayyam[32933]: "osd_id": 4, Feb 1 02:40:50 localhost sharp_khayyam[32933]: "osd_uuid": "1dcce331-c114-4a5e-9a5b-585d78664562", Feb 1 02:40:50 localhost sharp_khayyam[32933]: "type": "bluestore" Feb 1 02:40:50 localhost sharp_khayyam[32933]: }, Feb 1 02:40:50 localhost sharp_khayyam[32933]: "a13401ab-1442-4562-869f-232d5c267bec": { Feb 1 02:40:50 localhost sharp_khayyam[32933]: "ceph_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e", Feb 1 02:40:50 localhost sharp_khayyam[32933]: "device": "/dev/mapper/ceph_vg0-ceph_lv0", Feb 1 02:40:50 localhost sharp_khayyam[32933]: "osd_id": 1, Feb 1 02:40:50 localhost sharp_khayyam[32933]: "osd_uuid": "a13401ab-1442-4562-869f-232d5c267bec", Feb 1 02:40:50 localhost sharp_khayyam[32933]: "type": "bluestore" Feb 1 02:40:50 localhost sharp_khayyam[32933]: } Feb 1 02:40:50 localhost sharp_khayyam[32933]: } Feb 1 02:40:50 localhost systemd[1]: libpod-faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a.scope: Deactivated successfully. 
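Reassembled from the sharp_khayyam lines above, the container's stdout is a JSON inventory keyed by OSD UUID, mapping each OSD to its cluster fsid and LVM device. A small parsing sketch; the raw payload below simply restates the fields printed above:

    import json

    # The JSON that sharp_khayyam printed line by line above, rejoined.
    raw = '''{
      "1dcce331-c114-4a5e-9a5b-585d78664562": {
        "ceph_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
        "device": "/dev/mapper/ceph_vg1-ceph_lv1",
        "osd_id": 4,
        "osd_uuid": "1dcce331-c114-4a5e-9a5b-585d78664562",
        "type": "bluestore"
      },
      "a13401ab-1442-4562-869f-232d5c267bec": {
        "ceph_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
        "osd_id": 1,
        "osd_uuid": "a13401ab-1442-4562-869f-232d5c267bec",
        "type": "bluestore"
      }
    }'''

    for osd_uuid, meta in json.loads(raw).items():
        print(f"osd.{meta['osd_id']} -> {meta['device']} ({meta['type']})")
    # osd.4 -> /dev/mapper/ceph_vg1-ceph_lv1 (bluestore)
    # osd.1 -> /dev/mapper/ceph_vg0-ceph_lv0 (bluestore)

This matches the two daemons, osd.1 and osd.4, whose startup is interleaved throughout this excerpt.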
Feb 1 02:40:50 localhost podman[32855]: 2026-02-01 07:40:50.470730012 +0000 UTC m=+0.692734361 container died faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_khayyam, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, version=7, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 02:40:50 localhost systemd[1]: var-lib-containers-storage-overlay-3b06b72ab94c5aaf125c3f1b11d61b9d04f7ecdf42226235879047fbf8ee6359-merged.mount: Deactivated successfully. Feb 1 02:40:50 localhost podman[32970]: 2026-02-01 07:40:50.563118978 +0000 UTC m=+0.083095952 container remove faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sharp_khayyam, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, ceph=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, vcs-type=git, release=1764794109, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux ) Feb 1 02:40:50 localhost systemd[1]: libpod-conmon-faf4ad31e956ff19255ccc0c0310e85a58c47d4b455e83f6389cf2c4f617b62a.scope: Deactivated successfully. 
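The sharp_khayyam container above went through the full create -> init -> start -> attach -> died -> remove sequence in under a second, which is the expected lifecycle of a short-lived helper container rather than a failure. A rough sketch for pairing those podman events by container ID when scanning a journal like this one (EVENT_RE and container_lifecycles are hypothetical names; the regex relies only on the "container <event> <64-hex-id>" wording visible in the podman lines above):

    import re

    # Matches podman journal lines such as:
    #   "... podman[32855]: ... container create faf4ad31e956... (image=..., name=...)"
    EVENT_RE = re.compile(r"container (\w+) ([0-9a-f]{64})")

    def container_lifecycles(lines):
        """Group podman container events by container ID, in log order."""
        events = {}
        for line in lines:
            m = EVENT_RE.search(line)
            if m:
                event, cid = m.group(1), m.group(2)
                events.setdefault(cid, []).append(event)
        return events

    # Fed the excerpt above, faf4ad31e956... would map to
    # ['create', 'init', 'start', 'attach', 'died', 'remove'],
    # i.e. a clean short-lived run; a container stuck at 'start'
    # with no 'died'/'remove' would stand out immediately.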
Feb 1 02:40:50 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Feb 1 02:40:50 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 done with init, starting boot process Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 start_boot Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 maybe_override_options_for_qos osd_max_backfills set to 1 Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Feb 1 02:40:51 localhost ceph-osd[32376]: osd.4 0 bench count 12288000 bsize 4 KiB Feb 1 02:40:51 localhost ceph-osd[31431]: osd.1 13 state: booting -> active Feb 1 02:40:51 localhost sshd[33031]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:40:52 localhost podman[33100]: 2026-02-01 07:40:52.60521406 +0000 UTC m=+0.094485180 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, architecture=x86_64) Feb 1 02:40:52 localhost podman[33100]: 2026-02-01 07:40:52.744330223 +0000 UTC m=+0.233601363 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , release=1764794109, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_BRANCH=main, 
org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Feb 1 02:40:53 localhost ceph-osd[31431]: osd.1 15 crush map has features 288514051259236352, adjusting msgr requires for clients Feb 1 02:40:53 localhost ceph-osd[31431]: osd.1 15 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons Feb 1 02:40:53 localhost ceph-osd[31431]: osd.1 15 crush map has features 3314933000852226048, adjusting msgr requires for osds Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 30.867 iops: 7901.849 elapsed_sec: 0.380 Feb 1 02:40:53 localhost ceph-osd[32376]: log_channel(cluster) log [WRN] : OSD bench result of 7901.849317 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.4. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 0 waiting for initial osdmap Feb 1 02:40:53 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4[32372]: 2026-02-01T07:40:53.403+0000 7fcc562f1640 -1 osd.4 0 waiting for initial osdmap Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 crush map has features 288514051259236352, adjusting msgr requires for clients Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 crush map has features 3314933000852226048, adjusting msgr requires for osds Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 check_osdmap_features require_osd_release unknown -> reef Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 set_numa_affinity not setting numa affinity Feb 1 02:40:53 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-4[32372]: 2026-02-01T07:40:53.421+0000 7fcc5191b640 -1 osd.4 15 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Feb 1 02:40:53 localhost ceph-osd[32376]: osd.4 15 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial Feb 1 02:40:54 localhost ceph-osd[32376]: osd.4 16 state: booting -> active Feb 1 02:40:54 localhost podman[33299]: Feb 1 02:40:54 localhost podman[33299]: 2026-02-01 07:40:54.739160165 +0000 UTC m=+0.076502394 container create 74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_curie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, 
GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 02:40:54 localhost systemd[1]: Started libpod-conmon-74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62.scope. Feb 1 02:40:54 localhost systemd[1]: Started libcrun container. Feb 1 02:40:54 localhost podman[33299]: 2026-02-01 07:40:54.801143099 +0000 UTC m=+0.138485288 container init 74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_curie, release=1764794109, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 02:40:54 localhost podman[33299]: 2026-02-01 07:40:54.709213815 +0000 UTC m=+0.046556044 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 02:40:54 localhost podman[33299]: 2026-02-01 07:40:54.819890405 +0000 UTC m=+0.157232634 container start 74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_curie, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, 
vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True) Feb 1 02:40:54 localhost podman[33299]: 2026-02-01 07:40:54.820263865 +0000 UTC m=+0.157606084 container attach 74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_curie, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 02:40:54 localhost gracious_curie[33314]: 167 167 Feb 1 02:40:54 localhost systemd[1]: libpod-74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62.scope: Deactivated successfully. Feb 1 02:40:54 localhost podman[33299]: 2026-02-01 07:40:54.826564585 +0000 UTC m=+0.163906834 container died 74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_curie, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 02:40:54 localhost systemd[1]: var-lib-containers-storage-overlay-2e266bccd785e60e0e49f2ea6ad9a7811b8f5e9bc2f78fc9133903c5438ad6d3-merged.mount: Deactivated successfully. 
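The bare "167 167" that gracious_curie printed before exiting is consistent with a probe of the ceph user and group IDs inside the rhceph image; uid/gid 167 is what Red Hat packaging assigns to the ceph account. A hypothetical re-creation of such a probe, run as Python inside the container (it assumes a ceph account exists in the image's passwd/group databases):

    import grp
    import pwd

    # Hypothetical uid/gid probe: inside the rhceph image the ceph
    # account is expected to resolve to 167/167, matching the
    # "gracious_curie[33314]: 167 167" line above.
    print(pwd.getpwnam("ceph").pw_uid, grp.getgrnam("ceph").gr_gid)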
Feb 1 02:40:54 localhost podman[33319]: 2026-02-01 07:40:54.931934249 +0000 UTC m=+0.091316199 container remove 74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_curie, release=1764794109, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, build-date=2025-12-08T17:28:53Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.component=rhceph-container) Feb 1 02:40:54 localhost systemd[1]: libpod-conmon-74cfd233371419307eda4f77c1dc9b0b79afd8f6e3a1955fdf12e016abd3fc62.scope: Deactivated successfully. Feb 1 02:40:55 localhost podman[33341]: Feb 1 02:40:55 localhost podman[33341]: 2026-02-01 07:40:55.131535339 +0000 UTC m=+0.071905568 container create 29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_ganguly, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, release=1764794109, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 02:40:55 localhost systemd[1]: Started libpod-conmon-29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923.scope. Feb 1 02:40:55 localhost systemd[1]: Started libcrun container. 
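The kernel messages that follow (as with the identical ones earlier in this boot) note that these xfs filesystems only support timestamps until 2038 (0x7fffffff), which the kernel prints when a filesystem was created without the xfs bigtime feature: inode timestamps are then capped at the largest signed 32-bit epoch second. That cutoff is easy to verify:

    from datetime import datetime, timezone

    # 0x7fffffff is the last second representable in a signed 32-bit
    # time_t, the limit the xfs remount warnings below cite.
    print(datetime.fromtimestamp(0x7FFFFFFF, tz=timezone.utc))
    # -> 2038-01-19 03:14:07+00:00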
Feb 1 02:40:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7dc55cfaff12b576ec7087c405b12dc57556a92e61dd6da2134b04d75cdd68a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:55 localhost podman[33341]: 2026-02-01 07:40:55.10323342 +0000 UTC m=+0.043603699 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 02:40:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7dc55cfaff12b576ec7087c405b12dc57556a92e61dd6da2134b04d75cdd68a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7dc55cfaff12b576ec7087c405b12dc57556a92e61dd6da2134b04d75cdd68a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 02:40:55 localhost podman[33341]: 2026-02-01 07:40:55.237421896 +0000 UTC m=+0.177792115 container init 29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_ganguly, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 02:40:55 localhost podman[33341]: 2026-02-01 07:40:55.244514867 +0000 UTC m=+0.184885086 container start 29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_ganguly, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, release=1764794109, version=7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 02:40:55 localhost podman[33341]: 2026-02-01 07:40:55.244861696 +0000 UTC m=+0.185231965 container attach 29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_ganguly, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, name=rhceph, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 02:40:55 localhost ceph-osd[32376]: osd.4 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=16) [2,4,3] r=1 lpr=16 pi=[15,16)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 1 02:40:56 localhost focused_ganguly[33356]: [
Feb 1 02:40:56 localhost focused_ganguly[33356]: {
Feb 1 02:40:56 localhost focused_ganguly[33356]: "available": false,
Feb 1 02:40:56 localhost focused_ganguly[33356]: "ceph_device": false,
Feb 1 02:40:56 localhost focused_ganguly[33356]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "lsm_data": {},
Feb 1 02:40:56 localhost focused_ganguly[33356]: "lvs": [],
Feb 1 02:40:56 localhost focused_ganguly[33356]: "path": "/dev/sr0",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "rejected_reasons": [
Feb 1 02:40:56 localhost focused_ganguly[33356]: "Insufficient space (<5GB)",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "Has a FileSystem"
Feb 1 02:40:56 localhost focused_ganguly[33356]: ],
Feb 1 02:40:56 localhost focused_ganguly[33356]: "sys_api": {
Feb 1 02:40:56 localhost focused_ganguly[33356]: "actuators": null,
Feb 1 02:40:56 localhost focused_ganguly[33356]: "device_nodes": "sr0",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "human_readable_size": "482.00 KB",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "id_bus": "ata",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "model": "QEMU DVD-ROM",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "nr_requests": "2",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "partitions": {},
Feb 1 02:40:56 localhost focused_ganguly[33356]: "path": "/dev/sr0",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "removable": "1",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "rev": "2.5+",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "ro": "0",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "rotational": "1",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "sas_address": "",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "sas_device_handle": "",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "scheduler_mode": "mq-deadline",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "sectors": 0,
Feb 1 02:40:56 localhost focused_ganguly[33356]: "sectorsize": "2048",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "size": 493568.0,
Feb 1 02:40:56 localhost focused_ganguly[33356]: "support_discard": "0",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "type": "disk",
Feb 1 02:40:56 localhost focused_ganguly[33356]: "vendor": "QEMU"
Feb 1 02:40:56 localhost focused_ganguly[33356]: }
Feb 1 02:40:56 localhost focused_ganguly[33356]: }
Feb 1 02:40:56 localhost focused_ganguly[33356]: ]
Feb 1 02:40:56 localhost systemd[1]: libpod-29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923.scope: Deactivated successfully.
Feb 1 02:40:56 localhost podman[33341]: 2026-02-01 07:40:56.101523478 +0000 UTC m=+1.041893727 container died 29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_ganguly, release=1764794109, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 02:40:56 localhost systemd[1]: var-lib-containers-storage-overlay-f7dc55cfaff12b576ec7087c405b12dc57556a92e61dd6da2134b04d75cdd68a-merged.mount: Deactivated successfully.
Feb 1 02:40:56 localhost podman[34573]: 2026-02-01 07:40:56.184918906 +0000 UTC m=+0.075222231 container remove 29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_ganguly, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 1 02:40:56 localhost systemd[1]: libpod-conmon-29c5c76e7233d1990b00d579ad16ed8f22212e234691fb3eb04a35b45df90923.scope: Deactivated successfully.
Feb 1 02:41:05 localhost systemd[26198]: Starting Mark boot as successful...
Feb 1 02:41:05 localhost podman[34704]: 2026-02-01 07:41:05.22853916 +0000 UTC m=+0.091997757 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, RELEASE=main, vcs-type=git, version=7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 02:41:05 localhost systemd[26198]: Finished Mark boot as successful.
Feb 1 02:41:05 localhost podman[34704]: 2026-02-01 07:41:05.317744065 +0000 UTC m=+0.181202652 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, release=1764794109, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public, build-date=2025-12-08T17:28:53Z, ceph=True, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=)
Feb 1 02:42:07 localhost podman[34883]: 2026-02-01 07:42:07.082437998 +0000 UTC m=+0.087661723 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, ceph=True, GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, name=rhceph, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, release=1764794109, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph)
Feb 1 02:42:07 localhost podman[34883]: 2026-02-01 07:42:07.2114816 +0000 UTC m=+0.216705355 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, build-date=2025-12-08T17:28:53Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, architecture=x86_64, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container)
Feb 1 02:42:18 localhost systemd[1]: session-13.scope: Deactivated successfully.
Feb 1 02:42:18 localhost systemd[1]: session-13.scope: Consumed 21.335s CPU time.
Feb 1 02:42:18 localhost systemd-logind[759]: Session 13 logged out. Waiting for processes to exit.
Feb 1 02:42:18 localhost systemd-logind[759]: Removed session 13.
Feb 1 02:43:36 localhost sshd[35103]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:44:19 localhost systemd[26198]: Created slice User Background Tasks Slice.
Feb 1 02:44:19 localhost systemd[26198]: Starting Cleanup of User's Temporary Files and Directories...
Feb 1 02:44:19 localhost systemd[26198]: Finished Cleanup of User's Temporary Files and Directories.
Feb 1 02:45:07 localhost sshd[35182]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:45:43 localhost sshd[35262]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:45:43 localhost systemd-logind[759]: New session 27 of user zuul.
Feb 1 02:45:44 localhost systemd[1]: Started Session 27 of User zuul.
Feb 1 02:45:44 localhost python3[35310]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 1 02:45:45 localhost python3[35355]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 1 02:45:45 localhost python3[35375]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604212.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 1 02:45:46 localhost python3[35431]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:45:46 localhost python3[35474]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769931946.101436-65900-98447550359338/source _original_basename=tmp7m4h3ksx follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:45:47 localhost python3[35504]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:45:47 localhost python3[35520]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:45:48 localhost python3[35536]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:45:49 localhost python3[35552]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:45:50 localhost python3[35566]: ansible-ping Invoked with data=pong
Feb 1 02:46:00 localhost sshd[35568]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:46:01 localhost systemd-logind[759]: New session 28 of user tripleo-admin.
Feb 1 02:46:01 localhost systemd[1]: Created slice User Slice of UID 1003.
Feb 1 02:46:01 localhost systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 1 02:46:01 localhost systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 1 02:46:01 localhost systemd[1]: Starting User Manager for UID 1003...
Feb 1 02:46:01 localhost systemd[35572]: Queued start job for default target Main User Target.
Feb 1 02:46:01 localhost systemd[35572]: Created slice User Application Slice.
Feb 1 02:46:01 localhost systemd[35572]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 1 02:46:01 localhost systemd[35572]: Started Daily Cleanup of User's Temporary Directories.
Feb 1 02:46:01 localhost systemd[35572]: Reached target Paths.
Feb 1 02:46:01 localhost systemd[35572]: Reached target Timers.
Feb 1 02:46:01 localhost systemd[35572]: Starting D-Bus User Message Bus Socket...
Feb 1 02:46:01 localhost systemd[35572]: Starting Create User's Volatile Files and Directories...
Feb 1 02:46:01 localhost systemd[35572]: Listening on D-Bus User Message Bus Socket.
Feb 1 02:46:01 localhost systemd[35572]: Reached target Sockets.
Feb 1 02:46:01 localhost systemd[35572]: Finished Create User's Volatile Files and Directories.
Feb 1 02:46:01 localhost systemd[35572]: Reached target Basic System.
Feb 1 02:46:01 localhost systemd[35572]: Reached target Main User Target.
Feb 1 02:46:01 localhost systemd[35572]: Startup finished in 122ms.
Feb 1 02:46:01 localhost systemd[1]: Started User Manager for UID 1003.
Feb 1 02:46:01 localhost systemd[1]: Started Session 28 of User tripleo-admin.
Feb 1 02:46:01 localhost python3[35634]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 1 02:46:06 localhost python3[35654]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Feb 1 02:46:07 localhost python3[35670]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 1 02:46:08 localhost python3[35718]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.wqabhl90tmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:46:08 localhost python3[35748]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.wqabhl90tmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:46:09 localhost python3[35764]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.wqabhl90tmphosts insertbefore=BOF block=172.17.0.106 np0005604212.localdomain np0005604212#012172.18.0.106 np0005604212.storage.localdomain np0005604212.storage#012172.20.0.106 np0005604212.storagemgmt.localdomain np0005604212.storagemgmt#012172.17.0.106 np0005604212.internalapi.localdomain np0005604212.internalapi#012172.19.0.106 np0005604212.tenant.localdomain np0005604212.tenant#012192.168.122.106 np0005604212.ctlplane.localdomain np0005604212.ctlplane#012172.17.0.107 np0005604213.localdomain np0005604213#012172.18.0.107 np0005604213.storage.localdomain np0005604213.storage#012172.20.0.107 np0005604213.storagemgmt.localdomain np0005604213.storagemgmt#012172.17.0.107 np0005604213.internalapi.localdomain np0005604213.internalapi#012172.19.0.107 np0005604213.tenant.localdomain np0005604213.tenant#012192.168.122.107 np0005604213.ctlplane.localdomain np0005604213.ctlplane#012172.17.0.108 np0005604215.localdomain np0005604215#012172.18.0.108 np0005604215.storage.localdomain np0005604215.storage#012172.20.0.108 np0005604215.storagemgmt.localdomain np0005604215.storagemgmt#012172.17.0.108 np0005604215.internalapi.localdomain np0005604215.internalapi#012172.19.0.108 np0005604215.tenant.localdomain np0005604215.tenant#012192.168.122.108 np0005604215.ctlplane.localdomain np0005604215.ctlplane#012172.17.0.103 np0005604209.localdomain np0005604209#012172.18.0.103 np0005604209.storage.localdomain np0005604209.storage#012172.20.0.103 np0005604209.storagemgmt.localdomain np0005604209.storagemgmt#012172.17.0.103 np0005604209.internalapi.localdomain np0005604209.internalapi#012172.19.0.103 np0005604209.tenant.localdomain np0005604209.tenant#012192.168.122.103 np0005604209.ctlplane.localdomain np0005604209.ctlplane#012172.17.0.104 np0005604210.localdomain np0005604210#012172.18.0.104 np0005604210.storage.localdomain np0005604210.storage#012172.20.0.104 np0005604210.storagemgmt.localdomain np0005604210.storagemgmt#012172.17.0.104 np0005604210.internalapi.localdomain np0005604210.internalapi#012172.19.0.104 np0005604210.tenant.localdomain np0005604210.tenant#012192.168.122.104 np0005604210.ctlplane.localdomain np0005604210.ctlplane#012172.17.0.105 np0005604211.localdomain np0005604211#012172.18.0.105 np0005604211.storage.localdomain np0005604211.storage#012172.20.0.105 np0005604211.storagemgmt.localdomain np0005604211.storagemgmt#012172.17.0.105 np0005604211.internalapi.localdomain np0005604211.internalapi#012172.19.0.105 np0005604211.tenant.localdomain np0005604211.tenant#012192.168.122.105 np0005604211.ctlplane.localdomain np0005604211.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.154 overcloud.storage.localdomain#012172.20.0.122 overcloud.storagemgmt.localdomain#012172.17.0.228 overcloud.internalapi.localdomain#012172.21.0.164 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:46:10 localhost python3[35780]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.wqabhl90tmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:46:10 localhost python3[35797]: ansible-file Invoked with path=/tmp/ansible.wqabhl90tmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:46:11 localhost python3[35813]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:46:12 localhost python3[35830]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:46:17 localhost python3[35926]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:46:17 localhost python3[35943]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:46:21 localhost sshd[35948]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:47:27 localhost kernel: SELinux: Converting 2700 SID table entries...
Feb 1 02:47:27 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 1 02:47:27 localhost kernel: SELinux: policy capability open_perms=1
Feb 1 02:47:27 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 1 02:47:27 localhost kernel: SELinux: policy capability always_check_network=0
Feb 1 02:47:27 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 1 02:47:27 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 1 02:47:27 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 1 02:47:28 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=6 res=1
Feb 1 02:47:28 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 02:47:28 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 1 02:47:28 localhost systemd[1]: Reloading.
Feb 1 02:47:28 localhost systemd-sysv-generator[36789]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:47:28 localhost systemd-rc-local-generator[36786]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:47:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:47:28 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 1 02:47:29 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 1 02:47:29 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 1 02:47:29 localhost systemd[1]: run-r4f21d029129e404e94d83b043f3dc81b.service: Deactivated successfully.
Feb 1 02:47:33 localhost python3[37232]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:35 localhost python3[37371]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:47:35 localhost systemd[1]: Reloading.
Feb 1 02:47:35 localhost systemd-rc-local-generator[37398]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:47:35 localhost systemd-sysv-generator[37404]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:47:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:47:37 localhost python3[37426]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:47:37 localhost python3[37442]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:38 localhost python3[37459]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 1 02:47:38 localhost python3[37477]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:47:39 localhost python3[37495]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:47:39 localhost python3[37513]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 1 02:47:40 localhost systemd[1]: Reloading Network Manager...
Feb 1 02:47:40 localhost NetworkManager[5964]: [1769932060.7113] audit: op="reload" arg="0" pid=37516 uid=0 result="success"
Feb 1 02:47:40 localhost NetworkManager[5964]: [1769932060.7126] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Feb 1 02:47:40 localhost NetworkManager[5964]: [1769932060.7126] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Feb 1 02:47:40 localhost systemd[1]: Reloaded Network Manager.
Feb 1 02:47:41 localhost python3[37532]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:41 localhost python3[37549]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:47:41 localhost python3[37567]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:47:42 localhost python3[37583]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:47:43 localhost python3[37599]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 1 02:47:43 localhost python3[37615]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:47:44 localhost python3[37631]: ansible-blockinfile Invoked with path=/tmp/ansible.z4wodao3 block=[192.168.122.106]*,[np0005604212.ctlplane.localdomain]*,[172.17.0.106]*,[np0005604212.internalapi.localdomain]*,[172.18.0.106]*,[np0005604212.storage.localdomain]*,[172.20.0.106]*,[np0005604212.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005604212.tenant.localdomain]*,[np0005604212.localdomain]*,[np0005604212]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCx/MKX//74FswFkw1c1lfM5mahSRoD4B8bhCZSm2/IQ//syuq+Qpi1sEoMv/N1mOrU8atXNtYkVNozl/ypDe2YJkUS8OTt37bT9A7XnBlfFSc5OwXS7VGHpVWbiMbImJibSV7HjoQP0yA8SvCJCcrI3Eh14+cna8tT1rJ9lOFRHvxLfG52XnzFiNUVDU+TG3uRtWEjY5epI8j/U73tEqdP4OAk7ZQ9riN1nllCCIs9FOErOEw14VW+151TbOCzcm9kvzeQMit9jPXTGqmTPKoidZFLhJwEAXq4M9+DFfKQWkVSqfcU3cvPz6S03lUcpPWiJxgGZiIPXxCdRjvI3bKCm898lFYwZq8EfdAwUFMyhmz4GHSyhMwqZWE46cikXf/skoSrEF8ji3NjmyQL7T304iKenZca6rHDI56veO0+PTzZj/pBiaWBWXlqF0WQLAn804z3yapsLNuR8R4EaREmk1Tc2ESg1//73pCUypwEMQWESHsAJ/LCHhyqNHY6Bjc=#012[192.168.122.107]*,[np0005604213.ctlplane.localdomain]*,[172.17.0.107]*,[np0005604213.internalapi.localdomain]*,[172.18.0.107]*,[np0005604213.storage.localdomain]*,[172.20.0.107]*,[np0005604213.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005604213.tenant.localdomain]*,[np0005604213.localdomain]*,[np0005604213]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDhh44DuXnO4hBZJvT1vLnO8ZhT8GKLkBI0M+Q/lXSbHymnCyNerLMqVRhTb5ZUw07lkP6FtBJS95SUtdJuAbUi4jphShtJfBdicoa+uGqI1icHUQCbtCAACtas0lGeGi5q/q1LfzeuKh+LTRj60W+r2OZoChKxeSWYBQ8gIScKe1HgVCJVEESXwNv4CBs6ffOWVYHE+3JDUA3AN3nX931xw4oLMBkwi0q4sNh9Sb0oS79OX+dKdlGfnPLLWKF9QrLrHYdHVkKtPre9d1BdNkl38gRE45uwrAAxXBfeZjbzzfbUlWb54SZwL8P2ej29L5VAbE/97j1HD6+kUZ5wFb6v9oJyFwq8udFDqO1SUMkW4t1VmwD5G4rIU2+u0yHd4H7//fgbf8WAhPv1Qx5tXEqB6LIHqYCz7RekNQO5Xv8ge/gVMzzlxB0DJP6a4DJ8E0/Djnyzw81L2fmyeriPLqt/n/wHscNr1RRI4T1X2iINRwk5QfrxwTEHhJ00FY1kB90=#012[192.168.122.108]*,[np0005604215.ctlplane.localdomain]*,[172.17.0.108]*,[np0005604215.internalapi.localdomain]*,[172.18.0.108]*,[np0005604215.storage.localdomain]*,[172.20.0.108]*,[np0005604215.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005604215.tenant.localdomain]*,[np0005604215.localdomain]*,[np0005604215]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/jKlZ/vxfazmNjpekfENGpQi8TTD6ErYy0BH9P8CRIiiKVdA/53XGSAQlY17b4tT5hzyHsUuXDmbv5R98FSy/Fi8F4KrjgogVPhd/zYoMrffr9ydwv+ih2mIyCPjZC+N92i92gM2OBHBXj5vqyh5yl1t4H1LhFab7P/m42K75mcTytGvGTLKXZbcs/1Ot/APGrs5wqg/c9XFQtgBEn6ttSKQ9caqbgUw88VGRkzaHvzheQvtIjZL0AwigTS24tqFx+bF+liSnSaYk1R8TKe1yMNODv5OCUmFYvPqls4Y3AQkpuroQQXHcQCe0QPuz9nGgPebNOxyTHsK66oDWIUskoYIbrZZhjDxlpdzJ+POEU/jXtGox0/0wlpRK7jNN6r4Fzx6uIzxB5SWn/UJ4BYS853pUsC32TeD0pZXfUAzOGUOzQfvYkUCElyRi8zDN4ubwEWnxvCEPaAFihafbviqQwLNFFmth36owDHV2zU/Q/BtW8vrwfx0cPr2A4WvQvp8=#012[192.168.122.103]*,[np0005604209.ctlplane.localdomain]*,[172.17.0.103]*,[np0005604209.internalapi.localdomain]*,[172.18.0.103]*,[np0005604209.storage.localdomain]*,[172.20.0.103]*,[np0005604209.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005604209.tenant.localdomain]*,[np0005604209.localdomain]*,[np0005604209]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAdXF2/8XBq3bWgr/9swIkzjlkm7PzpC1vdYXglaExGeIUwK5n05/HLobUMrYjOh6yE81+tctBT51wuPLw9qOGf4X3lRx3x0AHUqWSs00OL5nZsMRAd6PknZVyeCWf9jv13mVWIExCYbP8e4VK4M3w1m2xSLFd1aHtGkEUYJKCmacxrxFu2opq+kNCclpMC0BlFeSeX/NZeGwcfVCEyP46JVB9pNDo6D4s98FzzQNtG4DTv8NqE0S8Fj44dajq/80IKXeVEbhVmBikwFGMMEHhsRass2m0Q0rBw1Cv2jqW9hrTO1AWHY2aNDDqr6cKttP27XKfc/unDFFDb0mcc/HRa8JAUYEvuO0FIV6n28+Q5hWoYHAZfMU15U/bQPN1UxbF/MmSIZWvwY+vzCJ+icSJ9qfhDfbd1DttRuV0F3Jdi0jq01TyyPdOz8qT7kKSftD3Awn6BNLlseR8MaOTS+YF4fOnSP/xzj0B+nx/nr5Mrq8+QzKb2YyqdMfWWMGdCw8=#012[192.168.122.104]*,[np0005604210.ctlplane.localdomain]*,[172.17.0.104]*,[np0005604210.internalapi.localdomain]*,[172.18.0.104]*,[np0005604210.storage.localdomain]*,[172.20.0.104]*,[np0005604210.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005604210.tenant.localdomain]*,[np0005604210.localdomain]*,[np0005604210]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDeVlqpmEgZX6yoZkE7SzVbEM6MqJe/9qDZPPgFZPb/N85k+uB3cINsoq0pMJYeKjcKY8H56WyuNkVVwVHaouZnJCN4p1rCJmATIDieU8QMDwGucQpbrNRrQWheWQDkmHNIPOxnUDCRgEzDfYiaE4prLHMPKtf8XJAKUKVd6lpZrVSCovGz0UC3U1Le/0N1PJOi4kYEuipVrcfoYHC63A32I+w+7tybU8Rpknhc/UHhdn39PBGuAhbkSf2JEJbLLzLaPkZXT6HOPiBUT9jWKnymCGEcfPjIWOkeelx3fkPoXZCtnYHlSoQSkCVsUmXgHNj7X3+6sJi9+iV/+8jRWQyk6aCC+HjXDhSwxbBUaM9AOimJ9EK7vo8/IK9pQ3gNsEct6rHuvGytACNMWpaT5sRRaVEnS8uz/PL8urB6+59GYGunjAaw8lCQcxw+VNVJaLtj+BpVJZA2EA6XE4fwq7v0s9u0ApIMSyV3DcYzIcDFlT11I5g3RM8vZNipXfnub3U=#012[192.168.122.105]*,[np0005604211.ctlplane.localdomain]*,[172.17.0.105]*,[np0005604211.internalapi.localdomain]*,[172.18.0.105]*,[np0005604211.storage.localdomain]*,[172.20.0.105]*,[np0005604211.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005604211.tenant.localdomain]*,[np0005604211.localdomain]*,[np0005604211]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCQ5JUOdiESLpaYomijw3u9LxHN4VxpmenW9EczyVvVdofuEESAIR1Q8BIVkW7gxgVyrzHxOpbaoAS+aZaKazruu7/chC8MkDw1lvfeyQwMZax6UziUan2wIFVTaCc7kITOHrdWkJm+OIvCs/ImtkSgsTmvTiQedvs86ME3gHNyA+7taoDXnH6UCB6d5ex6PzwXsKI03iUVWFfsGP3ZU7r52IBwgrLG+VplbaPBRNNP/RvKULVsokG3UCMd3pjHv3VYBdXPYTFOPf666ZEuxEz+Frz43oXzEhr4W61RN70cAFJDDFoOmBDxXzZqrmF7r1vSV3ojl+aHaVLCGL4Wnjrp9wl5Zq8XCGN/7ttzaZKrjj/flccfBEiYL9odgqp92EjmxsRqG4bFq/nEzS/DTJ88QQVpGQNC2T6bElJVdBIrpZAyv7n5HlwNQwfsltQtzbqe1E32azZb1wq13ajV9Ii7QrVd81nGYFM79NqiVVbXs5NypsJOMQ6ZoqyHK5+yyHk=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:47:44 localhost python3[37647]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.z4wodao3' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:45 localhost python3[37665]: ansible-file Invoked with path=/tmp/ansible.z4wodao3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:47:46 localhost python3[37681]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:47:46 localhost python3[37697]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:46 localhost python3[37715]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:47 localhost python3[37734]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Feb 1 02:47:49 localhost python3[37871]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:50 localhost python3[37888]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:47:53 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 1 02:47:53 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 1 02:47:53 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 02:47:53 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 1 02:47:53 localhost systemd[1]: Reloading.
Feb 1 02:47:53 localhost systemd-rc-local-generator[37954]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:47:53 localhost systemd-sysv-generator[37960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:47:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:47:53 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 1 02:47:53 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 1 02:47:53 localhost systemd[1]: tuned.service: Deactivated successfully.
Feb 1 02:47:53 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 1 02:47:53 localhost systemd[1]: tuned.service: Consumed 1.688s CPU time.
Feb 1 02:47:53 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 1 02:47:54 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 1 02:47:54 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 1 02:47:54 localhost systemd[1]: run-r162dfa22ae4e4e6596d48e1979337002.service: Deactivated successfully.
Feb 1 02:47:55 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Feb 1 02:47:55 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 02:47:55 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 1 02:47:55 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 1 02:47:55 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 1 02:47:55 localhost systemd[1]: run-r5b8090ee64614ff99f9f308515e5a3ec.service: Deactivated successfully.
Feb 1 02:47:56 localhost python3[38325]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:47:56 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 1 02:47:56 localhost systemd[1]: tuned.service: Deactivated successfully.
Feb 1 02:47:56 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 1 02:47:56 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 1 02:47:58 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Feb 1 02:47:58 localhost python3[38521]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:47:59 localhost python3[38538]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 1 02:47:59 localhost python3[38554]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:48:00 localhost python3[38570]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:02 localhost python3[38590]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:03 localhost python3[38607]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:48:05 localhost python3[38623]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:10 localhost python3[38639]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:11 localhost python3[38687]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:11 localhost python3[38732]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932090.8224258-70658-170965706403500/source _original_basename=tmpnc454qx1 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:11 localhost python3[38762]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:12 localhost python3[38810]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:12 localhost python3[38853]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932092.3400419-70751-33321726172943/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=d0cfa4bd89bcc42c9513572d4ad38f679529236d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:13 localhost python3[38915]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:13 localhost python3[38958]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932093.289686-70811-267230907498818/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=1f1a1c0de88e28e1c405f8e299af3f6bf8624260 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:14 localhost python3[39020]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:14 localhost python3[39063]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932094.1133945-70811-90614517170286/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=a14c85776d2e39c2e9398053dff459a83e663446 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:15 localhost python3[39125]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:15 localhost python3[39168]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932094.9839377-70811-28101984973958/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:16 localhost python3[39230]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:16 localhost python3[39303]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932095.9374018-70811-259636435688547/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:17 localhost python3[39397]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:17 localhost python3[39440]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932096.7618823-70811-242724634232983/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=fbd53f4e851d144cdb38e5828c7a4f4bae8bb106 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:17 localhost python3[39517]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:18 localhost python3[39560]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932097.6542022-70811-149272651416804/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:18 localhost python3[39622]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:19 localhost python3[39665]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932098.5176032-70811-114081171189060/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=b186a8b61d7f8cda474e1db6d9f709185a517ec4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:19 localhost systemd[35572]: Starting Mark boot as successful...
Feb 1 02:48:19 localhost systemd[35572]: Finished Mark boot as successful.
Feb 1 02:48:19 localhost python3[39727]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:20 localhost python3[39771]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932099.395221-70811-37505619239088/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:20 localhost python3[39833]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:21 localhost python3[39876]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932100.2939346-70811-156419757217800/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:21 localhost python3[39938]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:21 localhost python3[39981]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932101.193497-70811-198098602424297/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=4424aaa3b68a116442c4d8c58838d901e539f369 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:22 localhost python3[40011]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:48:23 localhost python3[40059]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:48:23 localhost python3[40102]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932103.0454836-71628-177287225097379/source _original_basename=tmpb_hkghyx follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:48:28 localhost python3[40132]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 1 02:48:28 localhost python3[40193]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:33 localhost python3[40210]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:38 localhost python3[40227]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:39 localhost python3[40250]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:39 localhost python3[40273]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:40 localhost python3[40296]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:48:41 localhost python3[40319]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:49:09 localhost sshd[40327]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:49:22 localhost python3[40420]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:22 localhost python3[40468]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:22 localhost python3[40486]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmp6xxqd6q0 recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:23 localhost python3[40516]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:24 localhost python3[40564]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:24 localhost python3[40582]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:24 localhost python3[40644]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:25 localhost python3[40662]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:25 localhost python3[40724]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:26 localhost python3[40742]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:26 localhost python3[40804]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:26 localhost python3[40822]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:27 localhost python3[40884]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:27 localhost python3[40902]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:28 localhost python3[40964]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:28 localhost python3[40982]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:28 localhost python3[41044]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:29 localhost python3[41062]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:29 localhost python3[41124]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:29 localhost python3[41142]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:30 localhost python3[41204]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:30 localhost python3[41222]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:31 localhost python3[41284]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:31 localhost python3[41302]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:31 localhost python3[41364]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:32 localhost python3[41382]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:32 localhost python3[41412]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:49:33 localhost python3[41460]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:33 localhost python3[41478]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpc6lmpkkf recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:36 localhost python3[41508]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:49:40 localhost python3[41525]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:49:41 localhost python3[41543]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:49:42 localhost python3[41561]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:49:43 localhost systemd[1]: Reloading.
Feb 1 02:49:43 localhost systemd-sysv-generator[41594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:49:43 localhost systemd-rc-local-generator[41591]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:49:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:49:43 localhost systemd[1]: Starting Netfilter Tables...
Feb 1 02:49:43 localhost systemd[1]: Finished Netfilter Tables.
Feb 1 02:49:44 localhost python3[41651]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:44 localhost python3[41694]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932183.7601511-74418-214933738840930/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:44 localhost python3[41724]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:49:45 localhost python3[41742]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:49:45 localhost python3[41791]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:46 localhost python3[41834]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932185.6194124-74533-37203706481123/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:47 localhost python3[41896]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:47 localhost python3[41939]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932186.5925312-74593-69626904447346/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:47 localhost python3[42001]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:48 localhost python3[42044]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932187.6184082-74821-230289222203812/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:48 localhost python3[42106]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:49 localhost python3[42149]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932188.5510476-74899-127648450042222/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:50 localhost python3[42211]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:49:50 localhost python3[42254]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932189.4782114-74939-183599764312917/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:50 localhost python3[42284]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:49:51 localhost python3[42349]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:49:52 localhost python3[42366]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:49:52 localhost python3[42383]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:49:52 localhost python3[42402]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:49:53 localhost python3[42418]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:49:53 localhost python3[42434]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:49:53 localhost python3[42450]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 1 02:49:54 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=7 res=1
Feb 1 02:49:54 localhost python3[42470]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 1 02:49:55 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 1 02:49:55 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 1 02:49:55 localhost kernel: SELinux: policy capability open_perms=1
Feb 1 02:49:55 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 1 02:49:55 localhost kernel: SELinux: policy capability always_check_network=0
Feb 1 02:49:55 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 1 02:49:55 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 1 02:49:55 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 1 02:49:56 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=8 res=1
Feb 1 02:49:56 localhost python3[42494]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 1 02:49:57 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 1 02:49:57 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 1 02:49:57 localhost kernel: SELinux: policy capability open_perms=1
Feb 1 02:49:57 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 1 02:49:57 localhost kernel: SELinux: policy capability always_check_network=0
Feb 1 02:49:57 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 1 02:49:57 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 1 02:49:57 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 1 02:49:57 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=9 res=1
Feb 1 02:49:57 localhost python3[42515]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 1 02:49:58 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 1 02:49:58 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 1 02:49:58 localhost kernel: SELinux: policy capability open_perms=1
Feb 1 02:49:58 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 1 02:49:58 localhost kernel: SELinux: policy capability always_check_network=0
Feb 1 02:49:58 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 1 02:49:58 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 1 02:49:58 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 1 02:49:58 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=10 res=1
Feb 1 02:49:59 localhost python3[42536]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:49:59 localhost python3[42552]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:49:59 localhost python3[42568]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:00 localhost python3[42584]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:50:00 localhost python3[42600]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:50:01 localhost python3[42617]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:50:05 localhost python3[42634]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:05 localhost python3[42682]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:06 localhost python3[42725]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932205.494178-75727-225660795311492/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:06 localhost python3[42755]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 1 02:50:06 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 1 02:50:06 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 1 02:50:06 localhost systemd[1]: Stopping Load Kernel Modules...
Feb 1 02:50:06 localhost systemd[1]: Starting Load Kernel Modules...
Feb 1 02:50:06 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 1 02:50:06 localhost kernel: Bridge firewalling registered
Feb 1 02:50:06 localhost systemd-modules-load[42758]: Inserted module 'br_netfilter'
Feb 1 02:50:06 localhost systemd-modules-load[42758]: Module 'msr' is built in
Feb 1 02:50:06 localhost systemd[1]: Finished Load Kernel Modules.
Feb 1 02:50:07 localhost python3[42809]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:07 localhost python3[42852]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932207.047764-75877-151357617468705/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:08 localhost python3[42882]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:08 localhost python3[42899]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:08 localhost python3[42917]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:08 localhost python3[42935]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:09 localhost python3[42952]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:09 localhost python3[42969]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:09 localhost python3[42986]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:10 localhost python3[43004]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:10 localhost python3[43022]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:10 localhost python3[43040]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:11 localhost python3[43058]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:11 localhost python3[43076]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:11 localhost python3[43094]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:12 localhost python3[43112]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:12 localhost python3[43129]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:12 localhost python3[43146]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:13 localhost python3[43163]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:13 localhost python3[43180]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 1 02:50:13 localhost python3[43198]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 1 02:50:13 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 1 02:50:13 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 1 02:50:13 localhost systemd[1]: Stopping Apply Kernel Variables...
Feb 1 02:50:13 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 1 02:50:14 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 1 02:50:14 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 1 02:50:14 localhost python3[43218]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:14 localhost python3[43234]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:15 localhost python3[43250]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:15 localhost python3[43266]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:50:15 localhost python3[43282]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:16 localhost python3[43298]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:16 localhost python3[43314]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:16 localhost python3[43330]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:17 localhost python3[43346]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:17 localhost python3[43394]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:17 localhost python3[43437]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932217.263749-76148-7573670578534/source _original_basename=tmparmy0gx1 follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:18 localhost python3[43467]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:50:19 localhost python3[43484]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:19 localhost python3[43532]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:20 localhost python3[43575]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932219.618513-76516-111391905456786/source _original_basename=tmp86cuiii8 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:20 localhost python3[43605]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:21 localhost python3[43651]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:21 localhost python3[43688]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:21 localhost python3[43733]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:22 localhost python3[43750]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:22 localhost python3[43797]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:22 localhost python3[43813]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:22 localhost python3[43844]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:23 localhost python3[43860]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:23 localhost python3[43876]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Feb 1 02:50:24 localhost python3[43898]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604212.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 1 02:50:24 localhost python3[43922]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Feb 1 02:50:24 localhost python3[43938]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:50:25 localhost python3[43987]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:25 localhost python3[44030]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932224.9739165-76763-28122128174310/source _original_basename=tmpcuy3ekek follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:26 localhost python3[44060]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 1 02:50:26 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=11 res=1
Feb 1 02:50:27 localhost python3[44080]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:27 localhost python3[44096]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:27 localhost python3[44112]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Feb 1 02:50:29 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=12 res=1
Feb 1 02:50:29 localhost python3[44132]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:50:32 localhost python3[44149]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 1 02:50:33 localhost python3[44210]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:34 localhost python3[44226]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:50:34 localhost python3[44286]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:35 localhost python3[44329]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932234.3699625-77173-91628623265075/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=c32c4dc9916070c3326a45e554160c9ab0af1065 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:35 localhost python3[44391]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:36 localhost python3[44436]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932235.4393737-77224-18626881587807/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:36 localhost python3[44466]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:36 localhost python3[44482]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:37 localhost python3[44498]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:37 localhost python3[44514]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:38 localhost python3[44562]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:38 localhost python3[44605]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932238.0115728-77429-158137394449359/source _original_basename=tmpith6y64p follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:39 localhost python3[44635]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:39 localhost python3[44651]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 02:50:40 localhost python3[44667]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 02:50:44 localhost python3[44716]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:50:44 localhost python3[44761]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932243.784346-77665-140765426398887/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:50:45 localhost python3[44792]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:50:45 localhost systemd[1]: Stopping OpenSSH server daemon...
Feb 1 02:50:45 localhost systemd[1]: sshd.service: Deactivated successfully.
Feb 1 02:50:45 localhost systemd[1]: Stopped OpenSSH server daemon.
Feb 1 02:50:45 localhost systemd[1]: sshd.service: Consumed 2.362s CPU time, read 1.9M from disk, written 48.0K to disk.
Feb 1 02:50:45 localhost systemd[1]: Stopped target sshd-keygen.target.
Feb 1 02:50:45 localhost systemd[1]: Stopping sshd-keygen.target...
Feb 1 02:50:45 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 1 02:50:45 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 1 02:50:45 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 1 02:50:45 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 1 02:50:45 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 1 02:50:45 localhost sshd[44796]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 02:50:45 localhost systemd[1]: Started OpenSSH server daemon.
Feb 1 02:50:45 localhost python3[44812]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 02:50:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 02:50:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3253 writes, 16K keys, 3253 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s#012Cumulative WAL: 3253 writes, 142 syncs, 22.91 writes per sync, written: 0.01 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3253 writes, 16K keys, 3253 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s#012Interval WAL: 3253 writes, 142 syncs, 22.91 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB)
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 
2 last_copies: 8 last_secs: 3.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Feb 1 02:50:46 localhost python3[44830]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:46 localhost python3[44848]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:50:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 02:50:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3384 writes, 16K keys, 3384 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3384 writes, 195 syncs, 17.35 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3384 writes, 16K 
keys, 3384 commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s#012Interval WAL: 3384 writes, 195 syncs, 17.35 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Feb 1 02:50:50 localhost python3[44897]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:50:51 localhost python3[44915]: 
ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:50:51 localhost python3[44945]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:50:52 localhost python3[44995]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:50:52 localhost python3[45013]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:50:53 localhost python3[45043]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:50:53 localhost systemd[1]: Reloading. Feb 1 02:50:53 localhost systemd-rc-local-generator[45071]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:50:53 localhost systemd-sysv-generator[45074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:50:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:50:53 localhost systemd[1]: Starting chronyd online sources service... Feb 1 02:50:53 localhost chronyc[45083]: 200 OK Feb 1 02:50:53 localhost systemd[1]: chrony-online.service: Deactivated successfully. Feb 1 02:50:53 localhost systemd[1]: Finished chronyd online sources service. 
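The chrony-online.service start above was issued through the Ansible systemd module with every parameter logged verbatim. A minimal sketch of the task that would produce that invocation, assuming a conventional playbook layout (the task name is an assumption; the module parameters are taken directly from the record):

    - name: Start and enable chrony-online.service  # task name is hypothetical
      systemd:
        name: chrony-online.service
        state: started
        enabled: true
        daemon_reload: true  # matches daemon_reload=True in the logged invocation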
Feb 1 02:50:54 localhost python3[45099]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:54 localhost chronyd[25994]: System clock was stepped by 0.000036 seconds Feb 1 02:50:54 localhost python3[45116]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:54 localhost python3[45133]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:54 localhost chronyd[25994]: System clock was stepped by 0.000000 seconds Feb 1 02:50:55 localhost python3[45150]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:55 localhost python3[45167]: ansible-timezone Invoked with name=UTC hwclock=None Feb 1 02:50:55 localhost systemd[1]: Starting Time & Date Service... Feb 1 02:50:55 localhost systemd[1]: Started Time & Date Service. Feb 1 02:50:56 localhost python3[45187]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:57 localhost python3[45204]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:50:58 localhost python3[45221]: ansible-slurp Invoked with src=/etc/tuned/active_profile Feb 1 02:50:58 localhost python3[45237]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:50:58 localhost python3[45253]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 02:50:59 localhost python3[45269]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 02:50:59 localhost python3[45317]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:51:00 localhost python3[45360]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932259.4893556-78658-176349789089326/source _original_basename=tmpw0f24v23 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:00 localhost python3[45422]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:51:01 localhost python3[45465]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932260.3792288-78700-54051348764864/source _original_basename=tmp3vz00moa follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:01 localhost python3[45495]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 1 02:51:01 localhost systemd[1]: Reloading. Feb 1 02:51:01 localhost systemd-rc-local-generator[45522]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:51:01 localhost systemd-sysv-generator[45529]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:51:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:51:02 localhost python3[45550]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 02:51:03 localhost python3[45566]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:51:03 localhost systemd[35572]: Created slice User Background Tasks Slice. Feb 1 02:51:03 localhost systemd[35572]: Starting Cleanup of User's Temporary Files and Directories... Feb 1 02:51:03 localhost systemd[35572]: Finished Cleanup of User's Temporary Files and Directories. Feb 1 02:51:03 localhost python3[45584]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:51:03 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully. 
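The ns_temp add/delete pair above is a quick probe that network namespaces can be created before the Neutron services rely on them; systemd's run-netns-ns_temp.mount cleanup confirms the delete. Reconstructed as a task pair from the logged _raw_params (task names are assumptions):

    - name: Probe that a network namespace can be created  # hypothetical name
      command: ip netns add ns_temp

    - name: Remove the probe namespace  # hypothetical name
      command: ip netns delete ns_temp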
Feb 1 02:51:03 localhost python3[45601]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 02:51:04 localhost python3[45617]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:04 localhost python3[45665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:51:05 localhost python3[45708]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932264.611025-78971-104679794532086/source _original_basename=tmp83b8kmqw follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:25 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Feb 1 02:51:28 localhost python3[45817]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 1 02:51:28 localhost python3[45833]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Feb 1 02:51:28 localhost python3[45849]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 02:51:29 localhost python3[45865]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:29 localhost python3[45881]: ansible-file Invoked 
with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:29 localhost python3[45897]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Feb 1 02:51:30 localhost kernel: SELinux: Converting 2707 SID table entries... Feb 1 02:51:30 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 02:51:30 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 02:51:30 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 02:51:30 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 02:51:30 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 02:51:30 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 02:51:30 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 02:51:30 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=13 res=1 Feb 1 02:51:31 localhost python3[45919]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 02:51:33 localhost python3[46056]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 
'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [ Feb 1 02:51:33 localhost rsyslogd[758]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Feb 1 02:51:33 localhost python3[46072]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None attributes=None Feb 1 02:51:33 localhost python3[46088]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 1 02:51:34 localhost python3[46104]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': 
'/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n -iNONE', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}} Feb 1 02:51:39 localhost python3[46152]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:51:39 localhost python3[46195]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932299.1082778-80416-61611482694553/source _original_basename=tmp1rrdgy7z follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:51:40 localhost python3[46225]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:51:42 localhost python3[46348]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 02:51:44 localhost python3[46470]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 1 02:51:45 localhost python3[46486]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:51:46 localhost python3[46503]: ansible-ansible.legacy.dnf Invoked 
with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 02:51:51 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 1 02:51:51 localhost dbus-broker-launch[18418]: Noticed file-system modification, trigger reload. Feb 1 02:51:51 localhost dbus-broker-launch[18418]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored Feb 1 02:51:51 localhost dbus-broker-launch[18418]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored Feb 1 02:51:51 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 1 02:51:51 localhost systemd[1]: Reexecuting. Feb 1 02:51:51 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Feb 1 02:51:51 localhost systemd[1]: Detected virtualization kvm. Feb 1 02:51:51 localhost systemd[1]: Detected architecture x86-64. Feb 1 02:51:51 localhost systemd-sysv-generator[46561]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:51:51 localhost systemd-rc-local-generator[46557]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:51:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:51:53 localhost sshd[46578]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:52:02 localhost kernel: SELinux: Converting 2707 SID table entries... Feb 1 02:52:02 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 02:52:02 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 02:52:02 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 02:52:02 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 02:52:02 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 02:52:02 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 02:52:02 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 02:52:02 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 1 02:52:02 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=14 res=1 Feb 1 02:52:02 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 1 02:52:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 02:52:04 localhost systemd[1]: Starting man-db-cache-update.service... 
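The dnf invocation that opens the lines above installs systemd-container ahead of the container-management steps that follow. As a task sketch built from the logged parameters (task name assumed; all other module defaults as logged):

    - name: Ensure systemd-container is installed  # hypothetical name
      dnf:
        name: systemd-container
        state: present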
Feb 1 02:52:04 localhost systemd[1]: Reloading. Feb 1 02:52:04 localhost systemd-rc-local-generator[46712]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:52:04 localhost systemd-sysv-generator[46716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:52:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:52:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 02:52:04 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 02:52:04 localhost systemd-journald[618]: Journal stopped Feb 1 02:52:04 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd). Feb 1 02:52:04 localhost systemd[1]: Stopping Journal Service... Feb 1 02:52:04 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files... Feb 1 02:52:04 localhost systemd[1]: systemd-journald.service: Deactivated successfully. Feb 1 02:52:04 localhost systemd[1]: Stopped Journal Service. Feb 1 02:52:04 localhost systemd[1]: systemd-journald.service: Consumed 1.908s CPU time. Feb 1 02:52:04 localhost systemd[1]: Starting Journal Service... Feb 1 02:52:04 localhost systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 1 02:52:04 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files. Feb 1 02:52:04 localhost systemd[1]: systemd-udevd.service: Consumed 2.988s CPU time. Feb 1 02:52:04 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files... Feb 1 02:52:04 localhost systemd-journald[47041]: Journal started Feb 1 02:52:04 localhost systemd-journald[47041]: Runtime Journal (/run/log/journal/00836dadc27b01f9fb0a211cca69e688) is 12.1M, max 314.7M, 302.6M free. Feb 1 02:52:04 localhost systemd[1]: Started Journal Service. Feb 1 02:52:04 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Feb 1 02:52:04 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 02:52:04 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 02:52:04 localhost systemd-udevd[47048]: Using default interface naming scheme 'rhel-9.0'. Feb 1 02:52:04 localhost systemd[1]: Started Rule-based Manager for Device Events and Files. Feb 1 02:52:04 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 02:52:04 localhost systemd[1]: Reloading. Feb 1 02:52:05 localhost systemd-rc-local-generator[47591]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:52:05 localhost systemd-sysv-generator[47596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:52:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Feb 1 02:52:05 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 02:52:05 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 02:52:05 localhost systemd[1]: Finished man-db-cache-update.service. Feb 1 02:52:05 localhost systemd[1]: man-db-cache-update.service: Consumed 1.632s CPU time. Feb 1 02:52:05 localhost systemd[1]: run-r94ab942d42ac48fabd61641066b18753.service: Deactivated successfully. Feb 1 02:52:05 localhost systemd[1]: run-rf76c896af45c488682a8b231294d1ceb.service: Deactivated successfully. Feb 1 02:52:07 localhost python3[47996]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False Feb 1 02:52:07 localhost python3[48015]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 02:52:08 localhost python3[48033]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:52:08 localhost python3[48033]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json Feb 1 02:52:09 localhost python3[48033]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false Feb 1 02:52:16 localhost podman[48045]: 2026-02-01 07:52:09.087844819 +0000 UTC m=+0.038229595 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 1 02:52:16 localhost python3[48033]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 591bb9fb46a70e9f840f28502388406078442df6b6701a3c17990ee75e333673 --format json Feb 1 02:52:16 localhost python3[48146]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:52:16 localhost python3[48146]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json Feb 1 02:52:16 localhost python3[48146]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false Feb 1 02:52:23 localhost podman[48159]: 2026-02-01 
07:52:16.879409629 +0000 UTC m=+0.043801960 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 1 02:52:23 localhost python3[48146]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d59b33e7fb841c47a47a12b18fb68b11debd968b4596c63f3177ecc7400fb1bc --format json Feb 1 02:52:24 localhost python3[48263]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:52:24 localhost python3[48263]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json Feb 1 02:52:24 localhost python3[48263]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false Feb 1 02:52:26 localhost podman[48410]: 2026-02-01 07:52:26.389824601 +0000 UTC m=+0.092731563 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-type=git) Feb 1 02:52:26 localhost podman[48410]: 2026-02-01 07:52:26.467371713 +0000 UTC m=+0.170278695 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, 
url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Feb 1 02:52:41 localhost podman[48276]: 2026-02-01 07:52:24.286513765 +0000 UTC m=+0.042932903 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 02:52:41 localhost python3[48263]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 6eddd23e1e6adfbfa713a747123707c02f92ffdbf1913da92f171aba1d6d7856 --format json Feb 1 02:52:41 localhost python3[49372]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:52:41 localhost python3[49372]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json Feb 1 02:52:41 localhost python3[49372]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false Feb 1 02:52:55 localhost podman[49385]: 2026-02-01 07:52:41.725375188 +0000 UTC m=+0.044371267 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 02:52:55 localhost python3[49372]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 2c8610235afe953aa46efb141a5a988799548b22280d65a7e7ab21889422df37 --format json Feb 1 02:52:55 localhost python3[49473]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:52:55 localhost python3[49473]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json Feb 1 02:52:55 localhost python3[49473]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false Feb 1 02:53:03 localhost podman[49486]: 2026-02-01 07:52:55.710849557 +0000 UTC m=+0.048950026 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 1 02:53:03 localhost 
python3[49473]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ab5aab6d0c3ec80926032b7acf4cec1d4710f1c2daccd17ae4daa64399ec237 --format json Feb 1 02:53:03 localhost python3[49631]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:53:03 localhost python3[49631]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json Feb 1 02:53:03 localhost python3[49631]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false Feb 1 02:53:08 localhost podman[49644]: 2026-02-01 07:53:03.97637392 +0000 UTC m=+0.050649447 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 1 02:53:08 localhost python3[49631]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 4853142d85dba3766b28d28ae195b26f7242230fe3646e9590a7aee2dc2e0dfa --format json Feb 1 02:53:08 localhost python3[49723]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:53:08 localhost python3[49723]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json Feb 1 02:53:08 localhost python3[49723]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false Feb 1 02:53:10 localhost podman[49737]: 2026-02-01 07:53:08.700312143 +0000 UTC m=+0.051405080 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 1 02:53:10 localhost python3[49723]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ac6ea63c0fb4851145e847f9ced2f20804afc8472907b63a82d5866f5cf608a --format json Feb 1 02:53:11 localhost python3[49816]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:53:11 localhost python3[49816]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls 
registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json Feb 1 02:53:11 localhost python3[49816]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false Feb 1 02:53:12 localhost podman[49827]: 2026-02-01 07:53:11.165321499 +0000 UTC m=+0.034660392 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 1 02:53:12 localhost python3[49816]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ba1a08ea1c1207b471b1f02cee16ff456b8a812662cce16906d16de330a66d63 --format json Feb 1 02:53:13 localhost python3[49905]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:53:13 localhost python3[49905]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json Feb 1 02:53:13 localhost python3[49905]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false Feb 1 02:53:15 localhost podman[49919]: 2026-02-01 07:53:13.330254154 +0000 UTC m=+0.029905228 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 1 02:53:15 localhost python3[49905]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 8576d3a17e57ea28f29435f132f583320941b5aa7bf0aa02e998b09a094d1fe8 --format json Feb 1 02:53:15 localhost python3[49997]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:53:15 localhost python3[49997]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json Feb 1 02:53:15 localhost python3[49997]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false Feb 1 02:53:19 localhost podman[50010]: 2026-02-01 07:53:15.693092051 +0000 UTC m=+0.048587004 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 1 02:53:19 localhost python3[49997]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7fcbf63c0504494c8fcaa07583f909a06486472a0982aeac9554c6fdbeb04c9a --format json Feb 1 02:53:19 localhost python3[50101]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False 
state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 1 02:53:19 localhost python3[50101]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json Feb 1 02:53:19 localhost python3[50101]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false Feb 1 02:53:21 localhost podman[50113]: 2026-02-01 07:53:19.544345016 +0000 UTC m=+0.045268464 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 1 02:53:21 localhost python3[50101]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 72ddf109f135b64d3116af7b84caaa358dc72e2e60f4c8753fa54fa65b76ba35 --format json Feb 1 02:53:21 localhost python3[50189]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:53:23 localhost ansible-async_wrapper.py[50361]: Invoked with 86170463697 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932403.0301359-83129-16634298973045/AnsiballZ_command.py _ Feb 1 02:53:23 localhost ansible-async_wrapper.py[50364]: Starting module and watcher Feb 1 02:53:23 localhost ansible-async_wrapper.py[50364]: Start watching 50365 (3600) Feb 1 02:53:23 localhost ansible-async_wrapper.py[50365]: Start module (50365) Feb 1 02:53:23 localhost ansible-async_wrapper.py[50361]: Return async_wrapper task started. Feb 1 02:53:23 localhost python3[50385]: ansible-ansible.legacy.async_status Invoked with jid=86170463697.50361 mode=status _async_dir=/tmp/.ansible_async Feb 1 02:53:27 localhost puppet-user[50383]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 02:53:27 localhost puppet-user[50383]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:27 localhost puppet-user[50383]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:27 localhost puppet-user[50383]: (file & line not available) Feb 1 02:53:27 localhost puppet-user[50383]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:27 localhost puppet-user[50383]: (file & line not available) Feb 1 02:53:27 localhost puppet-user[50383]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 1 02:53:28 localhost puppet-user[50383]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 1 02:53:28 localhost puppet-user[50383]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.17 seconds Feb 1 02:53:28 localhost puppet-user[50383]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully Feb 1 02:53:28 localhost puppet-user[50383]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created Feb 1 02:53:28 localhost puppet-user[50383]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully Feb 1 02:53:28 localhost puppet-user[50383]: Notice: Applied catalog in 0.08 seconds Feb 1 02:53:28 localhost puppet-user[50383]: Application: Feb 1 02:53:28 localhost puppet-user[50383]: Initial environment: production Feb 1 02:53:28 localhost puppet-user[50383]: Converged environment: production Feb 1 02:53:28 localhost puppet-user[50383]: Run mode: user Feb 1 02:53:28 localhost puppet-user[50383]: Changes: Feb 1 02:53:28 localhost puppet-user[50383]: Total: 3 Feb 1 02:53:28 localhost puppet-user[50383]: Events: Feb 1 02:53:28 localhost puppet-user[50383]: Success: 3 Feb 1 02:53:28 localhost puppet-user[50383]: Total: 3 Feb 1 02:53:28 localhost puppet-user[50383]: Resources: Feb 1 02:53:28 localhost puppet-user[50383]: Changed: 3 Feb 1 02:53:28 localhost puppet-user[50383]: Out of sync: 3 Feb 1 02:53:28 localhost puppet-user[50383]: Total: 10 Feb 1 02:53:28 localhost puppet-user[50383]: Time: Feb 1 02:53:28 localhost puppet-user[50383]: Schedule: 0.00 Feb 1 02:53:28 localhost puppet-user[50383]: File: 0.00 Feb 1 02:53:28 localhost puppet-user[50383]: Exec: 0.01 Feb 1 02:53:28 localhost puppet-user[50383]: Augeas: 0.04 Feb 1 02:53:28 localhost puppet-user[50383]: Transaction evaluation: 0.07 Feb 1 02:53:28 localhost puppet-user[50383]: Catalog application: 0.08 Feb 1 02:53:28 localhost puppet-user[50383]: Config retrieval: 0.22 Feb 1 02:53:28 localhost puppet-user[50383]: Last run: 1769932408 Feb 1 02:53:28 localhost puppet-user[50383]: Filebucket: 0.00 Feb 1 02:53:28 localhost puppet-user[50383]: Total: 0.08 Feb 1 02:53:28 localhost puppet-user[50383]: Version: Feb 1 02:53:28 localhost puppet-user[50383]: Config: 1769932407 Feb 1 02:53:28 localhost puppet-user[50383]: Puppet: 7.10.0 Feb 1 02:53:28 localhost ansible-async_wrapper.py[50365]: Module complete (50365) Feb 1 02:53:28 localhost ansible-async_wrapper.py[50364]: Done in kid B. 
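
The puppet apply above ran inside an ansible async job: ansible-async_wrapper.py forks the module, and the controller polls async_status with jid=86170463697.50361 and _async_dir=/tmp/.ansible_async (seen again just below). By Ansible's async convention the wrapper maintains a JSON status file named after the jid under that directory, including a "finished" flag; a rough polling sketch under that assumption (the helper itself is illustrative, not TripleO code):

    import json
    import time
    from pathlib import Path

    def wait_for_async(jid: str, async_dir: str = "/tmp/.ansible_async",
                       timeout: float = 3600, poll: float = 5) -> dict:
        """Poll the async status file until the job reports finished."""
        status_file = Path(async_dir) / jid
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            try:
                # The wrapper rewrites this file as the job progresses.
                data = json.loads(status_file.read_text())
            except (OSError, json.JSONDecodeError):
                data = {}  # not written yet, or caught mid-rewrite
            if data.get("finished"):
                return data
            time.sleep(poll)
        raise TimeoutError(f"async job {jid} did not finish within {timeout}s")

Here wait_for_async("86170463697.50361") would return the final result dict that the async_status calls in this log are checking for.
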
Feb 1 02:53:34 localhost python3[50512]: ansible-ansible.legacy.async_status Invoked with jid=86170463697.50361 mode=status _async_dir=/tmp/.ansible_async Feb 1 02:53:34 localhost python3[50528]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 02:53:35 localhost python3[50544]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:53:35 localhost python3[50592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:53:36 localhost python3[50635]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932415.460875-83514-91569104865966/source _original_basename=tmp40iebw7d follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 02:53:36 localhost python3[50665]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:53:37 localhost python3[50768]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 1 02:53:38 localhost python3[50787]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 02:53:38 localhost python3[50803]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005604212 step=1 update_config_hash_only=False Feb 1 02:53:39 localhost python3[50819]: ansible-file Invoked with 
path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:53:39 localhost python3[50865]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 1 02:53:40 localhost python3[50914]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Feb 1 02:53:40 localhost python3[50955]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Feb 1 02:53:41 localhost podman[51120]: 2026-02-01 07:53:41.318880487 +0000 UTC m=+0.089245488 container create 8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, release=1766032510, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc., container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 02:53:41 localhost podman[51141]: 2026-02-01 07:53:41.355916581 +0000 UTC m=+0.113492043 container create 077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 02:53:41 localhost systemd[1]: Started libpod-conmon-8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076.scope. 
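
Each container-puppet-* create event above embeds the config_data that tripleo_container_manage received: the container-puppet.sh entrypoint, a NAME/STEP/PUPPET_TAGS environment, host networking, and read-only mounts of /etc/puppet and the puppet modules, with /var/lib/config-data as the one writable target where generated service configuration lands. As a rough illustration of how one such entry corresponds to a podman invocation (the module drives podman itself; the flag mapping below is an approximation built from a trimmed copy of the iscsid entry logged above):

    import shlex

    config_data = {
        "user": 0,
        "net": ["host"],
        "security_opt": ["label=disable"],
        "entrypoint": "/var/lib/container-puppet/container-puppet.sh",
        "environment": {"STEP": 6, "NAME": "iscsid",
                        "PUPPET_TAGS": "file,file_line,concat,augeas,cron,iscsid_config"},
        "image": "registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1",
        "volumes": ["/etc/puppet:/tmp/puppet-etc:ro",
                    "/var/lib/config-data:/var/lib/config-data:rw"],
    }

    def to_podman_cmd(name: str, cfg: dict) -> str:
        """Render a config_data entry as an equivalent podman run command line."""
        cmd = ["podman", "run", "--name", name, "--user", str(cfg["user"])]
        for net in cfg.get("net", []):
            cmd += ["--net", net]
        for opt in cfg.get("security_opt", []):
            cmd += ["--security-opt", opt]
        for key, val in cfg.get("environment", {}).items():
            cmd += ["--env", f"{key}={val}"]
        for vol in cfg.get("volumes", []):
            cmd += ["--volume", vol]
        cmd += ["--entrypoint", cfg["entrypoint"], cfg["image"]]
        return shlex.join(cmd)

    print(to_podman_cmd("container-puppet-iscsid", config_data))
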
Feb 1 02:53:41 localhost podman[51120]: 2026-02-01 07:53:41.267792638 +0000 UTC m=+0.038157649 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 1 02:53:41 localhost podman[51169]: 2026-02-01 07:53:41.385829998 +0000 UTC m=+0.095213199 container create 3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com) Feb 1 02:53:41 localhost systemd[1]: Started libcrun container. Feb 1 02:53:41 localhost systemd[1]: Started libpod-conmon-077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518.scope. 
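
Note how systemd brackets every container start here: podman delegates the container's monitor process to a transient libpod-conmon-<full container id>.scope unit, and the container itself then runs under a scope that systemd reports as "Started libcrun container." The scope names can be correlated back to containers from podman's JSON output (field names as podman ps emits them; the helper is a sketch):

    import json
    import subprocess

    ps = subprocess.run(["podman", "ps", "--all", "--format", "json"],
                        capture_output=True, text=True, check=True)
    for ctr in json.loads(ps.stdout):
        # "Id" is the full 64-hex id that appears in the conmon scope name.
        print(f"{ctr['Names'][0]}: libpod-conmon-{ctr['Id']}.scope")
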
Feb 1 02:53:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1802b2d0e385fe8def6f3d058db8d9dc0c862ef018584968ffe99962c492a423/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:41 localhost podman[51176]: 2026-02-01 07:53:41.393138889 +0000 UTC m=+0.100798728 container create 1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_puppet_step1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-nova_libvirt, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container) Feb 1 02:53:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1802b2d0e385fe8def6f3d058db8d9dc0c862ef018584968ffe99962c492a423/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:41 localhost systemd[1]: Started libcrun container. Feb 1 02:53:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c7ec790140e1e90bc11292f42ed59a62edd99a1fd979365ce8afccf944e3222/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:41 localhost podman[51141]: 2026-02-01 07:53:41.40997297 +0000 UTC m=+0.167548442 container init 077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, container_name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
build-date=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 1 02:53:41 localhost podman[51141]: 2026-02-01 07:53:41.311219225 +0000 UTC m=+0.068794697 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 1 02:53:41 localhost podman[51141]: 2026-02-01 07:53:41.420776268 +0000 UTC m=+0.178351750 container start 077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=container-puppet-metrics_qdr, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 02:53:41 localhost podman[51141]: 2026-02-01 07:53:41.422653205 +0000 UTC m=+0.180228877 container attach 077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, maintainer=OpenStack TripleO Team, 
name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_puppet_step1, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13) Feb 1 02:53:41 localhost podman[51176]: 2026-02-01 07:53:41.322088135 +0000 UTC m=+0.029747984 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 02:53:41 localhost systemd[1]: Started libpod-conmon-1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25.scope. Feb 1 02:53:41 localhost podman[51169]: 2026-02-01 07:53:41.34833773 +0000 UTC m=+0.057720931 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 1 02:53:41 localhost systemd[1]: Started libcrun container. Feb 1 02:53:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f95e8251fb179e771f128b38ef4d608f2b943f904632add746cd54aaa3444e8d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:41 localhost systemd[1]: Started libpod-conmon-3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0.scope. Feb 1 02:53:41 localhost systemd[1]: Started libcrun container. 
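
The metrics_qdr container above walks the full podman lifecycle within a second: image pull, create, init, start, attach (attach because config_data sets detach to False, so the caller blocks until the one-shot puppet run exits; the matching exec_died pattern appears earlier for the ceph crash container). The same event stream can be watched live; the flags below are standard podman options, with the container name taken from this log:

    import subprocess

    # Stream lifecycle events (create/init/start/attach/died/...) for one
    # container as newline-delimited JSON; blocks until interrupted.
    proc = subprocess.Popen(
        ["podman", "events", "--format", "json",
         "--filter", "container=container-puppet-metrics_qdr"],
        stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        print(line.rstrip())  # one JSON object per event
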
Feb 1 02:53:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9acaf9dad4414fbe24154776e6121c44ca4dc128df9f993be24501b6fc5f7e69/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:42 localhost podman[51120]: 2026-02-01 07:53:42.498214358 +0000 UTC m=+1.268579389 container init 8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=container-puppet-iscsid, vcs-type=git) Feb 1 02:53:42 localhost podman[51120]: 2026-02-01 07:53:42.519465682 +0000 UTC m=+1.289830693 container start 8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, 
config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:34:43Z, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) 
Feb 1 02:53:42 localhost podman[51120]: 2026-02-01 07:53:42.519761231 +0000 UTC m=+1.290126232 container attach 8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=container-puppet-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid) Feb 1 02:53:42 localhost podman[51169]: 2026-02-01 07:53:42.550441561 +0000 UTC m=+1.259824762 container init 3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 02:53:42 localhost podman[51169]: 2026-02-01 07:53:42.561881129 +0000 UTC m=+1.271264340 container start 3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, vcs-type=git, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, container_name=container-puppet-collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, 
config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.buildah.version=1.41.5) Feb 1 02:53:42 localhost podman[51169]: 2026-02-01 07:53:42.562277811 +0000 UTC m=+1.271661012 container attach 3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=container-puppet-collectd, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 02:53:42 localhost podman[51176]: 2026-02-01 07:53:42.573582314 +0000 UTC m=+1.281242193 container init 1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, container_name=container-puppet-nova_libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:31:49Z, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt) Feb 1 02:53:42 localhost podman[51161]: 2026-02-01 07:53:42.576972087 +0000 UTC m=+1.304426066 container create e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, vendor=Red Hat, Inc.) Feb 1 02:53:42 localhost podman[51176]: 2026-02-01 07:53:42.603080988 +0000 UTC m=+1.310740847 container start 1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_id=tripleo_puppet_step1, io.openshift.expose-services=, version=17.1.13, release=1766032510, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=container-puppet-nova_libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, tcib_managed=true, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 1 02:53:42 localhost podman[51176]: 2026-02-01 07:53:42.603458129 +0000 UTC m=+1.311117968 container attach 1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, container_name=container-puppet-nova_libvirt, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, release=1766032510)
Feb 1 02:53:42 localhost podman[51161]: 2026-02-01 07:53:42.528371703 +0000 UTC m=+1.255825652 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 1 02:53:42 localhost systemd[1]: Started libpod-conmon-e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e.scope.
Feb 1 02:53:42 localhost systemd[1]: Started libcrun container.
Feb 1 02:53:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f37f2c5aca5d6b7cf023a2735a5fd14989ee6decd2a2c44bb6d47fd78dfeef3e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 1 02:53:42 localhost podman[51161]: 2026-02-01 07:53:42.711234119 +0000 UTC m=+1.438688088 container init e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-cron-container, url=https://www.redhat.com,
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1) Feb 1 02:53:42 localhost podman[51161]: 2026-02-01 07:53:42.720408937 +0000 UTC m=+1.447862886 container start e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 02:53:42 localhost podman[51161]: 2026-02-01 07:53:42.720704616 +0000 UTC m=+1.448158575 container attach e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, container_name=container-puppet-crond, io.openshift.expose-services=, release=1766032510, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, summary=Red Hat 
OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 02:53:43 localhost podman[51034]: 2026-02-01 07:53:41.165203756 +0000 UTC m=+0.030855417 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 1 02:53:43 localhost podman[51369]: 2026-02-01 07:53:43.452514002 +0000 UTC m=+0.093058623 container create c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, org.opencontainers.image.created=2026-01-12T23:07:24Z, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2026-01-12T23:07:24Z, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-central, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central) Feb 1 02:53:43 localhost systemd[1]: Started libpod-conmon-c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291.scope. Feb 1 02:53:43 localhost systemd[1]: Started libcrun container. 
Feb 1 02:53:43 localhost podman[51369]: 2026-02-01 07:53:43.40462724 +0000 UTC m=+0.045171911 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 1 02:53:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81d35b92007b3f6ce5557fdf3066e145410fe8d50cd18a17188fe6e802a41d49/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:43 localhost podman[51369]: 2026-02-01 07:53:43.525564699 +0000 UTC m=+0.166109300 container init c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2026-01-12T23:07:24Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:24Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.component=openstack-ceilometer-central-container, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, container_name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 02:53:43 localhost podman[51369]: 2026-02-01 07:53:43.54110444 +0000 UTC 
m=+0.181649031 container start c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:24Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:24Z, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-ceilometer-central, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-central-container, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central) Feb 1 02:53:43 localhost podman[51369]: 2026-02-01 07:53:43.541386038 +0000 UTC m=+0.181930629 container attach c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, version=17.1.13, 
name=rhosp-rhel9/openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, architecture=x86_64, build-date=2026-01-12T23:07:24Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:24Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ceilometer-central-container, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 02:53:44 localhost systemd[1]: tmp-crun.YhrCVz.mount: Deactivated successfully. Feb 1 02:53:44 localhost ovs-vsctl[51454]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Feb 1 02:53:44 localhost puppet-user[51290]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 02:53:44 localhost puppet-user[51290]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:44 localhost puppet-user[51290]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:44 localhost puppet-user[51290]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51308]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Feb 1 02:53:44 localhost puppet-user[51308]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:44 localhost puppet-user[51308]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:44 localhost puppet-user[51308]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51290]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:44 localhost puppet-user[51290]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51308]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:44 localhost puppet-user[51308]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51290]: Notice: Accepting previously invalid value for target type 'Integer' Feb 1 02:53:44 localhost puppet-user[51320]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 02:53:44 localhost puppet-user[51320]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:44 localhost puppet-user[51320]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:44 localhost puppet-user[51320]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51292]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 02:53:44 localhost puppet-user[51292]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:44 localhost puppet-user[51292]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:44 localhost puppet-user[51292]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51320]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:44 localhost puppet-user[51320]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51290]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.14 seconds Feb 1 02:53:44 localhost puppet-user[51292]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:44 localhost puppet-user[51292]: (file & line not available) Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}ffb65dc358aca94506f86486de9f02ca445b94fb37daa17fd674df47787c99c7' Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Feb 1 02:53:44 localhost puppet-user[51290]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Feb 1 02:53:44 localhost puppet-user[51290]: Notice: Applied catalog in 0.03 seconds Feb 1 02:53:44 localhost puppet-user[51292]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.11 seconds Feb 1 02:53:44 localhost puppet-user[51290]: Application: Feb 1 02:53:44 localhost puppet-user[51290]: Initial environment: production Feb 1 02:53:44 localhost puppet-user[51290]: Converged environment: production Feb 1 02:53:44 localhost puppet-user[51290]: Run mode: user Feb 1 02:53:44 localhost puppet-user[51290]: Changes: Feb 1 02:53:44 localhost puppet-user[51290]: Total: 7 Feb 1 02:53:44 localhost puppet-user[51290]: Events: Feb 1 02:53:44 localhost puppet-user[51290]: Success: 7 Feb 1 02:53:44 localhost puppet-user[51290]: Total: 7 Feb 1 02:53:44 localhost puppet-user[51290]: Resources: Feb 1 02:53:44 localhost puppet-user[51290]: Skipped: 13 Feb 1 02:53:44 localhost puppet-user[51290]: Changed: 5 Feb 1 02:53:44 localhost puppet-user[51290]: Out of sync: 5 Feb 1 02:53:44 localhost puppet-user[51290]: Total: 20 Feb 1 02:53:44 localhost puppet-user[51290]: Time: Feb 1 02:53:44 localhost puppet-user[51290]: File: 0.01 Feb 1 02:53:44 localhost puppet-user[51290]: Transaction evaluation: 0.03 Feb 1 02:53:44 localhost puppet-user[51290]: Catalog application: 0.03 Feb 1 02:53:44 localhost puppet-user[51290]: Config retrieval: 0.17 Feb 1 02:53:44 localhost puppet-user[51290]: Last run: 1769932424 Feb 1 02:53:44 localhost puppet-user[51290]: Total: 0.03 Feb 1 02:53:44 localhost puppet-user[51290]: Version: Feb 1 02:53:44 localhost puppet-user[51290]: Config: 1769932424 Feb 1 02:53:44 localhost puppet-user[51290]: Puppet: 7.10.0 Feb 1 02:53:44 localhost puppet-user[51292]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Feb 1 02:53:44 localhost puppet-user[51292]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Feb 1 02:53:44 localhost puppet-user[51337]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5
Feb 1 02:53:44 localhost puppet-user[51337]: (file: /etc/puppet/hiera.yaml)
Feb 1 02:53:44 localhost puppet-user[51337]: Warning: Undefined variable '::deploy_config_name';
Feb 1 02:53:44 localhost puppet-user[51337]: (file & line not available)
Feb 1 02:53:44 localhost puppet-user[51292]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Feb 1 02:53:44 localhost puppet-user[51337]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 1 02:53:44 localhost puppet-user[51337]: (file & line not available)
Feb 1 02:53:44 localhost puppet-user[51320]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed
Feb 1 02:53:44 localhost puppet-user[51320]: in a future release. Use nova::cinder::os_region_name instead
Feb 1 02:53:44 localhost puppet-user[51320]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed
Feb 1 02:53:44 localhost puppet-user[51320]: in a future release. Use nova::cinder::catalog_info instead
Feb 1 02:53:44 localhost puppet-user[51337]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.10 seconds
Feb 1 02:53:44 localhost puppet-user[51308]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.36 seconds
Feb 1 02:53:44 localhost puppet-user[51337]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Feb 1 02:53:44 localhost puppet-user[51337]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Feb 1 02:53:44 localhost puppet-user[51337]: Notice: Applied catalog in 0.07 seconds
Feb 1 02:53:44 localhost puppet-user[51337]: Application:
Feb 1 02:53:44 localhost puppet-user[51337]: Initial environment: production
Feb 1 02:53:44 localhost puppet-user[51337]: Converged environment: production
Feb 1 02:53:44 localhost puppet-user[51337]: Run mode: user
Feb 1 02:53:44 localhost puppet-user[51337]: Changes:
Feb 1 02:53:44 localhost puppet-user[51337]: Total: 2
Feb 1 02:53:44 localhost puppet-user[51337]: Events:
Feb 1 02:53:44 localhost puppet-user[51337]: Success: 2
Feb 1 02:53:44 localhost puppet-user[51337]: Total: 2
Feb 1 02:53:44 localhost puppet-user[51337]: Resources:
Feb 1 02:53:44 localhost puppet-user[51337]: Changed: 2
Feb 1 02:53:44 localhost puppet-user[51337]: Out of sync: 2
Feb 1 02:53:44 localhost puppet-user[51337]: Skipped: 7
Feb 1 02:53:44 localhost puppet-user[51337]: Total: 9
Feb 1 02:53:44 localhost puppet-user[51337]: Time:
Feb 1 02:53:44 localhost puppet-user[51337]: File: 0.02
Feb 1 02:53:44 localhost puppet-user[51337]: Cron: 0.02
Feb 1 02:53:44 localhost puppet-user[51337]: Transaction evaluation: 0.06
Feb 1 02:53:44 localhost puppet-user[51337]: Catalog application: 0.07
Feb 1 02:53:44 localhost puppet-user[51337]: Config retrieval: 0.13
Feb 1 02:53:44 localhost puppet-user[51337]: Last run: 1769932424
Feb 1 02:53:44 localhost puppet-user[51337]: Total: 0.07
Feb 1 02:53:44 localhost puppet-user[51337]: Version:
Feb 1 02:53:44 localhost puppet-user[51337]: Config: 1769932424
Feb 1 02:53:44 localhost puppet-user[51337]: Puppet: 7.10.0
Feb 1 02:53:44 localhost puppet-user[51320]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Feb 1 02:53:44 localhost systemd[1]: libpod-077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518.scope: Deactivated successfully.
Feb 1 02:53:44 localhost systemd[1]: libpod-077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518.scope: Consumed 2.339s CPU time.
Feb 1 02:53:44 localhost podman[51141]: 2026-02-01 07:53:44.994581096 +0000 UTC m=+3.752156578 container died 077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, container_name=container-puppet-metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_puppet_step1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vendor=Red Hat, Inc.)
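The hiera.yaml warnings above carry their own remediation hint: Puppet wants the version 3 /etc/puppet/hiera.yaml converted to the version 5 layout, and hiera() calls replaced with lookup(). A minimal version 5 sketch follows; the hierarchy names and data paths are illustrative assumptions, not values taken from this deployment:

    # /etc/puppet/hiera.yaml -- hypothetical version 5 conversion
    version: 5
    defaults:
      datadir: data            # assumed data directory, relative to the environment
      data_hash: yaml_data     # plain YAML data files
    hierarchy:
      - name: "Per-node data"  # assumed layer name
        path: "nodes/%{trusted.certname}.yaml"
      - name: "Common data"
        path: "common.yaml"

Correspondingly, manifests that call hiera('some::key') would call lookup('some::key') instead, per the deprecated_language.html link the warning itself cites.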
Feb 1 02:53:44 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Feb 1 02:53:44 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Feb 1 02:53:44 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Feb 1 02:53:44 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Feb 1 02:53:45 localhost systemd[1]: tmp-crun.5oLub3.mount: Deactivated successfully.
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Feb 1 02:53:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518-userdata-shm.mount: Deactivated successfully.
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Feb 1 02:53:45 localhost podman[51771]: 2026-02-01 07:53:45.128824988 +0000 UTC m=+0.126746716 container cleanup 077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39'
Feb 1 02:53:45 localhost systemd[1]: libpod-conmon-077e2b9dea47fc01273abed6d75b379f81a2a98b4ee7aa2c4e8e11911c203518.scope: Deactivated successfully.
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Feb 1 02:53:45 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Feb 1 02:53:45 localhost puppet-user[51308]: Notice: Applied catalog in 0.26 seconds
Feb 1 02:53:45 localhost puppet-user[51308]: Application:
Feb 1 02:53:45 localhost puppet-user[51308]: Initial environment: production
Feb 1 02:53:45 localhost puppet-user[51308]: Converged environment: production
Feb 1 02:53:45 localhost puppet-user[51308]: Run mode: user
Feb 1 02:53:45 localhost puppet-user[51308]: Changes:
Feb 1 02:53:45 localhost puppet-user[51308]: Total: 43
Feb 1 02:53:45 localhost puppet-user[51308]: Events:
Feb 1 02:53:45 localhost puppet-user[51308]: Success: 43
Feb 1 02:53:45 localhost puppet-user[51308]: Total: 43
Feb 1 02:53:45 localhost puppet-user[51308]: Resources:
Feb 1 02:53:45 localhost puppet-user[51308]: Skipped: 14
Feb 1 02:53:45 localhost puppet-user[51308]: Changed: 38
Feb 1 02:53:45 localhost puppet-user[51308]: Out of sync: 38
Feb 1 02:53:45 localhost puppet-user[51308]: Total: 82
Feb 1 02:53:45 localhost puppet-user[51308]: Time:
Feb 1 02:53:45 localhost puppet-user[51308]: Concat file: 0.00
Feb 1 02:53:45 localhost puppet-user[51308]: Concat fragment: 0.00
Feb 1 02:53:45 localhost puppet-user[51308]: File: 0.15
Feb 1 02:53:45 localhost puppet-user[51308]: Transaction evaluation: 0.25
Feb 1 02:53:45 localhost puppet-user[51308]: Catalog application: 0.26
Feb 1 02:53:45 localhost puppet-user[51308]: Config retrieval: 0.49
Feb 1 02:53:45 localhost puppet-user[51308]: Last run: 1769932425
Feb 1 02:53:45 localhost puppet-user[51308]: Total: 0.26
Feb 1 02:53:45 localhost puppet-user[51308]: Version:
Feb 1 02:53:45 localhost puppet-user[51308]: Config: 1769932424
Feb 1 02:53:45 localhost puppet-user[51308]: Puppet: 7.10.0
Feb 1 02:53:45 localhost puppet-user[51292]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Feb 1 02:53:45 localhost systemd[1]: libpod-e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e.scope: Deactivated successfully.
Feb 1 02:53:45 localhost systemd[1]: libpod-e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e.scope: Consumed 2.310s CPU time.
Feb 1 02:53:45 localhost podman[51161]: 2026-02-01 07:53:45.293811452 +0000 UTC m=+4.021265411 container died e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 1 02:53:45 localhost puppet-user[51292]: Notice: Applied catalog in 0.62 seconds
Feb 1 02:53:45 localhost puppet-user[51292]: Application:
Feb 1 02:53:45 localhost puppet-user[51292]: Initial environment: production
Feb 1 02:53:45 localhost puppet-user[51292]: Converged environment: production
Feb 1 02:53:45 localhost puppet-user[51292]: Run mode: user
Feb 1 02:53:45 localhost puppet-user[51292]: Changes:
Feb 1 02:53:45 localhost puppet-user[51292]: Total: 4
Feb 1 02:53:45 localhost puppet-user[51292]: Events:
Feb 1 02:53:45 localhost puppet-user[51292]: Success: 4
Feb 1 02:53:45 localhost puppet-user[51292]: Total: 4
Feb 1 02:53:45 localhost puppet-user[51292]: Resources:
Feb 1 02:53:45 localhost puppet-user[51292]: Changed: 4
Feb 1 02:53:45 localhost puppet-user[51292]: Out of sync: 4
Feb 1 02:53:45 localhost puppet-user[51292]: Skipped: 8
Feb 1 02:53:45 localhost puppet-user[51292]: Total: 13
Feb 1 02:53:45 localhost puppet-user[51292]: Time:
Feb 1 02:53:45 localhost puppet-user[51292]: File: 0.00
Feb 1 02:53:45 localhost puppet-user[51292]: Exec: 0.07
Feb 1 02:53:45 localhost puppet-user[51292]: Config retrieval: 0.15
Feb 1 02:53:45 localhost puppet-user[51292]: Augeas: 0.47
Feb 1 02:53:45 localhost puppet-user[51292]: Transaction evaluation: 0.55
Feb 1 02:53:45 localhost puppet-user[51292]: Catalog application: 0.62
Feb 1 02:53:45 localhost puppet-user[51292]: Last run: 1769932425
Feb 1 02:53:45 localhost puppet-user[51292]: Total: 0.62
Feb 1 02:53:45 localhost puppet-user[51292]: Version:
Feb 1 02:53:45 localhost puppet-user[51292]: Config: 1769932424
Feb 1 02:53:45 localhost puppet-user[51292]: Puppet: 7.10.0
Feb 1 02:53:45 localhost systemd[1]: var-lib-containers-storage-overlay-4c7ec790140e1e90bc11292f42ed59a62edd99a1fd979365ce8afccf944e3222-merged.mount: Deactivated successfully.
Feb 1 02:53:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e-userdata-shm.mount: Deactivated successfully.
Feb 1 02:53:45 localhost systemd[1]: var-lib-containers-storage-overlay-f37f2c5aca5d6b7cf023a2735a5fd14989ee6decd2a2c44bb6d47fd78dfeef3e-merged.mount: Deactivated successfully.
Feb 1 02:53:45 localhost puppet-user[51320]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Feb 1 02:53:45 localhost podman[51838]: 2026-02-01 07:53:45.361706351 +0000 UTC m=+0.060205038 container cleanup e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, container_name=container-puppet-crond, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com)
Feb 1 02:53:45 localhost systemd[1]: libpod-conmon-e2b5ec44547f98b45022904c64027db7254eb077576f13e26d7c3430f91bd33e.scope: Deactivated successfully.
Feb 1 02:53:45 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 1 02:53:45 localhost podman[51908]: 2026-02-01 07:53:45.584819549 +0000 UTC m=+0.114425792 container create 85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, distribution-scope=public, release=1766032510, container_name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13)
Feb 1 02:53:45 localhost systemd[1]: Started libpod-conmon-85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80.scope.
Feb 1 02:53:45 localhost systemd[1]: Started libcrun container.
Feb 1 02:53:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c865b61b693adf7af1d8961d294b7f96decf325baf650c1221fd291eb417296d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 1 02:53:45 localhost systemd[1]: libpod-8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076.scope: Deactivated successfully.
Feb 1 02:53:45 localhost systemd[1]: libpod-8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076.scope: Consumed 2.851s CPU time.
Feb 1 02:53:45 localhost podman[51908]: 2026-02-01 07:53:45.641766456 +0000 UTC m=+0.171372699 container init 85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, container_name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 02:53:45 localhost podman[51908]: 2026-02-01 07:53:45.648818739 +0000 UTC m=+0.178424982 container start 85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-type=git, container_name=container-puppet-rsyslog, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z)
Feb 1 02:53:45 localhost podman[51908]: 2026-02-01 07:53:45.652097149 +0000 UTC m=+0.181703402 container attach 85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, config_id=tripleo_puppet_step1)
Feb 1 02:53:45 localhost podman[51908]: 2026-02-01 07:53:45.553939722 +0000 UTC m=+0.083545985 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 1 02:53:45 localhost systemd[1]: libpod-3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0.scope: Deactivated successfully.
Feb 1 02:53:45 localhost systemd[1]: libpod-3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0.scope: Consumed 2.776s CPU time.
Feb 1 02:53:45 localhost podman[51169]: 2026-02-01 07:53:45.723883646 +0000 UTC m=+4.433266847 container died 3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 1 02:53:45 localhost puppet-user[51407]: (file: /etc/puppet/hiera.yaml)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Undefined variable '::deploy_config_name';
Feb 1 02:53:45 localhost puppet-user[51407]: (file & line not available)
Feb 1 02:53:45 localhost podman[52090]: 2026-02-01 07:53:45.784695121 +0000 UTC m=+0.052238416 container cleanup 3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, container_name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true)
Feb 1 02:53:45 localhost podman[51120]: 2026-02-01 07:53:45.784894437 +0000 UTC m=+4.555259458 container died 8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, container_name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 1 02:53:45 localhost systemd[1]: libpod-conmon-3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0.scope: Deactivated successfully.
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 1 02:53:45 localhost puppet-user[51407]: (file & line not available)
Feb 1 02:53:45 localhost podman[52005]: 2026-02-01 07:53:45.82754837 +0000 UTC m=+0.185118205 container cleanup 8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=container-puppet-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_puppet_step1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid)
Feb 1 02:53:45 localhost systemd[1]: libpod-conmon-8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076.scope: Deactivated successfully.
Feb 1 02:53:45 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 1 02:53:45 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Feb 1 02:53:45 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Feb 1 02:53:45 localhost podman[52137]: 2026-02-01 07:53:45.980640234 +0000 UTC m=+0.074854512 container create d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64,
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 02:53:46 localhost systemd[1]: Started libpod-conmon-d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110.scope. Feb 1 02:53:46 localhost puppet-user[51320]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 1.50 seconds Feb 1 02:53:46 localhost systemd[1]: Started libcrun container. Feb 1 02:53:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/412ecd0722770303c70661441b3031022487ea82d3402d0d51ca72c2eaf9a882/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/412ecd0722770303c70661441b3031022487ea82d3402d0d51ca72c2eaf9a882/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Feb 1 02:53:46 localhost podman[52137]: 2026-02-01 07:53:46.039546341 +0000 UTC m=+0.133760629 container init d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team) Feb 1 02:53:46 localhost podman[52137]: 2026-02-01 07:53:45.941549518 +0000 UTC m=+0.035763806 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Feb 1 02:53:46 localhost puppet-user[51407]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Feb 1 02:53:46 localhost podman[52137]: 2026-02-01 07:53:46.105527952 +0000 UTC m=+0.199742240 container start d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, container_name=container-puppet-ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 02:53:46 localhost podman[52137]: 2026-02-01 07:53:46.105917364 +0000 UTC m=+0.200131652 container attach d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=container-puppet-ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com) Feb 1 02:53:46 localhost puppet-user[51407]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.46 seconds Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Feb 1 02:53:46 localhost systemd[1]: 
var-lib-containers-storage-overlay-9acaf9dad4414fbe24154776e6121c44ca4dc128df9f993be24501b6fc5f7e69-merged.mount: Deactivated successfully. Feb 1 02:53:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3681a01f710e2b1548cb16a140f850a07d763049c7a3d9a721b75b1b5cfcbaf0-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:46 localhost systemd[1]: var-lib-containers-storage-overlay-1802b2d0e385fe8def6f3d058db8d9dc0c862ef018584968ffe99962c492a423-merged.mount: Deactivated successfully. Feb 1 02:53:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8751d33291ca755d4934a52ab7a2e2fe1ea5ef6ef78ecd9cb1e1249d55aba076-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}41fd6c6f800884fc5582fcd6978c5fdf9efd895ea286512b024eb4dc5635dca8' Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Feb 1 02:53:46 localhost 
puppet-user[51320]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Feb 1 02:53:46 localhost puppet-user[51320]: Warning: Empty environment setting 'TLS_PASSWORD' Feb 1 02:53:46 localhost puppet-user[51320]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}ebc7fc3dcb9777cbffecb2db809cb7f56024c1a98bdd34554dbaaa8469bb0cdf' Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: 
/Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Feb 1 02:53:46 localhost puppet-user[51407]: Notice: Applied catalog in 0.58 seconds Feb 1 02:53:46 localhost puppet-user[51407]: Application: Feb 1 02:53:46 localhost puppet-user[51407]: Initial environment: production Feb 1 02:53:46 localhost puppet-user[51407]: Converged environment: production Feb 1 02:53:46 localhost puppet-user[51407]: Run mode: user Feb 1 02:53:46 localhost puppet-user[51407]: Changes: Feb 1 02:53:46 localhost puppet-user[51407]: Total: 31 Feb 1 02:53:46 localhost puppet-user[51407]: Events: Feb 1 02:53:46 localhost puppet-user[51407]: Success: 31 Feb 1 02:53:46 localhost puppet-user[51407]: Total: 31 Feb 1 02:53:46 localhost puppet-user[51407]: Resources: Feb 1 02:53:46 localhost puppet-user[51407]: Skipped: 22 Feb 1 02:53:46 localhost puppet-user[51407]: Changed: 31 Feb 1 02:53:46 localhost puppet-user[51407]: Out of sync: 31 Feb 1 02:53:46 localhost puppet-user[51407]: Total: 151 Feb 1 02:53:46 localhost puppet-user[51407]: Time: Feb 1 02:53:46 localhost puppet-user[51407]: Package: 0.03 Feb 1 02:53:46 localhost puppet-user[51407]: Ceilometer config: 0.40 Feb 1 02:53:46 localhost puppet-user[51407]: Transaction evaluation: 0.48 Feb 1 02:53:46 localhost puppet-user[51407]: Config retrieval: 0.53 Feb 1 02:53:46 localhost puppet-user[51407]: Catalog application: 0.58 Feb 1 02:53:46 localhost puppet-user[51407]: Last run: 1769932426 Feb 1 02:53:46 localhost puppet-user[51407]: Resources: 0.00 Feb 1 02:53:46 localhost puppet-user[51407]: Total: 0.59 Feb 1 02:53:46 localhost puppet-user[51407]: Version: Feb 1 02:53:46 localhost puppet-user[51407]: Config: 1769932425 Feb 1 
02:53:46 localhost puppet-user[51407]: Puppet: 7.10.0 Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Feb 1 02:53:46 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: 
/Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Feb 1 02:53:47 localhost systemd[1]: libpod-c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291.scope: Deactivated successfully. Feb 1 02:53:47 localhost systemd[1]: libpod-c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291.scope: Consumed 3.356s CPU time. Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Feb 1 02:53:47 localhost podman[52265]: 2026-02-01 07:53:47.454518259 +0000 UTC m=+0.055938668 container died c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2026-01-12T23:07:24Z, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, container_name=container-puppet-ceilometer, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:24Z, 
tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public) Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Feb 1 02:53:47 localhost systemd[1]: tmp-crun.QcDsgH.mount: Deactivated successfully. Feb 1 02:53:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:47 localhost systemd[1]: var-lib-containers-storage-overlay-81d35b92007b3f6ce5557fdf3066e145410fe8d50cd18a17188fe6e802a41d49-merged.mount: Deactivated successfully. Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Feb 1 02:53:47 localhost podman[52265]: 2026-02-01 07:53:47.514217919 +0000 UTC m=+0.115638278 container cleanup c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, com.redhat.component=openstack-ceilometer-central-container, build-date=2026-01-12T23:07:24Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:24Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-central, version=17.1.13, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, batch=17.1_20260112.1, container_name=container-puppet-ceilometer, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=) Feb 1 02:53:47 localhost systemd[1]: libpod-conmon-c440e2fbcf2042234b303c16180dae0676aa05703cfd4d76bfa4716143c2f291.scope: Deactivated successfully. Feb 1 02:53:47 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 1 02:53:47 localhost puppet-user[52095]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 02:53:47 localhost puppet-user[52095]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:47 localhost puppet-user[52095]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:47 localhost puppet-user[52095]: (file & line not available) Feb 1 02:53:47 localhost puppet-user[52095]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:47 localhost puppet-user[52095]: (file & line not available) Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Feb 1 02:53:47 localhost puppet-user[52095]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.28 seconds Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Feb 1 02:53:47 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Feb 1 02:53:48 localhost puppet-user[52204]: Warning: /etc/puppet/hiera.yaml: Use of 
'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 02:53:48 localhost puppet-user[52204]: (file: /etc/puppet/hiera.yaml) Feb 1 02:53:48 localhost puppet-user[52204]: Warning: Undefined variable '::deploy_config_name'; Feb 1 02:53:48 localhost puppet-user[52204]: (file & line not available) Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Feb 1 02:53:48 localhost puppet-user[52095]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Feb 1 02:53:48 localhost puppet-user[52204]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 02:53:48 localhost puppet-user[52204]: (file & line not available) Feb 1 02:53:48 localhost puppet-user[52095]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Feb 1 02:53:48 localhost puppet-user[52095]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}21a1acd4d1017a4f98f12e0f791e842cc40e4ab1c500b96a2778c09e3e83b63a' Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Feb 1 02:53:48 localhost puppet-user[52095]: Notice: Applied catalog in 0.13 seconds Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Feb 1 02:53:48 localhost puppet-user[52095]: Application: Feb 1 02:53:48 localhost puppet-user[52095]: Initial environment: production Feb 1 02:53:48 localhost puppet-user[52095]: Converged environment: production Feb 1 02:53:48 localhost puppet-user[52095]: Run mode: user Feb 1 02:53:48 localhost puppet-user[52095]: Changes: Feb 1 02:53:48 localhost puppet-user[52095]: Total: 3 Feb 1 02:53:48 localhost puppet-user[52095]: Events: Feb 1 02:53:48 localhost puppet-user[52095]: Success: 3 Feb 1 02:53:48 localhost puppet-user[52095]: Total: 3 Feb 1 02:53:48 localhost puppet-user[52095]: Resources: Feb 1 02:53:48 localhost puppet-user[52095]: Skipped: 11 Feb 1 02:53:48 localhost puppet-user[52095]: Changed: 3 Feb 1 02:53:48 localhost puppet-user[52095]: Out of sync: 3 Feb 1 02:53:48 localhost puppet-user[52095]: Total: 25 Feb 1 02:53:48 localhost puppet-user[52095]: Time: Feb 1 02:53:48 localhost puppet-user[52095]: Concat file: 0.00 Feb 1 02:53:48 localhost puppet-user[52095]: Concat fragment: 0.00 Feb 1 02:53:48 localhost puppet-user[52095]: File: 0.02 Feb 1 02:53:48 
localhost puppet-user[52095]: Transaction evaluation: 0.12 Feb 1 02:53:48 localhost puppet-user[52095]: Catalog application: 0.13 Feb 1 02:53:48 localhost puppet-user[52095]: Config retrieval: 0.33 Feb 1 02:53:48 localhost puppet-user[52095]: Last run: 1769932428 Feb 1 02:53:48 localhost puppet-user[52095]: Total: 0.13 Feb 1 02:53:48 localhost puppet-user[52095]: Version: Feb 1 02:53:48 localhost puppet-user[52095]: Config: 1769932427 Feb 1 02:53:48 localhost puppet-user[52095]: Puppet: 7.10.0 Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: 
/Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}f9d8b60f125f93c01d13e9bc67ee58f1fd06cc57ef5fbe63b5478e0790417593' Feb 1 02:53:48 localhost puppet-user[52204]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.35 seconds Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Feb 1 02:53:48 localhost systemd[1]: libpod-85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80.scope: Deactivated successfully. Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Feb 1 02:53:48 localhost systemd[1]: libpod-85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80.scope: Consumed 2.580s CPU time. Feb 1 02:53:48 localhost podman[51908]: 2026-02-01 07:53:48.383680031 +0000 UTC m=+2.913286364 container died 85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, 
batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, tcib_managed=true, container_name=container-puppet-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Feb 1 02:53:48 localhost ovs-vsctl[52452]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642 Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Feb 1 02:53:48 localhost ovs-vsctl[52457]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Feb 1 02:53:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:48 localhost systemd[1]: var-lib-containers-storage-overlay-c865b61b693adf7af1d8961d294b7f96decf325baf650c1221fd291eb417296d-merged.mount: Deactivated successfully. Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Feb 1 02:53:48 localhost ovs-vsctl[52460]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-ip=172.19.0.106 Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Feb 1 02:53:48 localhost podman[52445]: 2026-02-01 07:53:48.497871614 +0000 UTC m=+0.103854651 container cleanup 85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog) Feb 1 02:53:48 localhost systemd[1]: libpod-conmon-85dec7d8f6bf2a736168f1415d4048ab88fdec15cf6311f489ac719dcfbb0a80.scope: Deactivated successfully. 
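[Editor's note] Each Vs_config resource applied by the Ovn::Controller manifest shells out to ovs-vsctl, which is why every Vs_config notice above is paired with an ovs-vsctl INFO line recording the exact command. Below is a minimal read-back sketch — a hypothetical check script, not part of the deployment — that confirms the chassis registration keys landed; it assumes ovs-vsctl is on PATH and the local ovsdb-server is running.

    #!/usr/bin/env python3
    # Hypothetical read-back of the OVN chassis keys set in the events above.
    import subprocess

    KEYS = ["ovn-remote", "ovn-encap-type", "ovn-encap-ip"]

    def external_id(key: str) -> str:
        # `ovs-vsctl get Open_vSwitch . external_ids:KEY` prints the value in
        # double quotes; strip them for display.
        out = subprocess.run(
            ["ovs-vsctl", "get", "Open_vSwitch", ".", f"external_ids:{key}"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip().strip('"')

    for key in KEYS:
        print(f"{key}={external_id(key)}")

ovn-controller reads these external_ids from the local Open_vSwitch table, so a read-back after the puppet run is a quick way to confirm the chassis will register against the southbound endpoints logged above.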
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Feb 1 02:53:48 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Feb 1 02:53:48 localhost ovs-vsctl[52464]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:hostname=np0005604212.localdomain
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005604212.novalocal' to 'np0005604212.localdomain'
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52476]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52479]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52486]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52493]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52495]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52497]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch .
external_ids:ovn-encap-tos=0
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52499]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:af:15:58
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52501]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch .
external_ids:ovn-bridge-mappings=datacentre:br-ex
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52503]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice:
/Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Feb 1 02:53:48 localhost ovs-vsctl[52505]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Feb 1 02:53:48 localhost puppet-user[52204]: Notice: Applied catalog in 0.45 seconds
Feb 1 02:53:48 localhost puppet-user[52204]: Application:
Feb 1 02:53:48 localhost puppet-user[52204]: Initial environment: production
Feb 1 02:53:48 localhost puppet-user[52204]: Converged environment: production
Feb 1 02:53:48 localhost puppet-user[52204]: Run mode: user
Feb 1 02:53:48 localhost puppet-user[52204]: Changes:
Feb 1 02:53:48 localhost puppet-user[52204]: Total: 14
Feb 1 02:53:48 localhost puppet-user[52204]: Events:
Feb 1 02:53:48 localhost puppet-user[52204]: Success: 14
Feb 1 02:53:48 localhost puppet-user[52204]: Total: 14
Feb 1 02:53:48 localhost puppet-user[52204]: Resources:
Feb 1 02:53:48 localhost puppet-user[52204]: Skipped: 12
Feb 1 02:53:48 localhost puppet-user[52204]: Changed: 14
Feb 1 02:53:48 localhost puppet-user[52204]: Out of sync: 14
Feb 1 02:53:48 localhost puppet-user[52204]: Total: 29
Feb 1 02:53:48 localhost puppet-user[52204]: Time:
Feb 1 02:53:48 localhost puppet-user[52204]: Exec: 0.02
Feb 1 02:53:48 localhost puppet-user[52204]: Config retrieval: 0.38
Feb 1 02:53:48 localhost puppet-user[52204]: Vs config: 0.39
Feb 1 02:53:48 localhost puppet-user[52204]: Transaction evaluation: 0.45
Feb 1 02:53:48 localhost puppet-user[52204]: Catalog application: 0.45
Feb 1 02:53:48 localhost puppet-user[52204]: Last run: 1769932428
Feb 1 02:53:48 localhost puppet-user[52204]: Total: 0.45
Feb 1 02:53:48 localhost puppet-user[52204]: Version:
Feb 1 02:53:48 localhost puppet-user[52204]: Config: 1769932427
Feb 1 02:53:48 localhost puppet-user[52204]: Puppet: 7.10.0
Feb 1 02:53:49 localhost systemd[1]: libpod-d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110.scope: Deactivated successfully.
Feb 1 02:53:49 localhost systemd[1]: libpod-d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110.scope: Consumed 3.137s CPU time.
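[Editor's note] The indented block above is Puppet's standard run report, emitted one field per syslog event (Changes/Events/Resources/Time/Version). A small parsing sketch — hypothetical, assuming exactly the "puppet-user[PID]: Key: value" layout shown in this log — to collect the numeric fields per process:

    #!/usr/bin/env python3
    # Sketch: gather numeric puppet report fields from syslog lines, keyed by
    # PID. Later duplicate keys (e.g. the several "Total:" rows) overwrite
    # earlier ones in this naive version.
    import re
    import sys
    from collections import defaultdict

    FIELD = re.compile(r"puppet-user\[(\d+)\]:\s+([A-Za-z][A-Za-z ]*):\s+([0-9.]+)\s*$")

    reports = defaultdict(dict)  # pid -> {field: value}
    for line in sys.stdin:
        m = FIELD.search(line)
        if not m:
            continue
        pid, key, raw = m.groups()
        try:
            reports[pid][key] = float(raw)  # skips versions like "7.10.0"
        except ValueError:
            pass

    for pid, fields in sorted(reports.items()):
        print(pid, "Catalog application:", fields.get("Catalog application"))

Run against this section it would report, for example, 0.45 s for puppet-user[52204] (the ovn_controller catalog) and 4.67 s for puppet-user[51320] (the much larger nova_libvirt catalog).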
Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Feb 1 02:53:49 localhost podman[52544]: 2026-02-01 07:53:49.299934533 +0000 UTC m=+0.065221630 container died d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public) Feb 1 02:53:49 localhost systemd[1]: tmp-crun.YMpwg0.mount: Deactivated successfully. 
Feb 1 02:53:49 localhost podman[52544]: 2026-02-01 07:53:49.383024492 +0000 UTC m=+0.148311589 container cleanup d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, release=1766032510, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=container-puppet-ovn_controller, build-date=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 1 02:53:49 localhost systemd[1]: libpod-conmon-d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110.scope: Deactivated successfully. 
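[Editor's note] The podman "container create/died/cleanup" events above carry the container's full TripleO definition in a config_data label whose text is a Python dict literal. A sketch — using a shortened stand-in for the real label text — showing that ast.literal_eval() can rehydrate it for inspection:

    #!/usr/bin/env python3
    # Sketch: parse a podman config_data label back into a structure.
    # `label_text` is a shortened stand-in for the label seen in the events
    # above, not the full logged value.
    import ast

    label_text = (
        "{'security_opt': ['label=disable'], 'user': 0, 'detach': False, "
        "'net': ['host'], "
        "'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1'}"
    )

    cfg = ast.literal_eval(label_text)
    print(cfg["image"])  # registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
    print(cfg["net"])    # ['host']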
Feb 1 02:53:49 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 1 02:53:49 localhost systemd[1]: var-lib-containers-storage-overlay-412ecd0722770303c70661441b3031022487ea82d3402d0d51ca72c2eaf9a882-merged.mount: Deactivated successfully. 
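[Editor's note] In the ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG line above, "#012" is rsyslog's octal escape for an embedded newline, which is how the multi-line STEP_CONFIG value survives as a single log line. A sketch — with a shortened, quoted stand-in for the real invocation — that undoes the escape and tokenizes the logged command:

    #!/usr/bin/env python3
    # Sketch: decode rsyslog's "#012" (octal 012 = "\n") and tokenize the
    # reconstructed podman command. `logged` is a shortened stand-in.
    import shlex

    logged = (
        "podman run --name container-puppet-ovn_controller "
        "--env 'STEP_CONFIG=include ::tripleo::packages#012"
        "include tripleo::profile::base::neutron::agents::ovn#012'"
    )

    decoded = logged.replace("#012", "\n")
    argv = shlex.split(decoded)
    print(argv[:4])              # ['podman', 'run', '--name', 'container-puppet-ovn_controller']
    print(argv[-1].count("\n"))  # the STEP_CONFIG value is multi-line again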
Feb 1 02:53:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d655adb3407eb812adae6d20fdd650994ab6b9776b53cfb30a1db156846e7110-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Feb 1 02:53:49 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Feb 1 02:53:50 localhost puppet-user[51320]: Notice: 
/Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Feb 1 02:53:50 localhost puppet-user[51320]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4'
Feb 1 02:53:50 localhost puppet-user[51320]: Notice: Applied catalog in 4.67 seconds
Feb 1 02:53:50 localhost puppet-user[51320]: Application:
Feb 1 02:53:50 localhost puppet-user[51320]: Initial environment: production
Feb 1 02:53:50 localhost puppet-user[51320]: Converged environment: production
Feb 1 02:53:50 localhost puppet-user[51320]: Run mode: user
Feb 1 02:53:50 localhost puppet-user[51320]: Changes:
Feb 1 02:53:50 localhost puppet-user[51320]: Total: 183
Feb 1 02:53:50 localhost puppet-user[51320]: Events:
Feb 1 02:53:50 localhost puppet-user[51320]: Success: 183
Feb 1 02:53:50 localhost puppet-user[51320]: Total: 183
Feb 1 02:53:50 localhost puppet-user[51320]: Resources:
Feb 1 02:53:50 localhost puppet-user[51320]: Changed: 183
Feb 1 02:53:50 localhost puppet-user[51320]: Out of sync: 183
Feb 1 02:53:50 localhost puppet-user[51320]: Skipped: 57
Feb 1 02:53:50 localhost puppet-user[51320]: Total: 487
Feb 1 02:53:50 localhost puppet-user[51320]: Time:
Feb 1 02:53:50 localhost puppet-user[51320]: Concat file: 0.00
Feb 1 02:53:50 localhost puppet-user[51320]: Concat fragment: 0.00
Feb 1 02:53:50 localhost puppet-user[51320]: Anchor: 0.00
Feb 1 02:53:50 localhost puppet-user[51320]: File line: 0.00
Feb 1 02:53:50 localhost puppet-user[51320]: Virtlogd config: 0.00
Feb 1 02:53:50 localhost puppet-user[51320]: Virtstoraged config: 0.01
Feb 1 02:53:50 localhost puppet-user[51320]: Virtsecretd config: 0.01
Feb 1 02:53:50 localhost puppet-user[51320]: Virtqemud config: 0.01
Feb 1 02:53:50 localhost puppet-user[51320]: Virtnodedevd config: 0.02
Feb 1 02:53:50 localhost puppet-user[51320]: Virtproxyd config: 0.03
Feb 1 02:53:50 localhost puppet-user[51320]: Exec: 0.03
Feb 1 02:53:50 localhost puppet-user[51320]: Package: 0.03
Feb 1 02:53:50 localhost puppet-user[51320]: File: 0.05
Feb 1 02:53:50 localhost puppet-user[51320]: Augeas: 0.94
Feb 1 02:53:50 localhost puppet-user[51320]: Config retrieval: 1.80
Feb 1 02:53:50 localhost puppet-user[51320]: Last run: 1769932430
Feb 1 02:53:50 localhost puppet-user[51320]: Nova config: 3.31
Feb 1 02:53:50 localhost puppet-user[51320]: Transaction evaluation: 4.65
Feb 1 02:53:50 localhost puppet-user[51320]: Catalog application: 4.67
Feb 1 02:53:50 localhost puppet-user[51320]: Resources: 0.00
Feb 1 02:53:50 localhost puppet-user[51320]: Total: 4.67
Feb 1 02:53:50 localhost puppet-user[51320]: Version:
Feb 1 02:53:50 localhost puppet-user[51320]: Config: 1769932424
Feb 1 02:53:50 localhost puppet-user[51320]: Puppet: 7.10.0
Feb 1 02:53:51 localhost systemd[1]:
libpod-1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25.scope: Deactivated successfully. Feb 1 02:53:51 localhost systemd[1]: libpod-1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25.scope: Consumed 9.219s CPU time. Feb 1 02:53:51 localhost podman[51176]: 2026-02-01 07:53:51.94497806 +0000 UTC m=+10.652637899 container died 1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, container_name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, release=1766032510, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team) Feb 1 02:53:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:52 localhost systemd[1]: var-lib-containers-storage-overlay-f95e8251fb179e771f128b38ef4d608f2b943f904632add746cd54aaa3444e8d-merged.mount: Deactivated successfully. Feb 1 02:53:52 localhost podman[52655]: 2026-02-01 07:53:52.938892567 +0000 UTC m=+0.985409800 container cleanup 1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, architecture=x86_64, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 1 02:53:52 localhost systemd[1]: libpod-conmon-1614df26c90b4b5becdfed5aaf95d06822da59d3916f0992ce7a1dcd44250e25.scope: Deactivated successfully. Feb 1 02:53:52 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 02:53:52 localhost podman[52205]: 2026-02-01 07:53:46.148003221 +0000 UTC m=+0.033628961 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Feb 1 02:53:53 localhost podman[52717]: 2026-02-01 07:53:53.181030301 +0000 UTC m=+0.080982848 container create 219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.created=2026-01-12T22:57:35Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-server, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=container-puppet-neutron) Feb 1 02:53:53 localhost systemd[1]: Started libpod-conmon-219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f.scope. Feb 1 02:53:53 localhost systemd[1]: Started libcrun container. Feb 1 02:53:53 localhost podman[52717]: 2026-02-01 07:53:53.137829001 +0000 UTC m=+0.037781538 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Feb 1 02:53:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28b6472dd819f429aafaa31148f5831e61b514c6fccac594b3d504524c4fb8a5/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 1 02:53:53 localhost podman[52717]: 2026-02-01 07:53:53.253711445 +0000 UTC m=+0.153664002 container init 219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:57:35Z, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-server, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, build-date=2026-01-12T22:57:35Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510) Feb 1 02:53:53 localhost podman[52717]: 2026-02-01 07:53:53.264056849 +0000 UTC m=+0.164009406 container start 219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:57:35Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-neutron-server, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-server-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:57:35Z, container_name=container-puppet-neutron, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 1 02:53:53 localhost podman[52717]: 2026-02-01 07:53:53.264385269 +0000 UTC m=+0.164337866 container attach 219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, name=rhosp-rhel9/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:57:35Z, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, 
build-date=2026-01-12T22:57:35Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1)
Feb 1 02:53:55 localhost puppet-user[52748]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Feb 1 02:53:55 localhost puppet-user[52748]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 1 02:53:55 localhost puppet-user[52748]: (file: /etc/puppet/hiera.yaml)
Feb 1 02:53:55 localhost puppet-user[52748]: Warning: Undefined variable '::deploy_config_name';
Feb 1 02:53:55 localhost puppet-user[52748]: (file & line not available)
Feb 1 02:53:55 localhost puppet-user[52748]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 1 02:53:55 localhost puppet-user[52748]: (file & line not available)
Feb 1 02:53:55 localhost puppet-user[52748]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.65 seconds
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Feb 1 02:53:56 localhost puppet-user[52748]: Notice: Applied catalog in 0.53 seconds
Feb 1 02:53:56 localhost puppet-user[52748]: Application:
Feb 1 02:53:56 localhost puppet-user[52748]: Initial environment: production
Feb 1 02:53:56 localhost puppet-user[52748]: Converged environment: production
Feb 1 02:53:56 localhost puppet-user[52748]: Run mode: user
Feb 1 02:53:56 localhost puppet-user[52748]: Changes:
Feb 1 02:53:56 localhost puppet-user[52748]: Total: 33
Feb 1 02:53:56 localhost puppet-user[52748]: Events:
Feb 1 02:53:56 localhost puppet-user[52748]: Success: 33
Feb 1 02:53:56 localhost puppet-user[52748]: Total: 33
Feb 1 02:53:56 localhost puppet-user[52748]: Resources:
Feb 1 02:53:56 localhost puppet-user[52748]: Skipped: 21
Feb 1 02:53:56 localhost puppet-user[52748]: Changed: 33
Feb 1 02:53:56 localhost puppet-user[52748]: Out of sync: 33
Feb 1 02:53:56 localhost puppet-user[52748]: Total: 155
Feb 1 02:53:56 localhost puppet-user[52748]: Time:
Feb 1 02:53:56 localhost puppet-user[52748]: Resources: 0.00
Feb 1 02:53:56 localhost puppet-user[52748]: Ovn metadata agent config: 0.02
Feb 1 02:53:56 localhost puppet-user[52748]: Neutron config: 0.44
Feb 1 02:53:56 localhost puppet-user[52748]: Transaction evaluation: 0.52
Feb 1 02:53:56 localhost puppet-user[52748]: Catalog application: 0.53
Feb 1 02:53:56 localhost puppet-user[52748]: Config retrieval: 0.73
Feb 1 02:53:56 localhost puppet-user[52748]: Last run: 1769932436
Feb 1 02:53:56 localhost puppet-user[52748]: Total: 0.53
Feb 1 02:53:56 localhost puppet-user[52748]: Version:
Feb 1 02:53:56 localhost puppet-user[52748]: Config: 1769932435
Feb 1 02:53:56 localhost puppet-user[52748]: Puppet: 7.10.0
Feb 1 02:53:57 localhost systemd[1]: libpod-219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f.scope: Deactivated successfully.
Feb 1 02:53:57 localhost systemd[1]: libpod-219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f.scope: Consumed 3.839s CPU time.
Feb 1 02:53:57 localhost podman[52717]: 2026-02-01 07:53:57.182406778 +0000 UTC m=+4.082359305 container died 219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:57:35Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro',
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server) Feb 1 02:53:57 localhost systemd[1]: tmp-crun.OGumif.mount: Deactivated successfully. Feb 1 02:53:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f-userdata-shm.mount: Deactivated successfully. Feb 1 02:53:57 localhost systemd[1]: var-lib-containers-storage-overlay-28b6472dd819f429aafaa31148f5831e61b514c6fccac594b3d504524c4fb8a5-merged.mount: Deactivated successfully. Feb 1 02:53:57 localhost podman[52860]: 2026-02-01 07:53:57.370941266 +0000 UTC m=+0.174827394 container cleanup 219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-server-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-neutron, org.opencontainers.image.created=2026-01-12T22:57:35Z, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp-rhel9/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2026-01-12T22:57:35Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 1 02:53:57 localhost systemd[1]: libpod-conmon-219a4bbb0e978c44e6bc86e26febc4c66a42160abfe12751ef825218c5405c1f.scope: Deactivated successfully. Feb 1 02:53:57 localhost python3[50955]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604212 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604212', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 1 02:53:58 localhost python3[52914]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:53:59 localhost python3[52946]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 02:53:59 localhost python3[52996]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:54:00 localhost python3[53039]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932439.4741616-84194-81765220579721/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:54:00 localhost python3[53101]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:54:00 localhost python3[53144]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932440.2220495-84194-213949481562842/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:54:01 localhost python3[53208]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:54:01 localhost python3[53251]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932441.1370018-84248-168181227065298/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:54:02 localhost python3[53313]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:54:02 localhost python3[53356]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932442.1111171-84279-47140110038089/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:54:04 localhost python3[53386]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:54:04 localhost systemd[1]: Reloading.
Feb 1 02:54:04 localhost systemd-rc-local-generator[53408]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:54:04 localhost systemd-sysv-generator[53412]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:54:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:54:04 localhost systemd[1]: Reloading.
Feb 1 02:54:04 localhost systemd-sysv-generator[53449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:54:04 localhost systemd-rc-local-generator[53445]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:54:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:54:04 localhost systemd[1]: Starting TripleO Container Shutdown...
Feb 1 02:54:04 localhost systemd[1]: Finished TripleO Container Shutdown.
Feb 1 02:54:05 localhost python3[53509]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:54:05 localhost python3[53552]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932444.844882-84342-52598709297967/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:54:06 localhost python3[53614]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 02:54:06 localhost python3[53657]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932445.781777-84356-217903422456602/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 02:54:07 localhost python3[53687]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 02:54:07 localhost systemd[1]: Reloading.
Feb 1 02:54:07 localhost systemd-rc-local-generator[53713]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:54:07 localhost systemd-sysv-generator[53718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:54:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:54:07 localhost systemd[1]: Reloading.
Feb 1 02:54:07 localhost systemd-rc-local-generator[53750]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 02:54:07 localhost systemd-sysv-generator[53754]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 02:54:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 02:54:07 localhost systemd[1]: Starting Create netns directory...
Feb 1 02:54:07 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 02:54:07 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 02:54:07 localhost systemd[1]: Finished Create netns directory.
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 60dc9caaeb1b9ec4a6ba094f0fd24dbd
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: a46ef4c25933bba0e125120095b56cb6
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 2de9c6a2ee669114248af0484a5abc8a
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 56f18c3ee04e8cd5761527c0820290d2
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 56f18c3ee04e8cd5761527c0820290d2
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 4dacb3799b36b0da29dc6587bf4940e2
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:08 localhost python3[53781]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 9ec539c069b98a16ced7663e9b12641d
Feb 1 02:54:09 localhost python3[53839]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1
config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 1 02:54:09 localhost podman[53880]: 2026-02-01 07:54:09.882246182 +0000 UTC m=+0.075166171 container create 959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 1 02:54:09 localhost systemd[1]: Started libpod-conmon-959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d.scope. Feb 1 02:54:09 localhost systemd[1]: Started libcrun container. 
Feb 1 02:54:09 localhost podman[53880]: 2026-02-01 07:54:09.845355143 +0000 UTC m=+0.038275142 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 1 02:54:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e8785609ca774d4a308ace660b858cc96013453cf69257a80c4106879c3cc42/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 1 02:54:09 localhost podman[53880]: 2026-02-01 07:54:09.96264462 +0000 UTC m=+0.155564619 container init 959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 02:54:09 localhost podman[53880]: 2026-02-01 07:54:09.973151268 +0000 UTC m=+0.166071257 container start 959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, 
name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z) Feb 1 02:54:09 localhost podman[53880]: 2026-02-01 07:54:09.974231562 +0000 UTC m=+0.167151641 container attach 959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 02:54:09 localhost systemd[1]: libpod-959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d.scope: Deactivated successfully. 
Feb 1 02:54:09 localhost podman[53880]: 2026-02-01 07:54:09.982185823 +0000 UTC m=+0.175105832 container died 959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 02:54:10 localhost podman[53900]: 2026-02-01 07:54:10.077822884 +0000 UTC m=+0.081465352 container cleanup 959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 1 02:54:10 localhost systemd[1]: libpod-conmon-959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d.scope: Deactivated successfully. Feb 1 02:54:10 localhost python3[53839]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Feb 1 02:54:10 localhost podman[53975]: 2026-02-01 07:54:10.558257686 +0000 UTC m=+0.084739711 container create 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, container_name=metrics_qdr) Feb 1 02:54:10 localhost systemd[1]: Started libpod-conmon-5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.scope. Feb 1 02:54:10 localhost systemd[1]: Started libcrun container. Feb 1 02:54:10 localhost podman[53975]: 2026-02-01 07:54:10.52147192 +0000 UTC m=+0.047953925 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 1 02:54:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38608c39b920110534238513fb52c11220d545a8d3383f5c9cf411254a119b53/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 1 02:54:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38608c39b920110534238513fb52c11220d545a8d3383f5c9cf411254a119b53/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 1 02:54:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 02:54:10 localhost podman[53975]: 2026-02-01 07:54:10.655475435 +0000 UTC m=+0.181957520 container init 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 02:54:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 02:54:10 localhost podman[53975]: 2026-02-01 07:54:10.695693334 +0000 UTC m=+0.222175369 container start 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 02:54:10 localhost python3[53839]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env 
TRIPLEO_CONFIG_HASH=60dc9caaeb1b9ec4a6ba094f0fd24dbd --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 1 02:54:10 localhost podman[53997]: 2026-02-01 07:54:10.826892514 +0000 UTC m=+0.118661720 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step1) Feb 1 02:54:10 localhost systemd[1]: var-lib-containers-storage-overlay-6e8785609ca774d4a308ace660b858cc96013453cf69257a80c4106879c3cc42-merged.mount: Deactivated successfully. Feb 1 02:54:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-959c05f358dfd7bdb065aeb88c7d6cf85123b6db118b64900492144dd018a43d-userdata-shm.mount: Deactivated successfully. 
Feb 1 02:54:11 localhost podman[53997]: 2026-02-01 07:54:11.044660249 +0000 UTC m=+0.336429435 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-type=git) Feb 1 02:54:11 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 02:54:11 localhost python3[54067]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:54:11 localhost python3[54083]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 02:54:12 localhost python3[54144]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932451.7675085-84560-176408271599986/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:54:12 localhost python3[54160]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 02:54:12 localhost systemd[1]: Reloading. Feb 1 02:54:12 localhost systemd-sysv-generator[54191]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:54:12 localhost systemd-rc-local-generator[54186]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:54:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:54:13 localhost python3[54212]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 02:54:13 localhost systemd[1]: Reloading. Feb 1 02:54:13 localhost systemd-sysv-generator[54245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 02:54:13 localhost systemd-rc-local-generator[54241]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 02:54:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 02:54:14 localhost systemd[1]: Starting metrics_qdr container... Feb 1 02:54:14 localhost systemd[1]: Started metrics_qdr container. 
Feb 1 02:54:14 localhost python3[54293]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:54:16 localhost python3[54414]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005604212 step=1 update_config_hash_only=False Feb 1 02:54:16 localhost python3[54430]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:54:16 localhost python3[54446]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 1 02:54:35 localhost sshd[54447]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:54:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 02:54:41 localhost systemd[1]: tmp-crun.6rf2TU.mount: Deactivated successfully. Feb 1 02:54:41 localhost podman[54449]: 2026-02-01 07:54:41.739448361 +0000 UTC m=+0.093868029 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, 
build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 02:54:41 localhost podman[54449]: 2026-02-01 07:54:41.989153785 +0000 UTC m=+0.343573383 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 02:54:42 localhost systemd[1]: 
5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:55:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 02:55:12 localhost podman[54556]: 2026-02-01 07:55:12.728795507 +0000 UTC m=+0.085855046 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 02:55:12 localhost podman[54556]: 2026-02-01 07:55:12.920197252 +0000 UTC m=+0.277256761 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, 
container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git) Feb 1 02:55:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:55:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:55:43 localhost podman[54586]: 2026-02-01 07:55:43.726462534 +0000 UTC m=+0.087077502 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Feb 1 02:55:43 localhost podman[54586]: 2026-02-01 07:55:43.932099891 +0000 UTC m=+0.292714889 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 02:55:43 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:56:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:56:14 localhost podman[54692]: 2026-02-01 07:56:14.733048082 +0000 UTC m=+0.085001239 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 02:56:14 localhost podman[54692]: 2026-02-01 07:56:14.913383312 +0000 UTC m=+0.265336449 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 02:56:14 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:56:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 02:56:45 localhost systemd[1]: tmp-crun.40RC1z.mount: Deactivated successfully. 
Feb 1 02:56:45 localhost podman[54735]: 2026-02-01 07:56:45.317944139 +0000 UTC m=+0.103023746 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1766032510, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 1 02:56:45 localhost podman[54735]: 2026-02-01 07:56:45.516547033 +0000 UTC m=+0.301626620 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 1 02:56:45 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:57:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:57:15 localhost podman[54826]: 2026-02-01 07:57:15.698739078 +0000 UTC m=+0.064968002 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, config_id=tripleo_step1, managed_by=tripleo_ansible) Feb 1 02:57:15 localhost podman[54826]: 2026-02-01 07:57:15.89301624 +0000 UTC m=+0.259245174 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 02:57:15 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:57:19 localhost sshd[54855]: main: sshd: ssh-rsa algorithm is disabled Feb 1 02:57:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:57:46 localhost podman[54857]: 2026-02-01 07:57:46.7107714 +0000 UTC m=+0.075756979 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 02:57:46 localhost podman[54857]: 2026-02-01 07:57:46.934371602 +0000 UTC m=+0.299357181 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, 
build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1) Feb 1 02:57:46 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:58:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:58:17 localhost podman[54961]: 2026-02-01 07:58:17.718912861 +0000 UTC m=+0.081333494 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com) Feb 1 02:58:17 localhost podman[54961]: 2026-02-01 07:58:17.919196091 +0000 UTC m=+0.281616674 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container) Feb 1 02:58:17 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 02:58:40 localhost ceph-osd[31431]: osd.1 pg_epoch: 19 pg[2.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2,1,3] r=1 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:58:40 localhost ceph-osd[31431]: osd.1 pg_epoch: 20 pg[3.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [1,2,0] r=0 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:58:41 localhost ceph-osd[31431]: osd.1 pg_epoch: 21 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [1,2,0] r=0 lpr=20 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:58:43 localhost ceph-osd[31431]: osd.1 pg_epoch: 22 pg[4.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [3,5,1] r=2 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:58:44 localhost ceph-osd[32376]: osd.4 pg_epoch: 24 pg[5.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [4,3,2] r=0 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:58:45 localhost ceph-osd[32376]: osd.4 pg_epoch: 25 pg[5.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [4,3,2] r=0 lpr=24 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:58:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:58:48 localhost podman[55002]: 2026-02-01 07:58:48.293267823 +0000 UTC m=+0.097364131 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Feb 1 02:58:48 localhost podman[55002]: 2026-02-01 07:58:48.472426841 +0000 UTC m=+0.276523189 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 02:58:48 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 02:59:01 localhost ceph-osd[31431]: osd.1 pg_epoch: 30 pg[6.0( empty local-lis/les=0/0 n=0 ec=30/30 lis/c=0/0 les/c/f=0/0/0 sis=30) [0,5,1] r=2 lpr=30 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 1 02:59:04 localhost ceph-osd[31431]: osd.1 pg_epoch: 32 pg[7.0( empty local-lis/les=0/0 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [5,1,3] r=1 lpr=32 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 1 02:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 02:59:18 localhost systemd[1]: tmp-crun.i24ebQ.mount: Deactivated successfully.
Feb 1 02:59:18 localhost podman[55139]: 2026-02-01 07:59:18.738766253 +0000 UTC m=+0.095291069 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 02:59:18 localhost podman[55139]: 2026-02-01 07:59:18.970589762 +0000 UTC m=+0.327114608 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 1 02:59:18 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 02:59:19 localhost ceph-osd[31431]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36 pruub=9.097184181s) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active pruub 1122.583129883s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,3], acting [2,1,3] -> [2,1,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:19 localhost ceph-osd[31431]: osd.1 pg_epoch: 36 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36 pruub=9.093715668s) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1122.583129883s@ mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.18( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.17( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.16( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.15( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.14( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.11( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.12( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.10( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.e( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.13( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.f( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.d( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.c( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.b( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.a( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.9( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.4( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.6( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.5( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.2( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.3( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.7( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.19( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.8( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1b( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1a( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1d( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1e( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1c( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:20 localhost ceph-osd[31431]: osd.1 pg_epoch: 37 pg[2.1f( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=1 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:21 localhost ceph-osd[31431]: osd.1 pg_epoch: 38 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=8.297445297s) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active pruub 1123.801147461s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,0], acting [1,2,0] -> [1,2,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:21 localhost ceph-osd[31431]: osd.1 pg_epoch: 38 pg[4.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38 pruub=10.085723877s) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active pruub 1125.589965820s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,1], acting [3,5,1] -> [3,5,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:21 localhost ceph-osd[31431]: osd.1 pg_epoch: 38 pg[4.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38 pruub=10.082015038s) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1125.589965820s@ mbc={}] state: transitioning to Stray
Feb 1 02:59:21 localhost ceph-osd[31431]: osd.1 pg_epoch: 38 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=8.297445297s) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.801147461s@ mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.16( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.17( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.14( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.15( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.12( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.13( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.10( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.11( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1e( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.19( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.9( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.17( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.14( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.15( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.12( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.13( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.b( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.10( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.11( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.8( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.16( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.a( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.d( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.f( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.6( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.8( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.7( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.2( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.7( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.5( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.5( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.3( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.c( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.4( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.e( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.4( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.3( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.2( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.9( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1f( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.18( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1c( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.6( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1d( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1a( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.1b( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.18( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[4.19( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=2 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.0( empty local-lis/les=38/39 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.15( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:22 localhost ceph-osd[31431]: osd.1 pg_epoch: 39 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=0 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:23 localhost ceph-osd[31431]: osd.1 pg_epoch: 40 pg[6.0( empty local-lis/les=30/31 n=0 ec=30/30 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=10.156845093s) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active pruub 1127.687133789s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,1], acting [0,5,1] -> [0,5,1], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:23 localhost ceph-osd[32376]: osd.4 pg_epoch: 40 pg[5.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=10.057350159s) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 1123.616210938s@ mbc={}] start_peering_interval up [4,3,2] -> [4,3,2], acting [4,3,2] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:23 localhost ceph-osd[31431]: osd.1 pg_epoch: 40 pg[6.0( empty local-lis/les=30/31 n=0 ec=30/30 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=10.153734207s) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.687133789s@ mbc={}] state: transitioning to Stray
Feb 1 02:59:23 localhost ceph-osd[32376]: osd.4 pg_epoch: 40 pg[5.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=10.057350159s) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1123.616210938s@ mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.15( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.11( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.12( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.10( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1c( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.17( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.16( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.15( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.14( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.b( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.13( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.f( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.a( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.8( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.4( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.d( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.9( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.2( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.5( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.7( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.6( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.3( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.c( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1d( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1e( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.18( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1f( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.19( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1b( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.e( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[31431]: osd.1 pg_epoch: 41 pg[6.1a( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=2 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0
mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 
pg_epoch: 41 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:24 localhost ceph-osd[32376]: osd.4 pg_epoch: 41 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=0 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:25 localhost ceph-osd[31431]: osd.1 pg_epoch: 42 pg[7.0( v 34'39 (0'0,34'39] local-lis/les=32/33 n=22 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=10.782591820s) [5,1,3] r=1 lpr=42 pi=[32,42)/1 luod=0'0 lua=34'37 crt=34'39 lcod 34'38 mlcod 0'0 active pruub 1130.398071289s@ mbc={}] start_peering_interval up [5,1,3] -> [5,1,3], acting [5,1,3] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:25 localhost ceph-osd[31431]: osd.1 pg_epoch: 42 pg[7.0( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=10.780807495s) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 lcod 34'38 mlcod 0'0 unknown NOTIFY pruub 1130.398071289s@ mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.0 scrub starts Feb 1 02:59:26 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.0 scrub ok Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.d( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.2( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.7( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 
unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.6( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.3( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.4( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.5( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.c( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.f( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.e( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.8( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.9( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.a( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:26 localhost ceph-osd[31431]: osd.1 pg_epoch: 43 pg[7.b( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=1 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 1 02:59:27 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.1a deep-scrub starts Feb 1 02:59:27 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.1a deep-scrub ok Feb 1 02:59:28 localhost ceph-osd[31431]: log_channel(cluster) log 
[DBG] : 3.1b scrub starts Feb 1 02:59:28 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.1b scrub ok Feb 1 02:59:31 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.0 scrub starts Feb 1 02:59:31 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.0 scrub ok Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.12( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,3,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.742973328s) [4,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620849609s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.742973328s) [4,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.620849609s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743882179s) [2,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.621948242s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743797302s) [2,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.621948242s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.742934227s) [2,4,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.621459961s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,3], acting [4,3,2] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.742839813s) [2,4,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.621459961s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.18( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,2,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.742566109s) [0,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.621704102s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,4], acting [4,3,2] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 
les/c/f=39/39/0 sis=44 pruub=13.691240311s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542236328s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,4], acting [1,2,0] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691153526s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542236328s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.661355019s) [1,0,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.512573242s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.661355019s) [1,0,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.512573242s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.698763847s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550170898s@ mbc={}] start_peering_interval up [1,2,0] -> [2,4,0], acting [1,2,0] -> [2,4,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.12( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.742480278s) [0,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.621704102s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.698655128s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550170898s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738789558s) [3,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618164062s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738746643s) [3,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.618164062s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739498138s) [2,4,0] r=1 lpr=44 
pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.619140625s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739463806s) [2,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.619140625s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739841461s) [5,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.619750977s@ mbc={}] start_peering_interval up [4,3,2] -> [5,4,0], acting [4,3,2] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739767075s) [5,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.619750977s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663545609s) [2,4,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.515869141s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,0], acting [2,1,3] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739418983s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.619628906s@ mbc={}] start_peering_interval up [4,3,2] -> [0,1,2], acting [4,3,2] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739386559s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.619628906s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663496971s) [2,4,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.515869141s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663485527s) [1,5,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.515991211s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663485527s) [1,5,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.515991211s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 
pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.688691139s) [2,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.541381836s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,1], acting [1,2,0] -> [2,3,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738544464s) [5,3,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618896484s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,4], acting [4,3,2] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.688628197s) [2,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.541381836s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.8( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738512993s) [5,3,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.618896484s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737252235s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.617797852s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.686885834s) [2,1,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540283203s@ mbc={}] start_peering_interval up [1,2,0] -> [2,1,0], acting [1,2,0] -> [2,1,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663592339s) [1,5,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517089844s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737221718s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.617797852s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663592339s) [1,5,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.517089844s@ mbc={}] state: 
transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.687005043s) [1,5,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540649414s@ mbc={}] start_peering_interval up [1,2,0] -> [1,5,0], acting [1,2,0] -> [1,5,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.687005043s) [1,5,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.540649414s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738592148s) [3,5,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.619262695s@ mbc={}] start_peering_interval up [4,3,2] -> [3,5,4], acting [4,3,2] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738560677s) [3,5,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.619262695s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739671707s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620605469s@ mbc={}] start_peering_interval up [4,3,2] -> [1,0,5], acting [4,3,2] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739607811s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.620605469s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.662919044s) [1,5,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517211914s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,0], acting [2,1,3] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.662919044s) [1,5,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.517211914s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738235474s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.619262695s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,1], acting [4,3,2] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 
pruub=15.738201141s) [0,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.619262695s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.686791420s) [2,1,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540283203s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663281441s) [1,3,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518188477s@ mbc={}] start_peering_interval up [2,1,3] -> [1,3,2], acting [2,1,3] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663281441s) [1,3,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.518188477s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736407280s) [2,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.617797852s@ mbc={}] start_peering_interval up [4,3,2] -> [2,1,3], acting [4,3,2] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736376762s) [2,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.617797852s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.4( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,3,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740015984s) [2,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.621582031s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736759186s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618408203s@ mbc={}] start_peering_interval up [4,3,2] -> [5,0,1], acting [4,3,2] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736974716s) [2,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618652344s@ mbc={}] start_peering_interval up [4,3,2] -> [2,3,1], acting [4,3,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 
pg[5.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736689568s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.618408203s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736915588s) [2,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.618652344s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737171173s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.619018555s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737057686s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.619018555s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.660619736s) [4,3,2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.516723633s@ mbc={}] start_peering_interval up [2,1,3] -> [4,3,2], acting [2,1,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738121986s) [4,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620117188s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.660544395s) [4,3,2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.516723633s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738121986s) [4,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.620117188s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738368988s) [1,3,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620483398s@ mbc={}] start_peering_interval up [4,3,2] -> [1,3,5], acting [4,3,2] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.662909508s) [0,2,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.519287109s@ mbc={}] start_peering_interval up [2,1,3] -> [0,2,4], acting [2,1,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> 
-1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.11( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.662739754s) [0,2,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.519287109s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738303185s) [1,3,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.620483398s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739311218s) [2,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.621582031s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.f( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,2,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.10( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,0,5] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.662454605s) [3,1,5] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.519531250s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738745689s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.621948242s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738699913s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.621948242s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734536171s) [4,0,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618041992s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,2], acting [4,3,2] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.9( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 
localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734536171s) [4,0,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.618041992s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.8( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.1b( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734106064s) [2,0,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618530273s@ mbc={}] start_peering_interval up [4,3,2] -> [2,0,4], acting [4,3,2] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734065056s) [2,0,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.618530273s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.1c( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,2,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.733152390s) [5,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.618286133s@ mbc={}] start_peering_interval up [4,3,2] -> [5,1,3], acting [4,3,2] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738267899s) [1,2,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [1,2,0], acting [0,5,1] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738267899s) [1,2,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.595703125s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.662367821s) [3,1,5] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.519531250s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693284988s) [0,5,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.551391602s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,4], acting [1,2,0] -> [0,5,4], acting_primary 1 -> 0, 
up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693203926s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.551635742s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.18( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.684130669s) [2,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542480469s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.685333252s) [1,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543823242s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.18( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.684045792s) [2,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542480469s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693230629s) [0,5,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.551391602s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.685333252s) [1,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.543823242s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743325233s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.602172852s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.733088493s) [5,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.618286133s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735277176s) [3,4,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620727539s@ mbc={}] start_peering_interval up [4,3,2] -> [3,4,5], acting [4,3,2] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 
pg[6.1a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743240356s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.602172852s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735213280s) [3,4,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.620727539s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691624641s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550781250s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692654610s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.551635742s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735535622s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.621337891s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691474915s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550781250s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.688693047s) [2,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548217773s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735500336s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.621337891s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.688640594s) [2,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.548217773s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.19( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736072540s) [5,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,1], acting [0,5,1] -> [5,3,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.660343170s) [4,2,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.520019531s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734414101s) [4,5,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620239258s@ mbc={}] start_peering_interval up [4,3,2] -> [4,5,0], acting [4,3,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691600800s) [1,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.551391602s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,3], acting [1,2,0] -> [1,2,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.659190178s) [4,5,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.519042969s@ mbc={}] start_peering_interval up [2,1,3] -> [4,5,0], acting [2,1,3] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691600800s) [1,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.551391602s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.659079552s) [4,5,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.519042969s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734414101s) [4,5,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.620239258s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.19( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736010551s) [5,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595703125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.686989784s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.547241211s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,4], acting [3,5,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.18( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735318184s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [0,2,4], acting [0,5,1] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.18( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735267639s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595703125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.686934471s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.547241211s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734983444s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.620971680s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.660288811s) [4,2,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.520019531s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.687771797s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548583984s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.661761284s) [1,2,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.522216797s@ mbc={}] start_peering_interval up [2,1,3] -> [1,2,3], acting [2,1,3] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679389954s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540283203s@ mbc={}] start_peering_interval up [1,2,0] -> [4,3,2], acting [1,2,0] -> [4,3,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736062050s) [3,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.622192383s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.661761284s) [1,2,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.522216797s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679179192s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540283203s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.687470436s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.548583984s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734290123s) [4,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [4,5,3], acting [0,5,1] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.657554626s) [2,4,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518920898s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,3], acting [2,1,3] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734220505s) [4,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595703125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734935760s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.620971680s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.657516479s) [2,4,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.518920898s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.686522484s) [1,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548217773s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.736007690s) [3,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.622192383s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.686522484s) [1,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.548217773s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.734184265s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.731425285s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.617797852s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.656915665s) [3,2,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518676758s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735854149s) [3,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.622192383s@ mbc={}] start_peering_interval up [4,3,2] -> [3,1,5], acting [4,3,2] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.656399727s) [3,2,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.518676758s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.685758591s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548217773s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.688241005s) [3,1,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550781250s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,5], acting [1,2,0] -> [3,1,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.731391907s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.617797852s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.735760689s) [3,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.622192383s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.1d( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,5,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.1a( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.1e( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.12( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.685693741s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.548217773s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.732843399s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595336914s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.733616829s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595703125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.732141495s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595336914s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.804337502s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.667846680s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679298401s) [4,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542846680s@ mbc={}] start_peering_interval up [1,2,0] -> [4,2,3], acting [1,2,0] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676648140s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540161133s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,3], acting [1,2,0] -> [4,5,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.687260628s) [3,1,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550781250s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679255486s) [4,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542846680s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.655578613s) [2,1,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.519165039s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,0], acting [2,1,3] -> [2,1,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676469803s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540161133s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.655445099s) [2,1,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.519165039s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.652745247s) [4,2,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.516723633s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,3], acting [2,1,3] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.652665138s) [4,2,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.516723633s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.731309891s) [3,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595336914s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,4], acting [0,5,1] -> [3,2,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.1d( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.731245041s) [3,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595336914s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676610947s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.541015625s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676547050s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.541015625s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.653259277s) [5,1,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517822266s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.1f( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679172516s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543823242s@ mbc={}] start_peering_interval up [3,5,1] -> [4,2,0], acting [3,5,1] -> [4,2,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.653199196s) [5,1,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.517822266s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.3( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.730762482s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595458984s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679028511s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543823242s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678897858s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543823242s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,0], acting [3,5,1] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678837776s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543823242s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.685895920s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550903320s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.802900314s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.667846680s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.685853004s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550903320s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.3( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.730588913s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595458984s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.7( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.729733467s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595336914s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.651927948s) [5,3,1] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517578125s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,1], acting [2,1,3] -> [5,3,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.801778793s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.667480469s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.801702499s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.667480469s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.7( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.729681015s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595336914s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.674473763s) [2,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540527344s@ mbc={}] start_peering_interval up [1,2,0] -> [2,0,4], acting [1,2,0] -> [2,0,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.680169106s) [5,1,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.546386719s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,0], acting [3,5,1] -> [5,1,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.674416542s) [2,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540527344s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.1( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.651823997s) [5,0,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518310547s@ mbc={}] start_peering_interval up [2,1,3] -> [5,0,4], acting [2,1,3] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679829597s) [0,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.546264648s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.651769638s) [5,0,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.518310547s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.680092812s) [5,1,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.546386719s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.6( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.728996277s) [3,5,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595458984s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,1], acting [0,5,1] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.6( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.728935242s) [3,5,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595458984s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679499626s) [0,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.546264648s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.655325890s) [1,0,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.522338867s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.651872635s) [5,3,1] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.517578125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676604271s) [1,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543701172s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.675398827s) [3,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542480469s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.655325890s) [1,0,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.522338867s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676604271s) [1,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.543701172s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.675343513s) [3,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542480469s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.799418449s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.666992188s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.652235031s) [3,1,2] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.520019531s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,2], acting [2,1,3] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672944069s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540649414s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,4], acting [1,2,0] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.799221992s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.666992188s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.652122498s) [3,1,2] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.520019531s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672719955s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540649414s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.675653458s) [1,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543701172s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.675653458s) [1,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.543701172s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.799329758s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.667480469s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.799201965s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.667480469s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673722267s) [3,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542114258s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.650362968s) [3,1,5] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518798828s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673650742s) [3,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542114258s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.2( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.726444244s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594970703s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.2( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.726346970s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594970703s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.649998665s) [3,1,5] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.518798828s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678986549s) [0,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.547851562s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678946495s) [0,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.547851562s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.726013184s) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594970703s@ mbc={}] start_peering_interval up [0,5,1] -> [1,5,3], acting [0,5,1] -> [1,5,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.650002480s) [3,4,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518676758s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,5], acting [2,1,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.726013184s) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.594970703s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.5( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.725867271s) [5,1,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594970703s@ mbc={}] start_peering_interval up [0,5,1] -> [5,1,0], acting [0,5,1] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.5( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.725825310s) [5,1,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594970703s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.649538040s) [3,4,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.518676758s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.802185059s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.671508789s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.671245575s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540649414s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.801973343s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.671508789s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.671118736s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540649414s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.15( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678195953s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548095703s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678088188s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.548095703s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672805786s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542846680s@ mbc={}] start_peering_interval up [1,2,0] -> [4,0,5], acting [1,2,0] -> [4,0,5], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.648633003s) [3,5,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.518798828s@ mbc={}] start_peering_interval up [2,1,3] -> [3,5,4], acting [2,1,3] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.4( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.724788666s) [3,2,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594970703s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.648594856s) [3,5,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.518798828s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.4( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.724735260s) [3,2,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594970703s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672685623s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542846680s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673307419s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543701172s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,5], acting [3,5,1] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673259735s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543701172s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.724437714s) [2,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594970703s@ mbc={}] start_peering_interval up [0,5,1] -> [2,3,1], acting [0,5,1] -> [2,3,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.724395752s) [2,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594970703s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.14( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.796977997s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.667846680s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.796900749s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.667846680s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679672241s) [3,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550781250s@ mbc={}] start_peering_interval up [1,2,0] -> [3,5,1], acting [1,2,0] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.679602623s) [3,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550781250s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672303200s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543579102s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,1], acting [3,5,1] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672332764s) [2,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543579102s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672245026s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543579102s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678818703s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550170898s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672286987s) [2,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543579102s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678763390s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550170898s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.724052429s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678878784s) [5,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550659180s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.723987579s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595703125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.678826332s) [5,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550659180s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.722813606s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594848633s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.722742081s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594848633s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.6( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.645437241s) [1,0,5] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517700195s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,5], acting [2,1,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.645437241s) [1,0,5] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.517700195s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.8( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.722151756s) [2,1,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594726562s@ mbc={}] start_peering_interval up [0,5,1] -> [2,1,3], acting [0,5,1] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.8( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.722094536s) [2,1,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594726562s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.674092293s) [2,0,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.546630859s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,1], acting [3,5,1] -> [2,0,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.798812866s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.671142578s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.669313431s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542358398s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.669242859s) [5,3,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542358398s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673343658s) [2,0,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.546630859s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.1a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.797634125s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.671142578s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.9( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.721547127s) [0,1,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.595703125s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,5], acting [0,5,1] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.9( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.721485138s) [0,1,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.595703125s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.669459343s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543701172s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,4], acting [3,5,1] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673889160s) [1,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548461914s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.642183304s) [3,4,2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.516723633s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,2], acting [2,1,3] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.669260025s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543701172s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.673889160s) [1,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.548461914s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.720322609s) [5,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594848633s@ mbc={}] start_peering_interval up [0,5,1] -> [5,0,4], acting [0,5,1] -> [5,0,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.642133713s) [3,4,2] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.516723633s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.720256805s) [5,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594848633s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.671127319s) [1,0,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.546264648s@ mbc={}] start_peering_interval up [3,5,1] -> [1,0,2], acting [3,5,1] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.671127319s) [1,0,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.546264648s@ mbc={}] state<Start>: transitioning to Primary
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.646648407s) [4,2,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.522094727s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.795762062s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1136.671142578s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719012260s) [3,2,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594726562s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.641854286s) [4,0,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517700195s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,5], acting [2,1,3] -> [4,0,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.646572113s) [4,2,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.522094727s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664882660s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540893555s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,0], acting [1,2,0] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.795701981s) [2,1,3] r=1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1136.671142578s@ mbc={}] state<Start>: transitioning to Stray
Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.641773224s) [4,0,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0
mlcod 0'0 unknown NOTIFY pruub 1138.517700195s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664823532s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.540893555s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663007736s) [0,2,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.539184570s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662962914s) [0,2,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.539184570s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718832970s) [3,2,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594726562s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.640521049s) [5,3,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.517089844s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,4], acting [2,1,3] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662548065s) [3,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.539184570s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,4], acting [3,5,1] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.640456200s) [5,3,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.517089844s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662475586s) [3,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.539184570s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663201332s) [5,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.540649414s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663105011s) [5,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown 
NOTIFY pruub 1140.540649414s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664962769s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542480469s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,5], acting [3,5,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664916039s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542480469s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.16( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716786385s) [0,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594604492s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,4], acting [0,5,1] -> [0,5,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.15( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716793060s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594726562s@ mbc={}] start_peering_interval up [0,5,1] -> [2,4,0], acting [0,5,1] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.16( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716584206s) [0,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594604492s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664614677s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542724609s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664572716s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542724609s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.17( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716219902s) [1,0,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594482422s@ mbc={}] start_peering_interval up [0,5,1] -> [1,0,2], acting [0,5,1] -> [1,0,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663116455s) [0,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.541259766s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 
0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.17( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716219902s) [1,0,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.594482422s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.670079231s) [0,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548461914s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.14( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718357086s) [3,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594604492s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,4], acting [0,5,1] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.670042992s) [0,1,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.548461914s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662890434s) [0,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.541259766s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.14( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716118813s) [3,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594604492s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.10( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715950012s) [0,1,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594604492s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,2], acting [0,5,1] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.10( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715915680s) [0,1,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594604492s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.15( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.716266632s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594726562s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664826393s) [2,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543701172s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> 1, features acting 
4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664752960s) [2,1,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543701172s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.11( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715749741s) [3,1,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594726562s@ mbc={}] start_peering_interval up [0,5,1] -> [3,1,2], acting [0,5,1] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.11( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715702057s) [3,1,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594726562s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.636794090s) [5,1,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.515869141s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,0], acting [2,1,3] -> [5,1,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663801193s) [0,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.542968750s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.636763573s) [5,1,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.515869141s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.12( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715131760s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594360352s@ mbc={}] start_peering_interval up [0,5,1] -> [4,2,0], acting [0,5,1] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.10( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664171219s) [3,4,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543457031s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,2], acting [3,5,1] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.12( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715093613s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594360352s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.10( empty local-lis/les=38/39 n=0 
ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664110184s) [3,4,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543457031s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.637050629s) [5,1,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.516479492s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.636994362s) [5,1,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.516479492s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663640022s) [3,5,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.543090820s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,4], acting [3,5,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663602829s) [3,5,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.543090820s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.13( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715252876s) [3,4,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594848633s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,2], acting [0,5,1] -> [3,4,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.13( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715219498s) [3,4,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.594848633s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663603783s) [0,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.542968750s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.670928955s) [0,2,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.550659180s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,1], acting [1,2,0] -> [0,2,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.670892715s) [0,2,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.550659180s@ mbc={}] state: transitioning to Stray Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1e( empty local-lis/les=38/39 n=0 ec=38/22 
lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.668664932s) [0,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.548461914s@ mbc={}] start_peering_interval up [3,5,1] -> [0,5,1], acting [3,5,1] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.714536667s) [1,3,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.594360352s@ mbc={}] start_peering_interval up [0,5,1] -> [1,3,5], acting [0,5,1] -> [1,3,5], acting_primary 0 -> 1, up_primary 0 -> 1, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[6.1c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.714536667s) [1,3,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.594360352s@ mbc={}] state: transitioning to Primary Feb 1 02:59:32 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[4.1e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.668618202s) [0,5,1] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.548461914s@ mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.15 scrub starts Feb 1 02:59:33 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.15 scrub ok Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.1a( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [2,4,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.1a( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.1( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.3( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,0,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.14( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [2,4,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.18( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,1,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.15( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.14( empty local-lis/les=0/0 n=0 
ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.16( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.11( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,3,4] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.5( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.9( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,5,4] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.16( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.15( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.e( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,4,2] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.2( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.2( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,0,4] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.1( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,4,5] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.1e( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.2( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.17( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.1f( empty 
local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.1d( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.19( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,2,4] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.c( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.a( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.f( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.7( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.f( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.3( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.2( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.14( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.e( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.1a( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.17( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.10( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 
pg_epoch: 44 pg[4.11( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.13( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[2.1f( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [0,2,4] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.19( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.6( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.1f( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[3.1( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.5( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.16( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[6.18( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 44 pg[5.3( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 44 pg[4.b( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[2.1c( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,2,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[3.1d( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react 
AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.1b( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,2,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[3.8( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[3.1b( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[2.1d( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,5,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[4.1d( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[6.1e( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[4.1f( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[4.19( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[4.1c( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.a( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,3,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[2.12( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,3,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[3.1a( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[3.9( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[5.e( empty 
local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[2.18( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,2,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[4.15( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[4.6( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[2.f( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,2,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[3.11( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[5.b( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[5.13( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[5.4( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,3,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[5.1a( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[4.14( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[2.10( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,0,5] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[6.12( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[5.d( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active 
mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[32376]: osd.4 pg_epoch: 45 pg[4.1( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[4.3( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[5.12( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[4.8( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,2,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[4.2( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.d( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,5,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.13( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,5,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[6.1( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[6.1c( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,3,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.5( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,0,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.b( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,5,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.c( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,0,5] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[3.e( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[5.8( 
empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[4.9( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[6.17( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[2.15( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,0,2] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:33 localhost ceph-osd[31431]: osd.1 pg_epoch: 45 pg[6.1b( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,2,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.2( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.202056885s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1144.668212891s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.205460548s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1144.671752930s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.202080727s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1144.668212891s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.2( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.201891899s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1144.668212891s@ mbc={}] state: transitioning to Stray Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.201873779s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1144.668212891s@ mbc={}] state: transitioning to Stray Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.205233574s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 
unknown NOTIFY pruub 1144.671752930s@ mbc={}] state: transitioning to Stray Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.205882072s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1144.671997070s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:35 localhost ceph-osd[31431]: osd.1 pg_epoch: 46 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.205310822s) [3,1,5] r=1 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1144.671997070s@ mbc={}] state: transitioning to Stray Feb 1 02:59:36 localhost python3[55184]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:59:37 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.e scrub starts Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.387936592s) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1143.889038086s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/46/0 sis=48 pruub=12.385694504s) [3,2,4] r=-1 lpr=48 pi=[44,48)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1143.886840820s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.387872696s) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1143.889038086s@ mbc={}] state: transitioning to Stray Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/46/0 sis=48 pruub=12.385635376s) [3,2,4] r=-1 lpr=48 pi=[44,48)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1143.886840820s@ mbc={}] state: transitioning to Stray Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.385939598s) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1143.886962891s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.3( v 34'39 
(0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.385356903s) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1143.886962891s@ mbc={}] state: transitioning to Stray Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.388173103s) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1143.890258789s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:37 localhost ceph-osd[31431]: osd.1 pg_epoch: 48 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.388047218s) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1143.890258789s@ mbc={}] state: transitioning to Stray Feb 1 02:59:38 localhost python3[55200]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:59:38 localhost ceph-osd[32376]: osd.4 pg_epoch: 48 pg[7.7( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/44 les/c/f=45/46/0 sis=48) [3,2,4] r=2 lpr=48 pi=[44,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:38 localhost ceph-osd[32376]: osd.4 pg_epoch: 48 pg[7.3( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48) [3,2,4] r=2 lpr=48 pi=[42,48)/2 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:38 localhost ceph-osd[32376]: osd.4 pg_epoch: 48 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48) [3,2,4] r=2 lpr=48 pi=[42,48)/2 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:38 localhost ceph-osd[32376]: osd.4 pg_epoch: 48 pg[7.b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48) [3,2,4] r=2 lpr=48 pi=[42,48)/2 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:39 localhost python3[55217]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:59:42 localhost python3[55265]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:59:42 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.15 scrub starts Feb 1 02:59:42 localhost python3[55308]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932781.8167174-91478-70699300167273/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False 
checksum=814f759dcc97f4b50c85badaa6f3819c2533c70a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:59:43 localhost ceph-osd[31431]: osd.1 pg_epoch: 50 pg[7.4( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=14.996114731s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1152.667358398s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:43 localhost ceph-osd[31431]: osd.1 pg_epoch: 50 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=14.999659538s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1152.671020508s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:43 localhost ceph-osd[31431]: osd.1 pg_epoch: 50 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=14.999547958s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1152.671020508s@ mbc={}] state: transitioning to Stray Feb 1 02:59:43 localhost ceph-osd[31431]: osd.1 pg_epoch: 50 pg[7.4( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=14.995899200s) [0,5,4] r=-1 lpr=50 pi=[42,50)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1152.667358398s@ mbc={}] state: transitioning to Stray Feb 1 02:59:44 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.e scrub starts Feb 1 02:59:44 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.e scrub ok Feb 1 02:59:44 localhost ceph-osd[32376]: osd.4 pg_epoch: 50 pg[7.4( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=2 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:44 localhost ceph-osd[32376]: osd.4 pg_epoch: 50 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50) [0,5,4] r=2 lpr=50 pi=[42,50)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:45 localhost ceph-osd[32376]: osd.4 pg_epoch: 52 pg[7.5( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/44 les/c/f=45/48/0 sis=52) [4,0,2] r=0 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:45 localhost ceph-osd[32376]: osd.4 pg_epoch: 52 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/44 les/c/f=45/49/0 sis=52) [4,0,2] r=0 lpr=52 pi=[44,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:45 localhost ceph-osd[31431]: osd.1 pg_epoch: 52 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/44 les/c/f=45/48/0 sis=52 pruub=12.205143929s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1151.888916016s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:45 localhost ceph-osd[31431]: osd.1 pg_epoch: 52 
pg[7.5( v 34'39 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/44 les/c/f=45/48/0 sis=52 pruub=12.205084801s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1151.888916016s@ mbc={}] state: transitioning to Stray Feb 1 02:59:45 localhost ceph-osd[31431]: osd.1 pg_epoch: 52 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/49/0 sis=52 pruub=12.203011513s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1151.886840820s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:45 localhost ceph-osd[31431]: osd.1 pg_epoch: 52 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/49/0 sis=52 pruub=12.202926636s) [4,0,2] r=-1 lpr=52 pi=[44,52)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1151.886840820s@ mbc={}] state: transitioning to Stray Feb 1 02:59:46 localhost ceph-osd[32376]: osd.4 pg_epoch: 53 pg[7.5( v 34'39 lc 34'11 (0'0,34'39] local-lis/les=52/53 n=2 ec=42/32 lis/c=44/44 les/c/f=45/48/0 sis=52) [4,0,2] r=0 lpr=52 pi=[44,52)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(1+2)=2}}] state: react AllReplicasActivated Activating complete Feb 1 02:59:46 localhost ceph-osd[32376]: osd.4 pg_epoch: 53 pg[7.d( v 34'39 lc 34'13 (0'0,34'39] local-lis/les=52/53 n=1 ec=42/32 lis/c=44/44 les/c/f=45/49/0 sis=52) [4,0,2] r=0 lpr=52 pi=[44,52)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(1+2)=2}}] state: react AllReplicasActivated Activating complete Feb 1 02:59:47 localhost python3[55370]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:59:47 localhost python3[55413]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932786.9076765-91478-138582155510828/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=9a0c41ba35379304dc7e57883346ea3531963e9b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:59:47 localhost ceph-osd[31431]: osd.1 pg_epoch: 54 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=46/47 n=2 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.735357285s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1154.491699219s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:47 localhost ceph-osd[31431]: osd.1 pg_epoch: 54 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.735321999s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1154.491699219s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:47 localhost ceph-osd[31431]: osd.1 pg_epoch: 54 
pg[7.e( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.735236168s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1154.491699219s@ mbc={}] state: transitioning to Stray Feb 1 02:59:47 localhost ceph-osd[31431]: osd.1 pg_epoch: 54 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=46/47 n=2 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.735208511s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1154.491699219s@ mbc={}] state: transitioning to Stray Feb 1 02:59:48 localhost ceph-osd[32376]: osd.4 pg_epoch: 55 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.786808968s) [2,1,3] r=-1 lpr=55 pi=[48,55)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1152.585083008s@ mbc={}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:48 localhost ceph-osd[32376]: osd.4 pg_epoch: 55 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.786581039s) [2,1,3] r=-1 lpr=55 pi=[48,55)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1152.585083008s@ mbc={}] state: transitioning to Stray Feb 1 02:59:48 localhost ceph-osd[32376]: osd.4 pg_epoch: 55 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.786458015s) [2,1,3] r=-1 lpr=55 pi=[48,55)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1152.585083008s@ mbc={}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:48 localhost ceph-osd[32376]: osd.4 pg_epoch: 55 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.786385536s) [2,1,3] r=-1 lpr=55 pi=[48,55)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1152.585083008s@ mbc={}] state: transitioning to Stray Feb 1 02:59:48 localhost ceph-osd[32376]: osd.4 pg_epoch: 54 pg[7.6( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=2 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:48 localhost ceph-osd[32376]: osd.4 pg_epoch: 54 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=2 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 02:59:49 localhost ceph-osd[31431]: osd.1 pg_epoch: 55 pg[7.7( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55) [2,1,3] r=1 lpr=55 pi=[48,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:49 localhost ceph-osd[31431]: osd.1 pg_epoch: 55 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55) [2,1,3] r=1 lpr=55 pi=[48,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:49 localhost podman[55428]: 2026-02-01 07:59:49.719773134 +0000 UTC m=+0.079630402 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git) Feb 1 02:59:49 localhost podman[55428]: 2026-02-01 07:59:49.914418883 +0000 UTC m=+0.274276191 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, version=17.1.13) Feb 1 02:59:49 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 02:59:50 localhost ceph-osd[31431]: osd.1 pg_epoch: 57 pg[7.8( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=57 pruub=15.838874817s) [3,2,1] r=2 lpr=57 pi=[42,57)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1160.672363281s@ mbc={}] start_peering_interval up [5,1,3] -> [3,2,1], acting [5,1,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:50 localhost ceph-osd[31431]: osd.1 pg_epoch: 57 pg[7.8( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=57 pruub=15.838774681s) [3,2,1] r=2 lpr=57 pi=[42,57)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1160.672363281s@ mbc={}] state: transitioning to Stray Feb 1 02:59:52 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.1b scrub starts Feb 1 02:59:52 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.1b scrub ok Feb 1 02:59:52 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1b deep-scrub starts Feb 1 02:59:52 localhost python3[55503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:59:52 localhost python3[55546]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932792.1963208-91478-29120135758828/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=c332e57191fea146df898938173f766e25b9bcd9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 02:59:53 localhost ceph-osd[31431]: osd.1 pg_epoch: 59 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/45/0 sis=59 pruub=12.398035049s) [0,4,2] r=-1 lpr=59 pi=[44,59)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1159.890014648s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,2], acting [2,1,3] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:53 localhost ceph-osd[31431]: osd.1 pg_epoch: 59 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/45/0 sis=59 pruub=12.397917747s) [0,4,2] r=-1 lpr=59 pi=[44,59)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1159.890014648s@ mbc={}] state: transitioning to Stray Feb 1 02:59:53 localhost ceph-osd[32376]: osd.4 pg_epoch: 59 pg[7.9( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=44/44 les/c/f=45/45/0 sis=59) [0,4,2] r=1 lpr=59 pi=[44,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 02:59:54 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.d scrub starts Feb 1 02:59:54 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.d scrub ok Feb 1 02:59:55 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.9 scrub starts Feb 1 02:59:55 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.9 scrub ok Feb 1 02:59:55 localhost ceph-osd[32376]: osd.4 pg_epoch: 61 pg[7.a( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=61) [4,0,5] r=0 lpr=61 pi=[46,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 1 02:59:55 localhost ceph-osd[31431]: osd.1 pg_epoch: 61 
pg[7.a( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=61 pruub=12.816160202s) [4,0,5] r=-1 lpr=61 pi=[46,61)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1162.492065430s@ mbc={}] start_peering_interval up [3,1,5] -> [4,0,5], acting [3,1,5] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:55 localhost ceph-osd[31431]: osd.1 pg_epoch: 61 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=61 pruub=12.815984726s) [4,0,5] r=-1 lpr=61 pi=[46,61)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1162.492065430s@ mbc={}] state: transitioning to Stray Feb 1 02:59:56 localhost ceph-osd[32376]: osd.4 pg_epoch: 62 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=61/62 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=61) [4,0,5] r=0 lpr=61 pi=[46,61)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 1 02:59:57 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.b deep-scrub starts Feb 1 02:59:57 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.b deep-scrub ok Feb 1 02:59:58 localhost python3[55608]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 02:59:58 localhost ceph-osd[32376]: osd.4 pg_epoch: 64 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=50/51 n=1 ec=42/32 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=9.911370277s) [2,3,4] r=2 lpr=64 pi=[50,64)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1158.735473633s@ mbc={}] start_peering_interval up [0,5,4] -> [2,3,4], acting [0,5,4] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 02:59:58 localhost ceph-osd[32376]: osd.4 pg_epoch: 64 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=50/51 n=1 ec=42/32 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=9.911283493s) [2,3,4] r=2 lpr=64 pi=[50,64)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1158.735473633s@ mbc={}] state: transitioning to Stray Feb 1 02:59:59 localhost python3[55653]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932798.2759593-91836-155026934094631/source _original_basename=tmpl4dion2q follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:00 localhost python3[55715]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:00 localhost python3[55758]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932799.7760215-91922-56199648342685/source _original_basename=tmpudcjtv1e follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 
1 03:00:00 localhost ceph-osd[32376]: osd.4 pg_epoch: 66 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=52/53 n=1 ec=42/32 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=9.915838242s) [2,3,1] r=-1 lpr=66 pi=[52,66)/1 crt=34'39 mlcod 0'0 active pruub 1160.763305664s@ mbc={255={}}] start_peering_interval up [4,0,2] -> [2,3,1], acting [4,0,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 03:00:00 localhost ceph-osd[32376]: osd.4 pg_epoch: 66 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=52/53 n=1 ec=42/32 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=9.914987564s) [2,3,1] r=-1 lpr=66 pi=[52,66)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1160.763305664s@ mbc={}] state: transitioning to Stray Feb 1 03:00:01 localhost python3[55788]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None Feb 1 03:00:01 localhost python3[55806]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:00:01 localhost ceph-osd[31431]: osd.1 pg_epoch: 66 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=52/52 les/c/f=53/53/0 sis=66) [2,3,1] r=2 lpr=66 pi=[52,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 03:00:02 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.d deep-scrub starts Feb 1 03:00:02 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.d deep-scrub ok Feb 1 03:00:03 localhost ansible-async_wrapper.py[55978]: Invoked with 291629447707 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932802.4907281-92016-230259197751870/AnsiballZ_command.py _ Feb 1 03:00:03 localhost ansible-async_wrapper.py[55981]: Starting module and watcher Feb 1 03:00:03 localhost ansible-async_wrapper.py[55981]: Start watching 55982 (3600) Feb 1 03:00:03 localhost ansible-async_wrapper.py[55982]: Start module (55982) Feb 1 03:00:03 localhost ansible-async_wrapper.py[55978]: Return async_wrapper task started. 
Feb 1 03:00:03 localhost ceph-osd[32376]: osd.4 pg_epoch: 68 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=54/55 n=1 ec=42/32 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=9.449617386s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1162.862060547s@ mbc={}] start_peering_interval up [0,2,4] -> [3,1,5], acting [0,2,4] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 03:00:03 localhost ceph-osd[32376]: osd.4 pg_epoch: 68 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=54/55 n=1 ec=42/32 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=9.449234009s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1162.862060547s@ mbc={}] state: transitioning to Stray Feb 1 03:00:03 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.13 scrub starts Feb 1 03:00:03 localhost python3[56002]: ansible-ansible.legacy.async_status Invoked with jid=291629447707.55978 mode=status _async_dir=/tmp/.ansible_async Feb 1 03:00:03 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.13 scrub ok Feb 1 03:00:04 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.13 scrub starts Feb 1 03:00:04 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.13 scrub ok Feb 1 03:00:04 localhost ceph-osd[31431]: osd.1 pg_epoch: 68 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=54/54 les/c/f=55/55/0 sis=68) [3,1,5] r=1 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 03:00:05 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.8 scrub starts Feb 1 03:00:05 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.8 scrub ok Feb 1 03:00:05 localhost ceph-osd[31431]: osd.1 pg_epoch: 70 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=55/56 n=1 ec=42/32 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=8.123973846s) [0,4,5] r=-1 lpr=70 pi=[55,70)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1167.813232422s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,5], acting [2,1,3] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 1 03:00:05 localhost ceph-osd[31431]: osd.1 pg_epoch: 70 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=55/56 n=1 ec=42/32 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=8.123892784s) [0,4,5] r=-1 lpr=70 pi=[55,70)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1167.813232422s@ mbc={}] state: transitioning to Stray Feb 1 03:00:06 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.11 scrub starts Feb 1 03:00:06 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.11 scrub ok Feb 1 03:00:06 localhost ceph-osd[32376]: osd.4 pg_epoch: 70 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=55/55 les/c/f=56/56/0 sis=70) [0,4,5] r=1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 1 03:00:06 localhost puppet-user[56001]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 1 03:00:06 localhost puppet-user[56001]: (file: /etc/puppet/hiera.yaml) Feb 1 03:00:06 localhost puppet-user[56001]: Warning: Undefined variable '::deploy_config_name'; Feb 1 03:00:06 localhost puppet-user[56001]: (file & line not available) Feb 1 03:00:06 localhost puppet-user[56001]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 1 03:00:06 localhost puppet-user[56001]: (file & line not available) Feb 1 03:00:06 localhost puppet-user[56001]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 1 03:00:06 localhost puppet-user[56001]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 1 03:00:06 localhost puppet-user[56001]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.17 seconds Feb 1 03:00:07 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.1a scrub starts Feb 1 03:00:07 localhost puppet-user[56001]: Notice: Applied catalog in 0.04 seconds Feb 1 03:00:07 localhost puppet-user[56001]: Application: Feb 1 03:00:07 localhost puppet-user[56001]: Initial environment: production Feb 1 03:00:07 localhost puppet-user[56001]: Converged environment: production Feb 1 03:00:07 localhost puppet-user[56001]: Run mode: user Feb 1 03:00:07 localhost puppet-user[56001]: Changes: Feb 1 03:00:07 localhost puppet-user[56001]: Events: Feb 1 03:00:07 localhost puppet-user[56001]: Resources: Feb 1 03:00:07 localhost puppet-user[56001]: Total: 10 Feb 1 03:00:07 localhost puppet-user[56001]: Time: Feb 1 03:00:07 localhost puppet-user[56001]: Schedule: 0.00 Feb 1 03:00:07 localhost puppet-user[56001]: File: 0.00 Feb 1 03:00:07 localhost puppet-user[56001]: Exec: 0.01 Feb 1 03:00:07 localhost puppet-user[56001]: Augeas: 0.01 Feb 1 03:00:07 localhost puppet-user[56001]: Transaction evaluation: 0.03 Feb 1 03:00:07 localhost puppet-user[56001]: Catalog application: 0.04 Feb 1 03:00:07 localhost puppet-user[56001]: Config retrieval: 0.26 Feb 1 03:00:07 localhost puppet-user[56001]: Last run: 1769932807 Feb 1 03:00:07 localhost puppet-user[56001]: Filebucket: 0.00 Feb 1 03:00:07 localhost puppet-user[56001]: Total: 0.04 Feb 1 03:00:07 localhost puppet-user[56001]: Version: Feb 1 03:00:07 localhost puppet-user[56001]: Config: 1769932806 Feb 1 03:00:07 localhost puppet-user[56001]: Puppet: 7.10.0 Feb 1 03:00:07 localhost ansible-async_wrapper.py[55982]: Module complete (55982) Feb 1 03:00:07 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 3.1a scrub ok Feb 1 03:00:07 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.17 scrub starts Feb 1 03:00:07 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.17 scrub ok Feb 1 03:00:08 localhost ansible-async_wrapper.py[55981]: Done in kid B. 
Feb 1 03:00:09 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.12 scrub starts Feb 1 03:00:09 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.12 scrub ok Feb 1 03:00:12 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.18 scrub starts Feb 1 03:00:12 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.18 scrub ok Feb 1 03:00:12 localhost sshd[56189]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:00:13 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.f scrub starts Feb 1 03:00:13 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.f scrub ok Feb 1 03:00:13 localhost python3[56206]: ansible-ansible.legacy.async_status Invoked with jid=291629447707.55978 mode=status _async_dir=/tmp/.ansible_async Feb 1 03:00:14 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.8 scrub starts Feb 1 03:00:14 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.8 scrub ok Feb 1 03:00:14 localhost python3[56222]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 03:00:14 localhost python3[56238]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:00:15 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts Feb 1 03:00:15 localhost python3[56288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:15 localhost python3[56306]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp0d1i9bj4 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 03:00:15 localhost python3[56336]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:15 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.1d scrub starts Feb 1 03:00:16 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.1d scrub ok Feb 1 03:00:16 localhost python3[56439]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] 
ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 1 03:00:17 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.1c scrub starts Feb 1 03:00:17 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.1c scrub ok Feb 1 03:00:17 localhost python3[56458]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:18 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.9 scrub starts Feb 1 03:00:18 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.9 scrub ok Feb 1 03:00:18 localhost python3[56490]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:00:19 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.10 scrub starts Feb 1 03:00:19 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 2.10 scrub ok Feb 1 03:00:19 localhost python3[56540]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:19 localhost python3[56558]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:19 localhost python3[56620]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:00:20 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.8 scrub starts Feb 1 03:00:20 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.8 scrub ok Feb 1 03:00:20 localhost systemd[1]: tmp-crun.pVJkAQ.mount: Deactivated successfully. 
Feb 1 03:00:20 localhost podman[56639]: 2026-02-01 08:00:20.20100677 +0000 UTC m=+0.073864188 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13) Feb 1 03:00:20 localhost python3[56638]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:20 localhost podman[56639]: 2026-02-01 08:00:20.438528362 +0000 UTC m=+0.311385840 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:00:20 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 03:00:20 localhost python3[56731]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:20 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts Feb 1 03:00:20 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.14 deep-scrub ok Feb 1 03:00:21 localhost python3[56749]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:21 localhost python3[56811]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:21 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.15 scrub starts Feb 1 03:00:21 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.15 scrub ok Feb 1 03:00:21 localhost python3[56829]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:22 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.b scrub starts Feb 1 03:00:22 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.b scrub ok Feb 1 03:00:22 localhost python3[56859]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:00:22 localhost systemd[1]: Reloading. Feb 1 03:00:22 localhost systemd-rc-local-generator[56882]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:00:22 localhost systemd-sysv-generator[56887]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:00:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 03:00:23 localhost python3[56945]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:23 localhost python3[56963]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:24 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.a scrub starts Feb 1 03:00:24 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.a scrub ok Feb 1 03:00:24 localhost python3[57025]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:00:24 localhost python3[57043]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:00:24 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.1 scrub starts Feb 1 03:00:25 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.1 scrub ok Feb 1 03:00:25 localhost python3[57073]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:00:25 localhost systemd[1]: Reloading. Feb 1 03:00:25 localhost systemd-sysv-generator[57102]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:00:25 localhost systemd-rc-local-generator[57099]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:00:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:00:25 localhost systemd[1]: Starting Create netns directory... Feb 1 03:00:25 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 1 03:00:25 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 1 03:00:25 localhost systemd[1]: Finished Create netns directory. 
Feb 1 03:00:25 localhost python3[57130]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 1 03:00:25 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.6 scrub starts Feb 1 03:00:26 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.6 scrub ok Feb 1 03:00:27 localhost python3[57189]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 1 03:00:27 localhost podman[57262]: 2026-02-01 08:00:27.695070267 +0000 UTC m=+0.089393999 container create e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, container_name=nova_virtqemud_init_logs, architecture=x86_64, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 1 03:00:27 localhost podman[57263]: 2026-02-01 08:00:27.71292781 +0000 UTC m=+0.101014962 container create c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute_init_log, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, version=17.1.13, vcs-type=git, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public) Feb 1 03:00:27 localhost podman[57262]: 2026-02-01 08:00:27.641788437 +0000 UTC m=+0.036112249 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:00:27 localhost systemd[1]: Started libpod-conmon-e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da.scope. Feb 1 03:00:27 localhost systemd[1]: Started libpod-conmon-c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad.scope. Feb 1 03:00:27 localhost systemd[1]: Started libcrun container. Feb 1 03:00:27 localhost systemd[1]: Started libcrun container. Feb 1 03:00:27 localhost podman[57263]: 2026-02-01 08:00:27.664866529 +0000 UTC m=+0.052953671 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 03:00:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a27148de7468c07eed6e32a3dde6476496248367a0cf1cb0c4fa7a063e687a70/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Feb 1 03:00:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7482678b01acdac3c11b06ce351fa229c86390ea5e0f34b89b4352735f8e55c8/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:00:27 localhost podman[57263]: 2026-02-01 08:00:27.776122732 +0000 UTC m=+0.164209854 container init c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, container_name=nova_compute_init_log, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:00:27 localhost podman[57263]: 2026-02-01 08:00:27.784675802 +0000 UTC m=+0.172762924 container start c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, io.buildah.version=1.41.5, distribution-scope=public, container_name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:00:27 localhost systemd[1]: libpod-c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad.scope: Deactivated successfully. 
Feb 1 03:00:27 localhost python3[57189]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Feb 1 03:00:27 localhost podman[57262]: 2026-02-01 08:00:27.827271757 +0000 UTC m=+0.221595479 container init e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, release=1766032510, vcs-type=git, container_name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_id=tripleo_step2, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13) Feb 1 03:00:27 localhost podman[57262]: 2026-02-01 08:00:27.833574358 +0000 UTC m=+0.227898090 container start e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z) Feb 1 03:00:27 localhost python3[57189]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Feb 1 03:00:27 localhost systemd[1]: libpod-e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da.scope: Deactivated successfully. 
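The two PODMAN-CONTAINER-DEBUG entries above show how ansible-tripleo_container_manage expands a container's config_data dict into a concrete podman run command: name, conmon pidfile, detach flag, environment, labels, the k8s-file log driver, then network, privileges, security options, user, volumes, and finally the image and command. A minimal sketch of that mapping, limited to the keys visible in these logs (the helper name is hypothetical, not the module's own code):

    # Sketch: rebuild a `podman run` argv from a TripleO config_data dict.
    # Covers only the config_data keys that appear in the journal above;
    # the real module also passes --label metadata, omitted here.
    def podman_run_argv(name, cfg):
        argv = ["podman", "run", "--name", name,
                "--conmon-pidfile", "/run/%s.pid" % name,
                "--detach=%s" % cfg.get("detach", True)]
        for key, val in cfg.get("environment", {}).items():
            argv += ["--env", "%s=%s" % (key, val)]
        argv += ["--log-driver", "k8s-file",
                 "--log-opt",
                 "path=/var/log/containers/stdouts/%s.log" % name]
        if "net" in cfg:
            argv += ["--network", cfg["net"]]
        if "pid" in cfg:
            argv += ["--pid", cfg["pid"]]
        if "privileged" in cfg:
            argv.append("--privileged=%s" % cfg["privileged"])
        for opt in cfg.get("security_opt", []):
            argv += ["--security-opt", opt]
        if "user" in cfg:
            argv += ["--user", cfg["user"]]
        for vol in cfg.get("volumes", []):
            argv += ["--volume", vol]
        argv.append(cfg["image"])
        argv += cfg.get("command", [])
        return argv

Applied to the nova_virtqemud_init_logs config_data, this reproduces the logged command line above (apart from the --label flags), down to the three --security-opt entries and the :shared,z volume suffix.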
Feb 1 03:00:27 localhost podman[57299]: 2026-02-01 08:00:27.851392921 +0000 UTC m=+0.045433023 container died c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, distribution-scope=public, build-date=2026-01-12T23:32:04Z, container_name=nova_compute_init_log, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:00:27 localhost podman[57323]: 2026-02-01 08:00:27.901950888 +0000 UTC m=+0.050476906 container died e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step2, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=nova_virtqemud_init_logs, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:00:27 localhost podman[57298]: 2026-02-01 08:00:27.958748515 +0000 UTC m=+0.160972655 container cleanup c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step2, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Feb 1 03:00:27 localhost systemd[1]: libpod-conmon-c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad.scope: Deactivated successfully. 
Feb 1 03:00:28 localhost podman[57325]: 2026-02-01 08:00:28.00466461 +0000 UTC m=+0.150370542 container cleanup e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64) Feb 1 03:00:28 localhost systemd[1]: libpod-conmon-e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da.scope: Deactivated successfully. 
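Both init containers follow the same one-shot lifecycle visible above (create, init, start, died, cleanup, with the libpod and libpod-conmon scopes deactivating around them) because each exists only to chown a log directory and then exits. A small sketch that pairs podman start/died events per container ID to measure those lifetimes, assuming the journal lines are available as an iterable of strings:

    import re
    from datetime import datetime

    # Matches podman event lines such as:
    # 2026-02-01 08:00:27.784675802 +0000 UTC m=+0.172762924 container start c39f7a4b...
    EVENT = re.compile(
        r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) \+0000 UTC "
        r"m=\+\S+ container (?P<event>start|died) (?P<cid>[0-9a-f]{64})")

    def container_lifetimes(lines):
        started, lifetimes = {}, {}
        for line in lines:
            m = EVENT.search(line)
            if not m:
                continue
            # strptime's %f accepts at most 6 fractional digits; podman emits 9.
            ts = datetime.strptime(m["ts"][:26], "%Y-%m-%d %H:%M:%S.%f")
            if m["event"] == "start":
                started[m["cid"]] = ts
            elif m["cid"] in started:
                lifetimes[m["cid"]] = (ts - started[m["cid"]]).total_seconds()
        return lifetimes

On this span it reports roughly 0.07 s each for nova_compute_init_log (c39f7a4b…) and nova_virtqemud_init_logs (e3c77e43…).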
Feb 1 03:00:28 localhost podman[57449]: 2026-02-01 08:00:28.252295511 +0000 UTC m=+0.090623407 container create 5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_virtlogd_wrapper, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt) Feb 1 03:00:28 localhost podman[57460]: 2026-02-01 08:00:28.280071545 +0000 UTC m=+0.084056866 container create e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'command': ['/container_puppet_apply.sh', 
'4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=create_haproxy_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:00:28 localhost systemd[1]: Started libpod-conmon-5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07.scope. Feb 1 03:00:28 localhost systemd[1]: Started libcrun container. Feb 1 03:00:28 localhost systemd[1]: Started libpod-conmon-e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e.scope. Feb 1 03:00:28 localhost podman[57449]: 2026-02-01 08:00:28.213856302 +0000 UTC m=+0.052184258 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:00:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b46f2af4abb8fc370c0b24f2088d5fdc184462c273d81ea584e1e57a2efd340/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 1 03:00:28 localhost systemd[1]: Started libcrun container. 
Feb 1 03:00:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a853575c374606d160c17f2d5aa41b97e721ba5a6a30f8d45cba4a87628c37d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 03:00:28 localhost podman[57449]: 2026-02-01 08:00:28.332520309 +0000 UTC m=+0.170848235 container init 5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, container_name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 1 03:00:28 localhost podman[57460]: 2026-02-01 08:00:28.334972124 +0000 UTC m=+0.138957475 container init e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step2, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z) Feb 1 03:00:28 localhost podman[57449]: 2026-02-01 08:00:28.342355229 +0000 UTC m=+0.180683135 container start 5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, batch=17.1_20260112.1, 
version=17.1.13, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5) Feb 1 03:00:28 localhost podman[57449]: 2026-02-01 08:00:28.342577856 +0000 UTC m=+0.180905822 container attach 5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step2, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, 
batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc.) Feb 1 03:00:28 localhost podman[57460]: 2026-02-01 08:00:28.243849284 +0000 UTC m=+0.047834615 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 1 03:00:28 localhost podman[57460]: 2026-02-01 08:00:28.39501922 +0000 UTC m=+0.199004621 container start e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=create_haproxy_wrapper, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=) Feb 1 03:00:28 localhost podman[57460]: 2026-02-01 08:00:28.396154785 +0000 UTC m=+0.200140146 container attach e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step2, tcib_managed=true, container_name=create_haproxy_wrapper, build-date=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:00:28 localhost systemd[1]: var-lib-containers-storage-overlay-7482678b01acdac3c11b06ce351fa229c86390ea5e0f34b89b4352735f8e55c8-merged.mount: Deactivated successfully. Feb 1 03:00:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c39f7a4be7e1d9e2c9ba82d3ea1d7c93da83d479ae18fbed695900bed714b5ad-userdata-shm.mount: Deactivated successfully. Feb 1 03:00:28 localhost systemd[1]: var-lib-containers-storage-overlay-a27148de7468c07eed6e32a3dde6476496248367a0cf1cb0c4fa7a063e687a70-merged.mount: Deactivated successfully. Feb 1 03:00:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e3c77e43b96e822a333a245a2a70f91097beb0215a710fe8a9c72b3d8b7e08da-userdata-shm.mount: Deactivated successfully. Feb 1 03:00:30 localhost ovs-vsctl[57576]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Feb 1 03:00:30 localhost systemd[1]: libpod-5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07.scope: Deactivated successfully. 
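The ovs-vsctl error above, "unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)", says only that ovsdb-server has not created its control socket yet at this point in step 2; the /run/openvswitch bind mount is in place but nothing is listening behind it. A minimal readiness probe, assuming the default socket path:

    import os
    import socket

    def ovsdb_ready(path="/var/run/openvswitch/db.sock"):
        # The failure in the log is the first case: the socket file is absent.
        if not os.path.exists(path):
            return False
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        try:
            s.connect(path)
            return True
        except OSError:
            return False
        finally:
            s.close()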
Feb 1 03:00:30 localhost systemd[1]: libpod-5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07.scope: Consumed 2.324s CPU time. Feb 1 03:00:30 localhost podman[57449]: 2026-02-01 08:00:30.671607193 +0000 UTC m=+2.509935099 container died 5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_virtlogd_wrapper, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}) Feb 1 03:00:30 localhost systemd[1]: tmp-crun.i9Hf89.mount: Deactivated successfully. Feb 1 03:00:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07-userdata-shm.mount: Deactivated successfully. 
Feb 1 03:00:30 localhost podman[57703]: 2026-02-01 08:00:30.757509215 +0000 UTC m=+0.072265799 container cleanup 5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step2, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=create_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 1 03:00:30 localhost systemd[1]: libpod-conmon-5e1dbfd935609409d724df1267026ff6ea16cdb02fde05dacb312b0ca8513e07.scope: Deactivated successfully. 
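Running the container_lifetimes() sketch above over this stretch gives about 2.33 s of wall clock for create_virtlogd_wrapper (started 08:00:28.342, died 08:00:30.671), which matches systemd's "Consumed 2.324s CPU time" for its libpod scope almost exactly, suggesting the puppet run inside was CPU-bound rather than waiting on I/O.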
Feb 1 03:00:30 localhost python3[57189]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Feb 1 03:00:31 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.5 scrub starts Feb 1 03:00:31 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.5 scrub ok Feb 1 03:00:31 localhost systemd[1]: libpod-e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e.scope: Deactivated successfully. Feb 1 03:00:31 localhost systemd[1]: libpod-e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e.scope: Consumed 2.275s CPU time. 
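Judging from its argument list, /container_puppet_apply.sh takes <step> <tags> <manifest>: this run applies "include ::tripleo::profile::base::nova::virtlogd_wrapper" with puppet restricted to file resources at step 4, a one-off puppet apply inside the nova-libvirt image that writes the virtlogd wrapper script out through the bind-mounted host paths while reading /etc/puppet and the openstack-puppet modules read-only. The create_haproxy_wrapper container below does the same for the OVN metadata agent's haproxy wrapper. That reading of the positional arguments is inferred from the log, not from the script itself.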
Feb 1 03:00:31 localhost podman[57460]: 2026-02-01 08:00:31.479019604 +0000 UTC m=+3.283004995 container died e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:56:19Z, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=create_haproxy_wrapper, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step2, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:00:31 localhost podman[57741]: 2026-02-01 08:00:31.564597116 +0000 UTC m=+0.072016291 container cleanup e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:56:19Z, container_name=create_haproxy_wrapper, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step2, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 1 03:00:31 localhost systemd[1]: libpod-conmon-e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e.scope: Deactivated successfully. 
Feb 1 03:00:31 localhost python3[57189]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Feb 1 03:00:31 localhost systemd[1]: var-lib-containers-storage-overlay-4a853575c374606d160c17f2d5aa41b97e721ba5a6a30f8d45cba4a87628c37d-merged.mount: Deactivated successfully. Feb 1 03:00:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e18821760ea9a96f8bb0c1632262a280e0906bd6fc8d44b0a0ba04d49866378e-userdata-shm.mount: Deactivated successfully. Feb 1 03:00:31 localhost systemd[1]: var-lib-containers-storage-overlay-8b46f2af4abb8fc370c0b24f2088d5fdc184462c273d81ea584e1e57a2efd340-merged.mount: Deactivated successfully. 
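From here the deployment tooling goes quiet for a few seconds and the journal is dominated by ceph-osd background work: per-PG "scrub starts"/"scrub ok" pairs and the periodic RocksDB "DUMPING STATS" blocks. In those dumps, #012 is syslog's escaping of an embedded newline (octal 012), so line.replace("#012", "\n") restores the original multi-line stats tables. A small sketch for tallying the scrub activity per OSD, placement group, and scrub type:

    import re
    from collections import Counter

    SCRUB = re.compile(
        r"ceph-osd\[(?P<osd>\d+)\].*: (?P<pg>[0-9a-f]+\.[0-9a-f]+) "
        r"(?P<kind>deep-scrub|scrub) (?P<state>starts|ok)")

    def tally_scrubs(lines):
        counts = Counter()
        for line in lines:
            m = SCRUB.search(line)
            if m:
                counts[(m["osd"], m["pg"], m["kind"], m["state"])] += 1
        return counts

In this window every "scrub starts" is answered by a matching "scrub ok" within a second or two, so the OSDs are healthy and merely doing routine maintenance.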
Feb 1 03:00:32 localhost python3[57795]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:00:33 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Feb 1 03:00:33 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Feb 1 03:00:33 localhost python3[57916]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005604212 step=2 update_config_hash_only=False
Feb 1 03:00:33 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Feb 1 03:00:33 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Feb 1 03:00:34 localhost python3[57932]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:00:34 localhost python3[57948]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 1 03:00:39 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Feb 1 03:00:39 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Feb 1 03:00:41 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Feb 1 03:00:41 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Feb 1 03:00:41 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 6.1e deep-scrub starts
Feb 1 03:00:42 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 6.1e deep-scrub ok
Feb 1 03:00:42 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Feb 1 03:00:43 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Feb 1 03:00:43 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 7.a scrub starts
Feb 1 03:00:44 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 7.a scrub ok
Feb 1 03:00:44 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.e deep-scrub starts
Feb 1 03:00:44 localhost ceph-osd[32376]: log_channel(cluster) log [DBG] : 5.e deep-scrub ok
Feb 1 03:00:45 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Feb 1 03:00:45 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Feb 1 03:00:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:00:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 5098 writes, 22K keys, 5098 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5098 writes, 516 syncs, 9.88 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1845 writes, 6645 keys, 1845 commit groups, 1.0 writes per commit group, ingest: 2.71 MB, 0.00 MB/s
Interval WAL: 1845 writes, 374 syncs, 4.93 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.9e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Feb 1 03:00:48 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Feb 1 03:00:48 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Feb 1 03:00:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:00:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 4412 writes, 20K keys, 4412 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4412 writes, 485 syncs, 9.10 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1028 writes, 3617 keys, 1028 commit groups, 1.0 writes per commit group, ingest: 1.94 MB, 0.00 MB/s
Interval WAL: 1028 writes, 290 syncs, 3.54 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0
 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [default] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-0] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 3.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0
 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0

** Compaction Stats [m-1] **
Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Feb 1 03:00:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:00:50 localhost systemd[1]: tmp-crun.MvFu19.mount: Deactivated successfully.
Feb 1 03:00:50 localhost podman[57949]: 2026-02-01 08:00:50.747265912 +0000 UTC m=+0.101036613 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 1 03:00:50 localhost podman[57949]: 2026-02-01 08:00:50.980110712 +0000 UTC m=+0.333881323 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team,
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 1 03:00:50 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 03:00:54 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.1d scrub starts Feb 1 03:00:54 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 3.1d scrub ok Feb 1 03:00:55 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.19 scrub starts Feb 1 03:00:55 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 4.19 scrub ok Feb 1 03:00:56 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1 scrub starts Feb 1 03:00:56 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1 scrub ok Feb 1 03:00:57 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1c scrub starts Feb 1 03:00:57 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1c scrub ok Feb 1 03:00:58 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.c scrub starts Feb 1 03:00:58 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.c scrub ok Feb 1 03:01:06 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.4 scrub starts Feb 1 03:01:06 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.4 scrub ok Feb 1 03:01:09 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.1a scrub starts Feb 1 03:01:09 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.1a scrub ok Feb 1 03:01:15 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.15 deep-scrub starts Feb 1 03:01:15 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 2.15 deep-scrub ok Feb 1 03:01:17 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1b deep-scrub starts Feb 1 03:01:17 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 6.1b deep-scrub ok Feb 1 03:01:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:01:21 localhost systemd[1]: tmp-crun.3jn0cF.mount: Deactivated successfully. 
Feb 1 03:01:21 localhost podman[58117]: 2026-02-01 08:01:21.74422688 +0000 UTC m=+0.102980512 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13) Feb 1 03:01:21 localhost podman[58117]: 2026-02-01 08:01:21.949318095 +0000 UTC m=+0.308071717 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr) Feb 1 03:01:21 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:01:23 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts Feb 1 03:01:23 localhost ceph-osd[31431]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok Feb 1 03:01:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:01:52 localhost systemd[1]: tmp-crun.tXTxWW.mount: Deactivated successfully. 
Feb 1 03:01:52 localhost podman[58147]: 2026-02-01 08:01:52.726156068 +0000 UTC m=+0.082574741 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, release=1766032510, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:01:52 localhost podman[58147]: 2026-02-01 08:01:52.954306435 +0000 UTC m=+0.310725028 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:01:52 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:02:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:02:23 localhost systemd[1]: tmp-crun.f9laEj.mount: Deactivated successfully. 
Feb 1 03:02:23 localhost podman[58252]: 2026-02-01 08:02:23.735098497 +0000 UTC m=+0.092934216 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, version=17.1.13, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:02:23 localhost podman[58252]: 2026-02-01 08:02:23.927837468 +0000 UTC m=+0.285673267 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:02:23 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:02:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:02:54 localhost podman[58281]: 2026-02-01 08:02:54.694528334 +0000 UTC m=+0.059455860 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 1 03:02:54 localhost podman[58281]: 2026-02-01 08:02:54.875539377 +0000 UTC m=+0.240466903 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:02:54 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 03:03:05 localhost sshd[58309]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:03:15 localhost podman[58414]: 2026-02-01 08:03:15.199445915 +0000 UTC m=+0.091185993 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, release=1764794109, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, RELEASE=main, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_CLEAN=True) Feb 1 03:03:15 localhost podman[58414]: 2026-02-01 08:03:15.336954586 +0000 UTC m=+0.228694674 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, name=rhceph, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=) Feb 1 03:03:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:03:25 localhost systemd[1]: tmp-crun.mCZqui.mount: Deactivated successfully. 
Feb 1 03:03:25 localhost podman[58559]: 2026-02-01 08:03:25.735645724 +0000 UTC m=+0.091154743 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:03:26 localhost podman[58559]: 2026-02-01 08:03:26.011362228 +0000 UTC m=+0.366871237 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:03:26 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:03:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:03:56 localhost podman[58588]: 2026-02-01 08:03:56.709700374 +0000 UTC m=+0.070614648 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 1 03:03:56 localhost podman[58588]: 2026-02-01 08:03:56.926356661 +0000 UTC m=+0.287270925 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, 
config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:03:56 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:04:27 localhost podman[58695]: 2026-02-01 08:04:27.729161394 +0000 UTC m=+0.090764032 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com)
Feb 1 03:04:27 localhost podman[58695]: 2026-02-01 08:04:27.957558928 +0000 UTC m=+0.319161546 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Feb 1 03:04:27 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:04:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:04:58 localhost systemd[1]: tmp-crun.AzSg93.mount: Deactivated successfully.
Feb 1 03:04:58 localhost podman[58724]: 2026-02-01 08:04:58.702340438 +0000 UTC m=+0.062319026 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, vcs-type=git)
Feb 1 03:04:58 localhost podman[58724]: 2026-02-01 08:04:58.903314579 +0000 UTC m=+0.263293177 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd)
Feb 1 03:04:58 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:05:10 localhost python3[58801]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:10 localhost python3[58846]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933110.2604609-98261-87642172900934/source _original_basename=tmpyresjjq4 follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:11 localhost python3[58876]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:05:13 localhost ansible-async_wrapper.py[59048]: Invoked with 256115419876 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933112.906836-98418-117072214249865/AnsiballZ_command.py _
Feb 1 03:05:13 localhost ansible-async_wrapper.py[59051]: Starting module and watcher
Feb 1 03:05:13 localhost ansible-async_wrapper.py[59051]: Start watching 59052 (3600)
Feb 1 03:05:13 localhost ansible-async_wrapper.py[59052]: Start module (59052)
Feb 1 03:05:13 localhost ansible-async_wrapper.py[59048]: Return async_wrapper task started.
Feb 1 03:05:13 localhost python3[59072]: ansible-ansible.legacy.async_status Invoked with jid=256115419876.59048 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:05:17 localhost puppet-user[59071]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 1 03:05:17 localhost puppet-user[59071]: (file: /etc/puppet/hiera.yaml)
Feb 1 03:05:17 localhost puppet-user[59071]: Warning: Undefined variable '::deploy_config_name';
Feb 1 03:05:17 localhost puppet-user[59071]: (file & line not available)
Feb 1 03:05:17 localhost puppet-user[59071]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 1 03:05:17 localhost puppet-user[59071]: (file & line not available)
Feb 1 03:05:17 localhost puppet-user[59071]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 1 03:05:17 localhost puppet-user[59071]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 1 03:05:17 localhost puppet-user[59071]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.12 seconds
Feb 1 03:05:17 localhost puppet-user[59071]: Notice: Applied catalog in 0.04 seconds
Feb 1 03:05:17 localhost puppet-user[59071]: Application:
Feb 1 03:05:17 localhost puppet-user[59071]: Initial environment: production
Feb 1 03:05:17 localhost puppet-user[59071]: Converged environment: production
Feb 1 03:05:17 localhost puppet-user[59071]: Run mode: user
Feb 1 03:05:17 localhost puppet-user[59071]: Changes:
Feb 1 03:05:17 localhost puppet-user[59071]: Events:
Feb 1 03:05:17 localhost puppet-user[59071]: Resources:
Feb 1 03:05:17 localhost puppet-user[59071]: Total: 10
Feb 1 03:05:17 localhost puppet-user[59071]: Time:
Feb 1 03:05:17 localhost puppet-user[59071]: Filebucket: 0.00
Feb 1 03:05:17 localhost puppet-user[59071]: Schedule: 0.00
Feb 1 03:05:17 localhost puppet-user[59071]: File: 0.00
Feb 1 03:05:17 localhost puppet-user[59071]: Exec: 0.01
Feb 1 03:05:17 localhost puppet-user[59071]: Augeas: 0.01
Feb 1 03:05:17 localhost puppet-user[59071]: Transaction evaluation: 0.03
Feb 1 03:05:17 localhost puppet-user[59071]: Catalog application: 0.04
Feb 1 03:05:17 localhost puppet-user[59071]: Config retrieval: 0.16
Feb 1 03:05:17 localhost puppet-user[59071]: Last run: 1769933117
Feb 1 03:05:17 localhost puppet-user[59071]: Total: 0.05
Feb 1 03:05:17 localhost puppet-user[59071]: Version:
Feb 1 03:05:17 localhost puppet-user[59071]: Config: 1769933116
Feb 1 03:05:17 localhost puppet-user[59071]: Puppet: 7.10.0
Feb 1 03:05:17 localhost ansible-async_wrapper.py[59052]: Module complete (59052)
Feb 1 03:05:18 localhost ansible-async_wrapper.py[59051]: Done in kid B.
Feb 1 03:05:24 localhost python3[59275]: ansible-ansible.legacy.async_status Invoked with jid=256115419876.59048 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:05:24 localhost python3[59291]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 1 03:05:24 localhost python3[59307]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:05:25 localhost python3[59357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:25 localhost python3[59375]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpx2e0_wj7 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 1 03:05:26 localhost python3[59405]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:27 localhost python3[59508]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 1 03:05:28 localhost python3[59527]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:05:29 localhost systemd[1]: tmp-crun.s4FSzl.mount: Deactivated successfully.
Feb 1 03:05:29 localhost podman[59560]: 2026-02-01 08:05:29.382219953 +0000 UTC m=+0.095801765 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64)
Feb 1 03:05:29 localhost python3[59559]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:05:29 localhost podman[59560]: 2026-02-01 08:05:29.590872377 +0000 UTC m=+0.304454199 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:05:29 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:05:30 localhost python3[59639]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:30 localhost python3[59657]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:30 localhost python3[59719]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:31 localhost python3[59737]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:31 localhost python3[59799]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:32 localhost python3[59817]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:32 localhost python3[59879]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:32 localhost python3[59897]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:33 localhost python3[59927]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:05:33 localhost systemd[1]: Reloading.
Feb 1 03:05:33 localhost systemd-sysv-generator[59959]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:33 localhost systemd-rc-local-generator[59956]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:34 localhost python3[60014]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:34 localhost python3[60032]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:35 localhost python3[60094]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:05:35 localhost python3[60112]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:05:35 localhost python3[60142]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:05:35 localhost systemd[1]: Reloading.
Feb 1 03:05:35 localhost systemd-rc-local-generator[60170]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:35 localhost systemd-sysv-generator[60173]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:37 localhost systemd[1]: Starting Create netns directory...
Feb 1 03:05:37 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 03:05:37 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 03:05:37 localhost systemd[1]: Finished Create netns directory.
Feb 1 03:05:37 localhost python3[60200]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 1 03:05:39 localhost python3[60257]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 1 03:05:40 localhost podman[60427]: 2026-02-01 08:05:40.134053398 +0000 UTC m=+0.068912457 container create ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, version=17.1.13, container_name=nova_virtlogd_wrapper, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git)
Feb 1 03:05:40 localhost podman[60423]: 2026-02-01 08:05:40.149722474 +0000 UTC m=+0.079413525 container create a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 1 03:05:40 localhost podman[60443]: 2026-02-01 08:05:40.175769847 +0000 UTC m=+0.100015133 container create b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, architecture=x86_64, container_name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 1 03:05:40 localhost systemd[1]: Started libpod-conmon-a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876.scope.
Feb 1 03:05:40 localhost podman[60427]: 2026-02-01 08:05:40.098004392 +0000 UTC m=+0.032863471 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:40 localhost podman[60423]: 2026-02-01 08:05:40.104554981 +0000 UTC m=+0.034246022 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 1 03:05:40 localhost podman[60443]: 2026-02-01 08:05:40.111487212 +0000 UTC m=+0.035732508 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 1 03:05:40 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:40 localhost systemd[1]: Started libpod-conmon-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope.
Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bebdb2c854ea17e3835e16ffd23036e649c8de6beed8f121f685f7691f2b861/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:40 localhost systemd[1]: Started libpod-conmon-ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a.scope.
Feb 1 03:05:40 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:40 localhost podman[60450]: 2026-02-01 08:05:40.127889711 +0000 UTC m=+0.045575598 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 1 03:05:40 localhost podman[60423]: 2026-02-01 08:05:40.229079457 +0000 UTC m=+0.158770498 container init a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_init_log, batch=17.1_20260112.1, config_id=tripleo_step3, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:40 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost podman[60443]: 2026-02-01 08:05:40.240392391 +0000 UTC m=+0.164637647 container init b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=) Feb 1 03:05:40 localhost systemd[1]: libpod-a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876.scope: Deactivated successfully. Feb 1 03:05:40 localhost podman[60427]: 2026-02-01 08:05:40.248644762 +0000 UTC m=+0.183503841 container init ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, container_name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 1 03:05:40 localhost podman[60450]: 2026-02-01 08:05:40.254260893 +0000 UTC m=+0.171946760 container create e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:05:40 localhost podman[60427]: 2026-02-01 08:05:40.259765841 +0000 UTC m=+0.194624910 container start ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtlogd_wrapper, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, vcs-type=git, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible) Feb 1 03:05:40 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:05:40 localhost podman[60484]: 2026-02-01 08:05:40.275614822 +0000 UTC m=+0.145935589 container create 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, 
name=collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 1 03:05:40 localhost podman[60423]: 2026-02-01 08:05:40.292411072 +0000 UTC m=+0.222102133 container start a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_init_log, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer 
/var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5) Feb 1 03:05:40 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Feb 1 03:05:40 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. 
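
Aside for readers tracing these entries: each ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG line is the literal podman invocation derived from the config_data dict logged just before it. Below is a minimal Python sketch of that visible mapping, using the ceilometer_init_log dict verbatim from the entry above; it is an illustration only, not tripleo_ansible's actual implementation, which also adds the --label and --log-driver/--log-opt flags seen in the logged command.

    # Sketch only: reproduces the config_data -> CLI mapping visible in the
    # PODMAN-CONTAINER-DEBUG entry above; the real module also emits --label
    # and --log-driver/--log-opt k8s-file flags, omitted here for brevity.
    config_data = {
        'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'],
        'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1',
        'net': 'none',
        'start_order': 0,
        'user': 'root',
        'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z'],
    }

    def podman_run_args(name, cfg):
        args = ['podman', 'run', '--name', name,
                '--conmon-pidfile', f'/run/{name}.pid', '--detach=True']
        if 'net' in cfg:
            args += ['--network', cfg['net']]   # 'net': 'none' -> --network none
        if 'user' in cfg:
            args += ['--user', cfg['user']]
        for volume in cfg.get('volumes', []):
            args += ['--volume', volume]
        args.append(cfg['image'])
        args += cfg.get('command', [])          # trailing argv run inside the container
        return args

    print(' '.join(podman_run_args('ceilometer_init_log', config_data)))
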
Feb 1 03:05:40 localhost podman[60443]: 2026-02-01 08:05:40.304089527 +0000 UTC m=+0.228334803 container start b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, tcib_managed=true) Feb 1 03:05:40 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=2de9c6a2ee669114248af0484a5abc8a --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 1 03:05:40 localhost systemd[1]: Started libpod-conmon-02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.scope. Feb 1 03:05:40 localhost systemd[1]: Created slice User Slice of UID 0. Feb 1 03:05:40 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 1 03:05:40 localhost podman[60484]: 2026-02-01 08:05:40.229188891 +0000 UTC m=+0.099509678 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 1 03:05:40 localhost systemd[1]: libpod-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully. Feb 1 03:05:40 localhost systemd[1]: Started libpod-conmon-e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07.scope. Feb 1 03:05:40 localhost systemd[1]: Started libcrun container. Feb 1 03:05:40 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 1 03:05:40 localhost systemd[1]: Starting User Manager for UID 0... Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21a54a15634d928a79fc839f0a57c8a326674339b247c00b2db3239d9aad6a92/merged/scripts supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21a54a15634d928a79fc839f0a57c8a326674339b247c00b2db3239d9aad6a92/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost systemd[1]: Started libcrun container. 
Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a64c0d311ac08180312bae5177c3bf7c3bcda01d21be994fcede7c05d63d280/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a64c0d311ac08180312bae5177c3bf7c3bcda01d21be994fcede7c05d63d280/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a64c0d311ac08180312bae5177c3bf7c3bcda01d21be994fcede7c05d63d280/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost podman[60450]: 2026-02-01 08:05:40.360351729 +0000 UTC m=+0.278037576 container init e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64) Feb 1 03:05:40 localhost podman[60551]: 2026-02-01 08:05:40.374421827 +0000 UTC m=+0.046612679 container died b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, container_name=rsyslog, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, tcib_managed=true, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:05:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. 
Feb 1 03:05:40 localhost podman[60484]: 2026-02-01 08:05:40.380833602 +0000 UTC m=+0.251154359 container init 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:05:40 localhost podman[60545]: 2026-02-01 08:05:40.387780462 +0000 UTC m=+0.070338829 container died a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, version=17.1.13, container_name=ceilometer_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible) Feb 1 03:05:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:05:40 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 1 03:05:40 localhost podman[60484]: 2026-02-01 08:05:40.409808322 +0000 UTC m=+0.280129089 container start 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, vcs-type=git) Feb 1 03:05:40 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume 
/var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 1 03:05:40 localhost systemd[1]: libpod-e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07.scope: Deactivated successfully. Feb 1 03:05:40 localhost podman[60450]: 2026-02-01 08:05:40.473650474 +0000 UTC m=+0.391336341 container start e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_statedir_owner, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:05:40 localhost podman[60450]: 2026-02-01 08:05:40.476573362 +0000 UTC m=+0.394259239 container attach e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'command': 
'/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step3) Feb 1 03:05:40 localhost podman[60516]: 2026-02-01 08:05:40.51757688 +0000 UTC m=+0.262701829 container cleanup a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_init_log, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:05:40 localhost systemd[1]: libpod-conmon-a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876.scope: Deactivated successfully. Feb 1 03:05:40 localhost systemd[60576]: Queued start job for default target Main User Target. Feb 1 03:05:40 localhost systemd[60576]: Created slice User Application Slice. Feb 1 03:05:40 localhost systemd[60576]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). 
Feb 1 03:05:40 localhost systemd[60576]: Started Daily Cleanup of User's Temporary Directories. Feb 1 03:05:40 localhost systemd[60576]: Reached target Paths. Feb 1 03:05:40 localhost systemd[60576]: Reached target Timers. Feb 1 03:05:40 localhost systemd[60576]: Starting D-Bus User Message Bus Socket... Feb 1 03:05:40 localhost systemd[60576]: Starting Create User's Volatile Files and Directories... Feb 1 03:05:40 localhost podman[60616]: 2026-02-01 08:05:40.483678249 +0000 UTC m=+0.067662379 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, managed_by=tripleo_ansible, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=) Feb 1 03:05:40 localhost systemd[60576]: Finished Create User's Volatile Files and Directories. Feb 1 03:05:40 localhost systemd[60576]: Listening on D-Bus User Message Bus Socket. Feb 1 03:05:40 localhost systemd[60576]: Reached target Sockets. 
Feb 1 03:05:40 localhost systemd[60576]: Reached target Basic System. Feb 1 03:05:40 localhost systemd[60576]: Reached target Main User Target. Feb 1 03:05:40 localhost systemd[60576]: Startup finished in 157ms. Feb 1 03:05:40 localhost systemd[1]: Started User Manager for UID 0. Feb 1 03:05:40 localhost systemd[1]: Started Session c1 of User root. Feb 1 03:05:40 localhost systemd[1]: Started Session c2 of User root. Feb 1 03:05:40 localhost podman[60450]: 2026-02-01 08:05:40.579284976 +0000 UTC m=+0.496970873 container died e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, container_name=nova_statedir_owner, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 1 03:05:40 localhost podman[60574]: 2026-02-01 08:05:40.597415197 +0000 UTC m=+0.252776917 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-rsyslog, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:05:40 localhost systemd[1]: libpod-conmon-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully. 
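
Note the rsyslog timeline in these entries: container start at 08:05:40.304, container died at 08:05:40.374, then conmon scope cleanup. Because the logged run command routed stdout through the k8s-file driver (--log-opt path=/var/log/containers/stdouts/rsyslog.log), that file holds whatever the process printed in those ~70 ms. A small tail sketch, with the path copied verbatim from the command above:

    from collections import deque

    # Stdout of the rsyslog container is captured by the k8s-file log driver;
    # path taken verbatim from the --log-opt in the podman run entry above.
    path = '/var/log/containers/stdouts/rsyslog.log'
    with open(path) as f:
        for line in deque(f, maxlen=20):   # keep only the last 20 lines
            print(line.rstrip())
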
Feb 1 03:05:40 localhost podman[60616]: 2026-02-01 08:05:40.615493547 +0000 UTC m=+0.199477727 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:05:40 localhost podman[60616]: unhealthy Feb 1 03:05:40 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:05:40 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Failed with result 'exit-code'. Feb 1 03:05:40 localhost systemd[1]: session-c2.scope: Deactivated successfully. Feb 1 03:05:40 localhost systemd[1]: session-c1.scope: Deactivated successfully. 
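
The "unhealthy" line and the status=1/FAILURE that follow it come from the transient systemd unit shown starting earlier ("Started /usr/bin/podman healthcheck run 02310e3a…"): systemd periodically execs the container's configured probe ('healthcheck': {'test': '/openstack/healthcheck'} in collectd's config_data), and the unit fails when the probe does. The same probe can be re-run by hand; a minimal sketch, assuming podman is on PATH:

    import subprocess

    # Re-run the probe systemd drives above; rc 0 = healthy, nonzero matches
    # the "unhealthy" output and the unit's status=1/FAILURE result.
    cid = '02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7'  # collectd, from the log
    rc = subprocess.run(['podman', 'healthcheck', 'run', cid]).returncode
    print('healthy' if rc == 0 else f'unhealthy (rc={rc})')
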
Feb 1 03:05:40 localhost podman[60656]: 2026-02-01 08:05:40.679145612 +0000 UTC m=+0.233410058 container cleanup e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, container_name=nova_statedir_owner, tcib_managed=true, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 1 03:05:40 localhost systemd[1]: libpod-conmon-e06a93d96d2cb13c7ad6e994f03f76e1604943d49b6e85f31d7a491753ebff07.scope: Deactivated successfully. 
Feb 1 03:05:40 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Feb 1 03:05:40 localhost podman[60801]: 2026-02-01 08:05:40.898651787 +0000 UTC m=+0.062191913 container create 723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=) Feb 1 03:05:40 localhost systemd[1]: Started libpod-conmon-723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5.scope. Feb 1 03:05:40 localhost systemd[1]: Started libcrun container. 
Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e666fbadd6c4e4928f102dd6c78271a042a375d2964a4c27ffb1b4262f6cceee/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e666fbadd6c4e4928f102dd6c78271a042a375d2964a4c27ffb1b4262f6cceee/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e666fbadd6c4e4928f102dd6c78271a042a375d2964a4c27ffb1b4262f6cceee/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e666fbadd6c4e4928f102dd6c78271a042a375d2964a4c27ffb1b4262f6cceee/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:40 localhost podman[60801]: 2026-02-01 08:05:40.967381816 +0000 UTC m=+0.130921942 container init 723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc.) 
Feb 1 03:05:40 localhost podman[60801]: 2026-02-01 08:05:40.868443828 +0000 UTC m=+0.031983974 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:40 localhost podman[60801]: 2026-02-01 08:05:40.977660469 +0000 UTC m=+0.141200595 container start 723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 1 03:05:41 localhost podman[60844]: 2026-02-01 08:05:41.050107642 +0000 UTC m=+0.075879618 container create 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, container_name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:31:49Z)
Feb 1 03:05:41 localhost systemd[1]: Started libpod-conmon-8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7.scope.
Feb 1 03:05:41 localhost podman[60844]: 2026-02-01 08:05:41.004712622 +0000 UTC m=+0.030484668 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:41 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost podman[60844]: 2026-02-01 08:05:41.131324522 +0000 UTC m=+0.157096538 container init 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtsecretd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 1 03:05:41 localhost podman[60844]: 2026-02-01 08:05:41.140972494 +0000 UTC m=+0.166744520 container start 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, url=https://www.redhat.com, container_name=nova_virtsecretd, architecture=x86_64, tcib_managed=true)
Feb 1 03:05:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4-userdata-shm.mount: Deactivated successfully.
Feb 1 03:05:41 localhost systemd[1]: var-lib-containers-storage-overlay-1bebdb2c854ea17e3835e16ffd23036e649c8de6beed8f121f685f7691f2b861-merged.mount: Deactivated successfully.
Feb 1 03:05:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a045789bd9e3429a97c4110d8cc8f077c8de773afc8661f0b05a6e8b0e149876-userdata-shm.mount: Deactivated successfully.
Feb 1 03:05:41 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:41 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 1 03:05:41 localhost systemd[1]: Started Session c3 of User root.
Feb 1 03:05:41 localhost systemd[1]: session-c3.scope: Deactivated successfully.
Feb 1 03:05:41 localhost podman[60973]: 2026-02-01 08:05:41.56377221 +0000 UTC m=+0.082879951 container create b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, tcib_managed=true)
Feb 1 03:05:41 localhost systemd[1]: Started libpod-conmon-b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.scope.
Feb 1 03:05:41 localhost podman[60973]: 2026-02-01 08:05:41.525400063 +0000 UTC m=+0.044507914 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 1 03:05:41 localhost podman[60994]: 2026-02-01 08:05:41.624142006 +0000 UTC m=+0.107605193 container create 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_virtnodedevd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:05:41 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e75586bb4ab2bd9f0af5c6046e55c6950ec71393a8ae3185df7c4d9365a6d82a/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e75586bb4ab2bd9f0af5c6046e55c6950ec71393a8ae3185df7c4d9365a6d82a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost podman[60994]: 2026-02-01 08:05:41.565344099 +0000 UTC m=+0.048807326 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:41 localhost systemd[1]: Started libpod-conmon-8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e.scope.
Feb 1 03:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:05:41 localhost podman[60973]: 2026-02-01 08:05:41.682725948 +0000 UTC m=+0.201833689 container init b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 1 03:05:41 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:05:41 localhost podman[60994]: 2026-02-01 08:05:41.703705025 +0000 UTC m=+0.187168182 container init 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, container_name=nova_virtnodedevd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 1 03:05:41 localhost podman[60973]: 2026-02-01 08:05:41.711824272 +0000 UTC m=+0.230932043 container start b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:05:41 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=a46ef4c25933bba0e125120095b56cb6 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 1 03:05:41 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 1 03:05:41 localhost systemd[1]: Started Session c4 of User root.
Feb 1 03:05:41 localhost podman[60994]: 2026-02-01 08:05:41.764520225 +0000 UTC m=+0.247983412 container start 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 1 03:05:41 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:41 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 1 03:05:41 localhost systemd[1]: Started Session c5 of User root.
Feb 1 03:05:41 localhost podman[61023]: 2026-02-01 08:05:41.828844821 +0000 UTC m=+0.108513961 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, config_id=tripleo_step3, tcib_managed=true)
Feb 1 03:05:41 localhost systemd[1]: session-c4.scope: Deactivated successfully.
Feb 1 03:05:41 localhost kernel: Loading iSCSI transport class v2.0-870.
Feb 1 03:05:41 localhost systemd[1]: session-c5.scope: Deactivated successfully.
Feb 1 03:05:41 localhost podman[61023]: 2026-02-01 08:05:41.890750153 +0000 UTC m=+0.170419253 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step3)
Feb 1 03:05:41 localhost podman[61023]: unhealthy
Feb 1 03:05:41 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:05:41 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Failed with result 'exit-code'.
Feb 1 03:05:42 localhost podman[61161]: 2026-02-01 08:05:42.385235898 +0000 UTC m=+0.086184301 container create 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_step3, batch=17.1_20260112.1, container_name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 1 03:05:42 localhost systemd[1]: Started libpod-conmon-4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3.scope.
Feb 1 03:05:42 localhost podman[61161]: 2026-02-01 08:05:42.337708193 +0000 UTC m=+0.038656656 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:42 localhost systemd[1]: Started libcrun container.
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 03:05:42 localhost podman[61161]: 2026-02-01 08:05:42.490332654 +0000 UTC m=+0.191281087 container init 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_virtstoraged, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:05:42 localhost podman[61161]: 2026-02-01 08:05:42.505165845 +0000 UTC m=+0.206114268 container start 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:05:42 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 1 03:05:42 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 1 03:05:42 localhost systemd[1]: Started Session c6 of User root.
Feb 1 03:05:42 localhost systemd[1]: session-c6.scope: Deactivated successfully.
Feb 1 03:05:42 localhost podman[61266]: 2026-02-01 08:05:42.979368543 +0000 UTC m=+0.080680483 container create 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtqemud, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:31:49Z) Feb 1 03:05:43 localhost systemd[1]: Started libpod-conmon-7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd.scope. 
Feb 1 03:05:43 localhost systemd[1]: Started libcrun container. Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost podman[61266]: 2026-02-01 08:05:42.933782017 +0000 UTC m=+0.035094007 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost podman[61266]: 2026-02-01 08:05:43.03976466 +0000 UTC m=+0.141076590 container init 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_virtqemud, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 1 03:05:43 localhost podman[61266]: 2026-02-01 08:05:43.049178856 +0000 UTC m=+0.150490786 container start 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container) Feb 1 03:05:43 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:05:43 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 1 03:05:43 localhost systemd[1]: Started Session c7 of User root. Feb 1 03:05:43 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Feb 1 03:05:43 localhost podman[61373]: 2026-02-01 08:05:43.536918606 +0000 UTC m=+0.085629024 container create aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt) Feb 1 03:05:43 localhost podman[61373]: 2026-02-01 08:05:43.485900415 +0000 UTC m=+0.034610893 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:05:43 localhost systemd[1]: 
Started libpod-conmon-aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0.scope. Feb 1 03:05:43 localhost systemd[1]: Started libcrun container. Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 1 03:05:43 localhost podman[61373]: 2026-02-01 08:05:43.624081848 +0000 UTC m=+0.172792266 container init aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 1 03:05:43 localhost podman[61373]: 2026-02-01 08:05:43.6412737 +0000 UTC m=+0.189984118 container start aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, version=17.1.13, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtproxyd, build-date=2026-01-12T23:31:49Z, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1) Feb 1 03:05:43 localhost python3[60257]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:05:43 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 1 03:05:43 localhost systemd[1]: Started Session c8 of User root. Feb 1 03:05:43 localhost systemd[1]: session-c8.scope: Deactivated successfully. 
Feb 1 03:05:44 localhost python3[61453]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:44 localhost python3[61469]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:44 localhost python3[61485]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:45 localhost python3[61501]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:45 localhost python3[61517]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:45 localhost python3[61533]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:45 localhost python3[61549]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:46 localhost python3[61565]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:46 localhost python3[61581]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:46 localhost python3[61597]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:46 localhost python3[61613]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:47 localhost python3[61629]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:47 localhost python3[61645]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:47 localhost python3[61661]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:47 localhost python3[61677]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:48 localhost python3[61693]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:48 localhost python3[61709]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:48 localhost python3[61725]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:05:49 localhost python3[61786]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:49 localhost python3[61815]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None 
local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:50 localhost python3[61844]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:50 localhost python3[61873]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:51 localhost python3[61902]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:51 localhost python3[61931]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:52 localhost python3[61960]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:52 localhost python3[61989]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:53 localhost python3[62018]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.6150875-99664-181262759837739/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:05:53 localhost python3[62034]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 03:05:53 localhost systemd[1]: Reloading. Feb 1 03:05:53 localhost systemd-rc-local-generator[62057]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:05:53 localhost systemd-sysv-generator[62061]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:05:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:05:54 localhost systemd[1]: Stopping User Manager for UID 0... Feb 1 03:05:54 localhost systemd[60576]: Activating special unit Exit the Session... Feb 1 03:05:54 localhost systemd[60576]: Stopped target Main User Target. Feb 1 03:05:54 localhost systemd[60576]: Stopped target Basic System. Feb 1 03:05:54 localhost systemd[60576]: Stopped target Paths. Feb 1 03:05:54 localhost systemd[60576]: Stopped target Sockets. Feb 1 03:05:54 localhost systemd[60576]: Stopped target Timers. Feb 1 03:05:54 localhost systemd[60576]: Stopped Daily Cleanup of User's Temporary Directories. Feb 1 03:05:54 localhost systemd[60576]: Closed D-Bus User Message Bus Socket. Feb 1 03:05:54 localhost systemd[60576]: Stopped Create User's Volatile Files and Directories. Feb 1 03:05:54 localhost systemd[60576]: Removed slice User Application Slice. Feb 1 03:05:54 localhost systemd[60576]: Reached target Shutdown. Feb 1 03:05:54 localhost systemd[60576]: Finished Exit the Session. Feb 1 03:05:54 localhost systemd[60576]: Reached target Exit the Session. Feb 1 03:05:54 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 1 03:05:54 localhost systemd[1]: Stopped User Manager for UID 0. Feb 1 03:05:54 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 1 03:05:54 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 1 03:05:54 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 1 03:05:54 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 1 03:05:54 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 1 03:05:54 localhost python3[62087]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:05:54 localhost systemd[1]: Reloading. Feb 1 03:05:54 localhost systemd-rc-local-generator[62113]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:05:54 localhost systemd-sysv-generator[62119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:05:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:05:55 localhost systemd[1]: Starting collectd container... 
Feb 1 03:05:55 localhost systemd[1]: Started collectd container.
Feb 1 03:05:55 localhost python3[62153]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:05:55 localhost systemd[1]: Reloading.
Feb 1 03:05:55 localhost systemd-rc-local-generator[62177]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:55 localhost systemd-sysv-generator[62184]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:56 localhost systemd[1]: Starting iscsid container...
Feb 1 03:05:56 localhost systemd[1]: Started iscsid container.
Feb 1 03:05:56 localhost python3[62220]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:05:56 localhost systemd[1]: Reloading.
Feb 1 03:05:56 localhost systemd-rc-local-generator[62251]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:56 localhost systemd-sysv-generator[62254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:57 localhost systemd[1]: Starting nova_virtlogd_wrapper container...
Feb 1 03:05:57 localhost systemd[1]: Started nova_virtlogd_wrapper container.
Feb 1 03:05:57 localhost python3[62288]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:05:57 localhost systemd[1]: Reloading.
Feb 1 03:05:57 localhost systemd-rc-local-generator[62315]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:57 localhost systemd-sysv-generator[62319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:57 localhost systemd[1]: Starting nova_virtnodedevd container...
Feb 1 03:05:58 localhost tripleo-start-podman-container[62327]: Creating additional drop-in dependency for "nova_virtnodedevd" (8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e)
Feb 1 03:05:58 localhost systemd[1]: Reloading.
Feb 1 03:05:58 localhost systemd-sysv-generator[62386]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:58 localhost systemd-rc-local-generator[62382]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:58 localhost systemd[1]: Started nova_virtnodedevd container.
Feb 1 03:05:59 localhost python3[62410]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:05:59 localhost systemd[1]: Reloading.
Feb 1 03:05:59 localhost systemd-rc-local-generator[62440]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:59 localhost systemd-sysv-generator[62443]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:59 localhost systemd[1]: Starting nova_virtproxyd container...
Feb 1 03:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:05:59 localhost systemd[1]: tmp-crun.u61pkX.mount: Deactivated successfully.
Feb 1 03:05:59 localhost podman[62462]: 2026-02-01 08:05:59.739192579 +0000 UTC m=+0.094252457 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1)
Feb 1 03:05:59 localhost tripleo-start-podman-container[62450]: Creating additional drop-in dependency for "nova_virtproxyd" (aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0)
Feb 1 03:05:59 localhost systemd[1]: Reloading.
Feb 1 03:05:59 localhost systemd-sysv-generator[62540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:05:59 localhost systemd-rc-local-generator[62537]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:05:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:05:59 localhost podman[62462]: 2026-02-01 08:05:59.94642892 +0000 UTC m=+0.301488818 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 1 03:06:00 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:06:00 localhost systemd[1]: Started nova_virtproxyd container.
Feb 1 03:06:00 localhost python3[62565]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:06:00 localhost systemd[1]: Reloading.
Feb 1 03:06:00 localhost systemd-rc-local-generator[62590]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:00 localhost systemd-sysv-generator[62595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:01 localhost systemd[1]: Starting nova_virtqemud container...
Feb 1 03:06:01 localhost tripleo-start-podman-container[62605]: Creating additional drop-in dependency for "nova_virtqemud" (7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd)
Feb 1 03:06:01 localhost systemd[1]: Reloading.
Feb 1 03:06:01 localhost systemd-sysv-generator[62663]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:01 localhost systemd-rc-local-generator[62660]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:01 localhost systemd[1]: Started nova_virtqemud container.
Feb 1 03:06:02 localhost python3[62689]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:06:03 localhost systemd[1]: Reloading.
Feb 1 03:06:03 localhost systemd-rc-local-generator[62714]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:03 localhost systemd-sysv-generator[62720]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:03 localhost systemd[1]: Starting nova_virtsecretd container...
Feb 1 03:06:04 localhost tripleo-start-podman-container[62729]: Creating additional drop-in dependency for "nova_virtsecretd" (8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7)
Feb 1 03:06:04 localhost systemd[1]: Reloading.
Feb 1 03:06:04 localhost systemd-rc-local-generator[62790]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:04 localhost systemd-sysv-generator[62794]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:04 localhost systemd[1]: Started nova_virtsecretd container.
Feb 1 03:06:05 localhost python3[62814]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:06:05 localhost sshd[62817]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:06:05 localhost systemd[1]: Reloading.
Feb 1 03:06:05 localhost systemd-rc-local-generator[62842]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:05 localhost systemd-sysv-generator[62845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:05 localhost systemd[1]: Starting nova_virtstoraged container...
Feb 1 03:06:05 localhost tripleo-start-podman-container[62855]: Creating additional drop-in dependency for "nova_virtstoraged" (4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3)
Feb 1 03:06:06 localhost systemd[1]: Reloading.
Feb 1 03:06:06 localhost systemd-rc-local-generator[62911]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:06 localhost systemd-sysv-generator[62916]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:06 localhost systemd[1]: Started nova_virtstoraged container.
Feb 1 03:06:06 localhost python3[62938]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:06:07 localhost systemd[1]: Reloading.
Feb 1 03:06:07 localhost systemd-rc-local-generator[62968]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:06:07 localhost systemd-sysv-generator[62971]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:06:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:06:07 localhost systemd[1]: Starting rsyslog container...
Feb 1 03:06:07 localhost systemd[1]: tmp-crun.XLU5ox.mount: Deactivated successfully.
Feb 1 03:06:07 localhost systemd[1]: Started libcrun container.
Feb 1 03:06:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:06:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:06:07 localhost podman[62978]: 2026-02-01 08:06:07.533338711 +0000 UTC m=+0.126376484 container init b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public)
Feb 1 03:06:07 localhost podman[62978]: 2026-02-01 08:06:07.540791418 +0000 UTC m=+0.133829181 container start b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, com.redhat.component=openstack-rsyslog-container)
Feb 1 03:06:07 localhost podman[62978]: rsyslog
Feb 1 03:06:07 localhost systemd[1]: Started rsyslog container.
Feb 1 03:06:07 localhost systemd[1]: libpod-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully.
Feb 1 03:06:07 localhost podman[63009]: 2026-02-01 08:06:07.672678428 +0000 UTC m=+0.044391251 container died b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, build-date=2026-01-12T22:10:09Z, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, container_name=rsyslog, vendor=Red Hat, Inc.) 
Feb 1 03:06:07 localhost podman[63009]: 2026-02-01 08:06:07.693207622 +0000 UTC m=+0.064920435 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, container_name=rsyslog, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, io.openshift.expose-services=, version=17.1.13)
Feb 1 03:06:07 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:06:07 localhost podman[63030]: 2026-02-01 08:06:07.757214288 +0000 UTC m=+0.033695366 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog)
Feb 1 03:06:07 localhost podman[63030]: rsyslog
Feb 1 03:06:07 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 1 03:06:07 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Feb 1 03:06:07 localhost systemd[1]: Stopped rsyslog container.
Feb 1 03:06:07 localhost systemd[1]: Starting rsyslog container...
Feb 1 03:06:07 localhost systemd[1]: Started libcrun container.
Feb 1 03:06:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:06:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:06:08 localhost podman[63057]: 2026-02-01 08:06:08.016674758 +0000 UTC m=+0.125565550 container init b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 1 03:06:08 localhost podman[63057]: 2026-02-01 08:06:08.026699322 +0000 UTC m=+0.135590114 container start b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-rsyslog-container, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true)
Feb 1 03:06:08 localhost podman[63057]: rsyslog
Feb 1 03:06:08 localhost systemd[1]: Started rsyslog container.
Feb 1 03:06:08 localhost python3[63063]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:06:08 localhost systemd[1]: libpod-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully.
Feb 1 03:06:08 localhost podman[63080]: 2026-02-01 08:06:08.211857313 +0000 UTC m=+0.057957394 container died b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 1 03:06:08 localhost podman[63080]: 2026-02-01 08:06:08.233283383 +0000 UTC m=+0.079383404 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 1 03:06:08 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:06:08 localhost podman[63093]: 2026-02-01 08:06:08.305780198 +0000 UTC m=+0.043457442 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step3, container_name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:06:08 localhost podman[63093]: rsyslog
Feb 1 03:06:08 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 1 03:06:08 localhost systemd[1]: var-lib-containers-storage-overlay-cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e-merged.mount: Deactivated successfully.
Feb 1 03:06:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4-userdata-shm.mount: Deactivated successfully.
Feb 1 03:06:08 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Feb 1 03:06:08 localhost systemd[1]: Stopped rsyslog container.
Feb 1 03:06:08 localhost systemd[1]: Starting rsyslog container...
Feb 1 03:06:08 localhost systemd[1]: Started libcrun container.
Feb 1 03:06:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:06:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 1 03:06:08 localhost podman[63150]: 2026-02-01 08:06:08.588802624 +0000 UTC m=+0.101622591 container init b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, release=1766032510)
Feb 1 03:06:08 localhost podman[63150]: 2026-02-01 08:06:08.598334653 +0000 UTC m=+0.111154240 container start b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Feb 1 03:06:08 localhost podman[63150]: rsyslog
Feb 1 03:06:08 localhost systemd[1]: Started rsyslog container.
Feb 1 03:06:08 localhost systemd[1]: libpod-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully.
Feb 1 03:06:08 localhost podman[63180]: 2026-02-01 08:06:08.757801372 +0000 UTC m=+0.039786611 container died b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public)
Feb 1 03:06:08 localhost podman[63180]: 2026-02-01 08:06:08.781263506 +0000 UTC m=+0.063248685 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z)
Feb 1 03:06:08 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:06:08 localhost podman[63215]: 2026-02-01 08:06:08.888353022 +0000 UTC m=+0.079089216 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:09Z, build-date=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, container_name=rsyslog, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:06:08 localhost podman[63215]: rsyslog
Feb 1 03:06:08 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 1 03:06:09 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Feb 1 03:06:09 localhost systemd[1]: Stopped rsyslog container.
Feb 1 03:06:09 localhost systemd[1]: Starting rsyslog container...
Feb 1 03:06:09 localhost systemd[1]: Started libcrun container.
Feb 1 03:06:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 1 03:06:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 1 03:06:09 localhost podman[63255]: 2026-02-01 08:06:09.275094411 +0000 UTC m=+0.129545119 container init b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, distribution-scope=public, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, url=https://www.redhat.com) Feb 1 03:06:09 localhost podman[63255]: 2026-02-01 08:06:09.28292642 +0000 UTC m=+0.137377078 container start b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 1 03:06:09 localhost podman[63255]: rsyslog
Feb 1 03:06:09 localhost systemd[1]: Started rsyslog container.
Feb 1 03:06:09 localhost systemd[1]: libpod-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully.
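The pattern above — Starting, Started, then the libpod scope deactivating within the same second — is one iteration of a restart loop: the unit's config_data carries 'restart': 'always', so systemd relaunches the container each time its main process exits. A minimal inspection sketch for this state (assuming a root shell on the node; the unit and container names are taken from the log itself):

    # Restart policy and how many restarts systemd has performed so far
    systemctl show tripleo_rsyslog.service -p Restart -p NRestarts -p ExecMainStatus

    # Exit code of the container's last run (the log shows status=1)
    podman inspect rsyslog --format '{{.State.ExitCode}}'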
Feb 1 03:06:09 localhost podman[63297]: 2026-02-01 08:06:09.449168954 +0000 UTC m=+0.053896479 container died b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog) Feb 1 03:06:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4-userdata-shm.mount: Deactivated successfully. Feb 1 03:06:09 localhost systemd[1]: var-lib-containers-storage-overlay-cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e-merged.mount: Deactivated successfully. 
Feb 1 03:06:09 localhost podman[63297]: 2026-02-01 08:06:09.480889759 +0000 UTC m=+0.085617234 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container)
Feb 1 03:06:09 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:06:09 localhost python3[63294]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005604212 step=3 update_config_hash_only=False
Feb 1 03:06:09 localhost podman[63312]: 2026-02-01 08:06:09.574684601 +0000 UTC m=+0.066545855 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:09Z, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 1 03:06:09 localhost podman[63312]: rsyslog
Feb 1 03:06:09 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 1 03:06:09 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Feb 1 03:06:09 localhost systemd[1]: Stopped rsyslog container.
Feb 1 03:06:09 localhost systemd[1]: Starting rsyslog container...
Feb 1 03:06:10 localhost systemd[1]: Started libcrun container.
Feb 1 03:06:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 1 03:06:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 1 03:06:10 localhost podman[63340]: 2026-02-01 08:06:10.014454343 +0000 UTC m=+0.123889448 container init b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, container_name=rsyslog, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:09Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z) Feb 1 03:06:10 localhost podman[63340]: 2026-02-01 08:06:10.021517648 +0000 UTC m=+0.130952763 container start b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:09Z, batch=17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13)
Feb 1 03:06:10 localhost podman[63340]: rsyslog
Feb 1 03:06:10 localhost systemd[1]: Started rsyslog container.
Feb 1 03:06:10 localhost python3[63339]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:06:10 localhost systemd[1]: libpod-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4.scope: Deactivated successfully.
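The ansible-file task above creates /var/log/containers/stdouts, the directory where TripleO conventionally lands each container's captured stdout/stderr; that capture (or podman's own log) is the quickest place to see why the rsyslog process keeps exiting with status 1. A sketch, assuming the conventional TripleO layout (the exact file name is an assumption and may differ by release):

    # Stdout/stderr captured by TripleO's container logging, if present
    tail -n 50 /var/log/containers/stdouts/rsyslog.log

    # podman retains the last run's output even after the container dies
    podman logs --tail 50 rsyslog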
Feb 1 03:06:10 localhost podman[63363]: 2026-02-01 08:06:10.168059573 +0000 UTC m=+0.030462077 container died b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1766032510, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:09Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog) Feb 1 03:06:10 localhost podman[63363]: 2026-02-01 08:06:10.186492413 +0000 UTC m=+0.048894897 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, container_name=rsyslog, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 1 03:06:10 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:06:10 localhost podman[63376]: 2026-02-01 08:06:10.260297868 +0000 UTC m=+0.053157447 container cleanup b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=rsyslog, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '2de9c6a2ee669114248af0484a5abc8a'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, build-date=2026-01-12T22:10:09Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:06:10 localhost podman[63376]: rsyslog
Feb 1 03:06:10 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 1 03:06:10 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Feb 1 03:06:10 localhost systemd[1]: Stopped rsyslog container.
Feb 1 03:06:10 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Feb 1 03:06:10 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 1 03:06:10 localhost systemd[1]: Failed to start rsyslog container.
Feb 1 03:06:10 localhost python3[63404]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 1 03:06:10 localhost systemd[1]: var-lib-containers-storage-overlay-cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e-merged.mount: Deactivated successfully.
Feb 1 03:06:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b86c86db7cfdd05b500b98bedf76ac07861490376223bb73e33ba09a75de07d4-userdata-shm.mount: Deactivated successfully.
Feb 1 03:06:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:06:11 localhost systemd[1]: tmp-crun.lu0AZh.mount: Deactivated successfully.
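At restart counter 5 the unit trips systemd's start rate limit ('Start request repeated too quickly'), so systemd gives up and leaves tripleo_rsyslog.service failed despite the 'restart': 'always' policy; the healthcheck entries that follow show the other step-3 containers carrying on normally. Once the underlying rsyslog configuration problem is fixed, the failed state and the rate-limit counter have to be cleared by hand. A sketch using standard systemd commands:

    # Review the failure, then clear the failed/rate-limited state and retry
    systemctl status tripleo_rsyslog.service
    systemctl reset-failed tripleo_rsyslog.service
    systemctl start tripleo_rsyslog.service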
Feb 1 03:06:11 localhost podman[63405]: 2026-02-01 08:06:11.7176145 +0000 UTC m=+0.082457538 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z) Feb 1 03:06:11 localhost podman[63405]: 2026-02-01 08:06:11.728514341 +0000 UTC m=+0.093357369 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true) Feb 1 03:06:11 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:06:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:06:12 localhost podman[63426]: 2026-02-01 08:06:12.725365031 +0000 UTC m=+0.084967905 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:06:12 localhost podman[63426]: 2026-02-01 08:06:12.761641445 +0000 UTC m=+0.121244299 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510) Feb 1 03:06:12 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:06:30 localhost systemd[1]: tmp-crun.DRym1f.mount: Deactivated successfully. 
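The recurring 'Started /usr/bin/podman healthcheck run <id>' units around these entries are systemd-driven healthcheck timers: each one fires the test command from the container's config_data ('/openstack/healthcheck' here), and the podman event records health_status moving from starting to healthy. A sketch for running the same check by hand (the Go-template field name is an assumption; depending on the podman version it is exposed as .State.Healthcheck or .State.Health):

    # Execute the configured healthcheck once; exit status 0 means healthy
    podman healthcheck run collectd && echo healthy

    # Read the recorded health state from the container's metadata
    podman inspect collectd --format '{{.State.Healthcheck.Status}}'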
Feb 1 03:06:30 localhost podman[63524]: 2026-02-01 08:06:30.745161414 +0000 UTC m=+0.097279868 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, vcs-type=git, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:06:30 localhost podman[63524]: 2026-02-01 08:06:30.926170088 +0000 UTC m=+0.278288512 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Feb 1 03:06:30 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:06:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:06:42 localhost systemd[1]: tmp-crun.vDYZu8.mount: Deactivated successfully. 
Feb 1 03:06:42 localhost podman[63552]: 2026-02-01 08:06:42.734400856 +0000 UTC m=+0.095710169 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3) Feb 1 03:06:42 localhost podman[63552]: 2026-02-01 08:06:42.749357378 +0000 UTC m=+0.110666731 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, release=1766032510) Feb 1 03:06:42 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:06:43 localhost systemd[1]: tmp-crun.8QINc1.mount: Deactivated successfully. 
Feb 1 03:06:43 localhost podman[63572]: 2026-02-01 08:06:43.726123607 +0000 UTC m=+0.088086724 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, container_name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=) Feb 1 03:06:43 localhost podman[63572]: 2026-02-01 08:06:43.761638145 +0000 UTC m=+0.123601292 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid) Feb 1 03:06:43 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:07:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:07:01 localhost podman[63591]: 2026-02-01 08:07:01.717537441 +0000 UTC m=+0.079357803 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5) Feb 1 03:07:01 localhost podman[63591]: 2026-02-01 08:07:01.904298794 +0000 UTC m=+0.266119136 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, release=1766032510) Feb 1 03:07:01 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:07:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. 
Feb 1 03:07:13 localhost podman[63620]: 2026-02-01 08:07:13.73243395 +0000 UTC m=+0.091782958 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container) Feb 1 03:07:13 localhost podman[63620]: 2026-02-01 08:07:13.740812989 +0000 UTC m=+0.100161977 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:07:13 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:07:14 localhost podman[63639]: 2026-02-01 08:07:14.746903934 +0000 UTC m=+0.103624964 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:07:14 localhost podman[63639]: 2026-02-01 08:07:14.761405012 +0000 UTC m=+0.118126072 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=iscsid, io.buildah.version=1.41.5, distribution-scope=public) Feb 1 03:07:14 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:07:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:07:32 localhost podman[63733]: 2026-02-01 08:07:32.721445387 +0000 UTC m=+0.084330888 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1766032510) Feb 1 03:07:32 localhost podman[63733]: 2026-02-01 08:07:32.940097426 +0000 UTC m=+0.302982947 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 
17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, tcib_managed=true, release=1766032510, build-date=2026-01-12T22:10:14Z) Feb 1 03:07:32 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. 
Feb 1 03:07:44 localhost podman[63762]: 2026-02-01 08:07:44.732937919 +0000 UTC m=+0.095495723 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, container_name=collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, architecture=x86_64) Feb 1 03:07:44 localhost podman[63762]: 2026-02-01 08:07:44.769174849 +0000 UTC m=+0.131732643 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd) Feb 1 03:07:44 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:07:44 localhost podman[63780]: 2026-02-01 08:07:44.869292753 +0000 UTC m=+0.067980171 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 1 03:07:44 localhost podman[63780]: 2026-02-01 08:07:44.901763886 +0000 UTC m=+0.100451234 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid) Feb 1 03:07:44 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:08:04 localhost podman[63798]: 2026-02-01 08:08:04.007394057 +0000 UTC m=+0.367083466 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:08:04 localhost podman[63798]: 2026-02-01 08:08:04.235722575 +0000 UTC m=+0.595411994 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 1 03:08:04 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:08:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:08:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:08:15 localhost podman[63828]: 2026-02-01 08:08:15.721889641 +0000 UTC m=+0.080746786 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step3, release=1766032510) Feb 1 03:08:15 localhost systemd[1]: tmp-crun.cjEYWu.mount: Deactivated successfully. 
Feb 1 03:08:15 localhost podman[63829]: 2026-02-01 08:08:15.782904717 +0000 UTC m=+0.137058337 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:08:15 localhost podman[63828]: 2026-02-01 08:08:15.808366734 +0000 UTC m=+0.167223839 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team) Feb 1 03:08:15 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:08:15 localhost podman[63829]: 2026-02-01 08:08:15.8188848 +0000 UTC m=+0.173038410 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 1 03:08:15 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:08:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:08:34 localhost podman[63946]: 2026-02-01 08:08:34.720674931 +0000 UTC m=+0.083604265 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:08:34 localhost podman[63946]: 2026-02-01 08:08:34.919335551 +0000 UTC m=+0.282264935 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:08:34 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:08:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:08:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:08:46 localhost podman[63975]: 2026-02-01 08:08:46.725669171 +0000 UTC m=+0.084260675 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true) Feb 1 03:08:46 localhost podman[63975]: 2026-02-01 08:08:46.762327804 +0000 UTC m=+0.120919238 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc.) Feb 1 03:08:46 localhost systemd[1]: tmp-crun.gYYVpd.mount: Deactivated successfully. 
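The config_data= label in these podman events embeds the full TripleO container definition as a Python-literal dict (single-quoted strings, True/False booleans), not JSON, so json.loads() rejects it. A minimal sketch for recovering it safely, using a truncated value taken from the entries above:

```python
import ast

# A config_data value as it appears in the events above (truncated here).
config_data = ("{'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, "
               "'healthcheck': {'test': '/openstack/healthcheck'}, "
               "'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', "
               "'privileged': False, 'restart': 'always'}")

# ast.literal_eval parses Python literals only (no code execution), which is
# exactly what this label contains; json.loads would choke on the quoting
# style and on False.
cfg = ast.literal_eval(config_data)
print(cfg["healthcheck"]["test"])   # -> /openstack/healthcheck
```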
Feb 1 03:08:46 localhost podman[63976]: 2026-02-01 08:08:46.772428277 +0000 UTC m=+0.126861282 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:08:46 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:08:46 localhost podman[63976]: 2026-02-01 08:08:46.78450651 +0000 UTC m=+0.138939535 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:08:46 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:09:04 localhost sshd[64014]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:09:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
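Each healthcheck cycle above follows the same shape: systemd starts a transient unit named after the container ID, the unit runs /usr/bin/podman healthcheck run <id>, podman emits a health_status event plus an exec_died event for the exec session, and the unit then deactivates. A sketch that invokes the same check by hand, assuming only that the check command exits 0 on success:

```python
import subprocess

def run_healthcheck(container: str) -> str:
    # Same command the transient units above execute; the container argument
    # is any name or ID that `podman ps` reports (e.g. "collectd").
    proc = subprocess.run(["podman", "healthcheck", "run", container])
    # Exit code 0 indicates the configured check (here /openstack/healthcheck)
    # passed; non-zero indicates an unhealthy or failed check.
    return "healthy" if proc.returncode == 0 else "unhealthy"

print(run_healthcheck("collectd"))
```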
Feb 1 03:09:05 localhost podman[64016]: 2026-02-01 08:09:05.723534415 +0000 UTC m=+0.084562955 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public) Feb 1 03:09:05 localhost podman[64016]: 2026-02-01 08:09:05.947230159 +0000 UTC m=+0.308258699 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:09:05 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:09:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:09:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
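For pulling these events out of the raw journal, the useful fields are the 64-hex container ID and the name= and health_status= labels inside the parenthesized list. The list is comma-separated, but values such as "vendor=Red Hat, Inc." contain commas themselves, so keying on the label names is safer than splitting naively. A heuristic sketch:

```python
import re

# Matches podman's "container health_status <64-hex-id> (labels...)" events.
EVENT = re.compile(r"container health_status (?P<cid>[0-9a-f]{64}) \((?P<labels>.*)\)")

def health_events(lines):
    for line in lines:
        m = EVENT.search(line)
        if not m:
            continue
        labels = m.group("labels")
        # Anchor on ", name=" (or a leading "(") so that keys like
        # io.k8s.display-name= are not picked up by accident.
        name = re.search(r"(?:\(|, )name=([^,]+)", "(" + labels)
        status = re.search(r"health_status=(\w+)", labels)
        if name and status:
            yield m.group("cid")[:12], name.group(1), status.group(1)

# e.g. health_events(open("/var/log/messages")) yields tuples like
# ("02310e3ab1f8", "collectd", "healthy")
```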
Feb 1 03:09:17 localhost podman[64048]: 2026-02-01 08:09:17.731376724 +0000 UTC m=+0.086794614 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3) Feb 1 03:09:17 localhost podman[64048]: 2026-02-01 08:09:17.772403752 +0000 UTC m=+0.127821632 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.expose-services=, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible) Feb 1 03:09:17 localhost podman[64047]: 2026-02-01 08:09:17.783232177 +0000 UTC m=+0.138825882 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5) Feb 1 03:09:17 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:09:17 localhost podman[64047]: 2026-02-01 08:09:17.792205084 +0000 UTC m=+0.147798799 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:09:17 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:09:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:09:36 localhost podman[64163]: 2026-02-01 08:09:36.727408099 +0000 UTC m=+0.089835796 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 1 03:09:36 localhost podman[64163]: 2026-02-01 08:09:36.941777696 +0000 UTC m=+0.304205403 container exec_died 
5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, architecture=x86_64) Feb 1 03:09:36 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:09:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:09:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:09:48 localhost podman[64194]: 2026-02-01 08:09:48.724584489 +0000 UTC m=+0.082148474 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:09:48 localhost podman[64194]: 2026-02-01 08:09:48.762118477 +0000 UTC m=+0.119682493 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team) Feb 1 03:09:48 localhost podman[64193]: 2026-02-01 08:09:48.773056129 +0000 UTC m=+0.130430387 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:09:48 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:09:48 localhost podman[64193]: 2026-02-01 08:09:48.783532235 +0000 UTC m=+0.140906533 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:09:48 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:10:04 localhost python3[64274]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:10:04 localhost python3[64319]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933403.8550024-106826-199025856396694/source _original_basename=tmp4bkybolo follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:10:05 localhost python3[64381]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:10:06 localhost python3[64424]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933405.5747032-106923-218178475684129/source _original_basename=tmplcdykcci follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:10:06 localhost python3[64486]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:10:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:10:07 localhost systemd[1]: tmp-crun.IaqUAR.mount: Deactivated successfully. 
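The ansible-ansible.legacy.stat / ansible.legacy.copy pairs above are the copy module's usual idempotence flow: stat the destination with get_checksum=True (checksum_algorithm=sha1) and rewrite the file only when the SHA-1 differs from the source's. The comparison it performs amounts to:

```python
import hashlib

def sha1sum(path: str) -> str:
    # Stream the file in chunks rather than reading it whole.
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# e.g. compare against the checksum Ansible logged for config_step.json above:
logged = "ee48fb03297eb703b1954c8852d0f67fab51dac1"
print(sha1sum("/etc/puppet/hieradata/config_step.json") == logged)
```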
Feb 1 03:10:07 localhost podman[64530]: 2026-02-01 08:10:07.138423659 +0000 UTC m=+0.092360830 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com) Feb 1 03:10:07 localhost python3[64529]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933406.4690678-106977-14430107314213/source _original_basename=tmprhzx_tm7 follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:10:07 localhost podman[64530]: 2026-02-01 08:10:07.36442085 +0000 UTC m=+0.318357971 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, 
vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 1 03:10:07 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:10:07 localhost python3[64617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:10:08 localhost python3[64660]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933407.4074082-107039-269532121669937/source _original_basename=tmpjwrxzqw5 follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:10:08 localhost python3[64690]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 1 03:10:08 localhost systemd[1]: Reloading. 
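The recover script and unit contents themselves are not logged (only their paths and checksums appear), so the following is a hypothetical sketch of the recovery logic such a timer plausibly drives, inferred from the unit names and the "Check and recover tripleo_nova_virtqemud every 10m" description that follows:

```python
import subprocess

def recover(unit: str) -> None:
    # Hypothetical reconstruction of what a job like
    # /usr/libexec/recover_tripleo_nova_virtqemud.sh (contents not shown in
    # this log) might do: probe the unit and restart it if it is not active.
    probe = subprocess.run(["systemctl", "is-active", "--quiet", unit])
    if probe.returncode != 0:          # is-active exits non-zero when inactive
        subprocess.run(["systemctl", "restart", unit], check=True)

recover("tripleo_nova_virtqemud.service")
```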
Feb 1 03:10:08 localhost systemd-sysv-generator[64719]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:08 localhost systemd-rc-local-generator[64713]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:09 localhost systemd[1]: Reloading.
Feb 1 03:10:09 localhost systemd-sysv-generator[64757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:09 localhost systemd-rc-local-generator[64753]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:09 localhost python3[64780]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:10:09 localhost systemd[1]: Reloading.
Feb 1 03:10:10 localhost systemd-sysv-generator[64808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:10 localhost systemd-rc-local-generator[64803]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:10 localhost systemd[1]: Reloading.
Feb 1 03:10:10 localhost systemd-rc-local-generator[64842]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:10 localhost systemd-sysv-generator[64847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:10 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Feb 1 03:10:10 localhost python3[64870]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 03:10:11 localhost systemd[1]: Reloading.
Feb 1 03:10:11 localhost systemd-rc-local-generator[64899]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:11 localhost systemd-sysv-generator[64903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:11 localhost python3[64955]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:12 localhost python3[64998]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933411.5033426-107183-9158212727213/source _original_basename=tmpgqu0jed8 follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:12 localhost python3[65028]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:10:12 localhost systemd[1]: Reloading.
Feb 1 03:10:12 localhost systemd-sysv-generator[65059]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:12 localhost systemd-rc-local-generator[65055]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:13 localhost systemd[1]: Reached target tripleo_nova_libvirt.target.
Feb 1 03:10:13 localhost python3[65083]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:10:15 localhost ansible-async_wrapper.py[65255]: Invoked with 611317420877 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933414.6784697-107316-247662890034371/AnsiballZ_command.py _
Feb 1 03:10:15 localhost ansible-async_wrapper.py[65258]: Starting module and watcher
Feb 1 03:10:15 localhost ansible-async_wrapper.py[65258]: Start watching 65259 (3600)
Feb 1 03:10:15 localhost ansible-async_wrapper.py[65259]: Start module (65259)
Feb 1 03:10:15 localhost ansible-async_wrapper.py[65255]: Return async_wrapper task started.
Feb 1 03:10:15 localhost python3[65279]: ansible-ansible.legacy.async_status Invoked with jid=611317420877.65255 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 1 03:10:19 localhost puppet-user[65266]: (file: /etc/puppet/hiera.yaml)
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: Undefined variable '::deploy_config_name';
Feb 1 03:10:19 localhost puppet-user[65266]: (file & line not available)
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 1 03:10:19 localhost puppet-user[65266]: (file & line not available)
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:10:19 localhost puppet-user[65266]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:10:19 localhost puppet-user[65266]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:10:19 localhost puppet-user[65266]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:10:19 localhost puppet-user[65266]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:10:19 localhost puppet-user[65266]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:10:19 localhost puppet-user[65266]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:10:19 localhost puppet-user[65266]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:10:19 localhost puppet-user[65266]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:10:19 localhost puppet-user[65266]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:10:19 localhost puppet-user[65266]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:10:19 localhost puppet-user[65266]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:10:19 localhost puppet-user[65266]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 1 03:10:19 localhost puppet-user[65266]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.28 seconds
Feb 1 03:10:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:10:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:10:19 localhost podman[65397]: 2026-02-01 08:10:19.742643248 +0000 UTC m=+0.094609225 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 1 03:10:19 localhost podman[65397]: 2026-02-01 08:10:19.760435795 +0000 UTC m=+0.112401842 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, version=17.1.13, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z)
Feb 1 03:10:19 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:10:19 localhost podman[65398]: 2026-02-01 08:10:19.82570867 +0000 UTC m=+0.178266326 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 1 03:10:19 localhost podman[65398]: 2026-02-01 08:10:19.864389177 +0000 UTC m=+0.216946803 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid)
Feb 1 03:10:19 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:10:20 localhost ansible-async_wrapper.py[65258]: 65259 still running (3600)
Feb 1 03:10:25 localhost ansible-async_wrapper.py[65258]: 65259 still running (3595)
Feb 1 03:10:25 localhost python3[65470]: ansible-ansible.legacy.async_status Invoked with jid=611317420877.65255 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:10:28 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 03:10:28 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 1 03:10:28 localhost systemd[1]: Reloading.
Feb 1 03:10:28 localhost systemd-rc-local-generator[65603]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:28 localhost systemd-sysv-generator[65608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:29 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 1 03:10:29 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 1 03:10:29 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 1 03:10:29 localhost systemd[1]: man-db-cache-update.service: Consumed 1.208s CPU time.
Feb 1 03:10:29 localhost systemd[1]: run-r8e46d43ddd864b1d98a9edb66ba8ca3b.service: Deactivated successfully.
Feb 1 03:10:30 localhost ansible-async_wrapper.py[65258]: 65259 still running (3590)
Feb 1 03:10:30 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Feb 1 03:10:30 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}08191e1608782ed50454f55fa73738a60685476a560a18d5ee77b4a11ecc46f2'
Feb 1 03:10:30 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Feb 1 03:10:30 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Feb 1 03:10:30 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Feb 1 03:10:30 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Feb 1 03:10:35 localhost ansible-async_wrapper.py[65258]: 65259 still running (3585)
Feb 1 03:10:35 localhost puppet-user[65266]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Feb 1 03:10:35 localhost systemd[1]: Reloading.
Feb 1 03:10:35 localhost systemd-sysv-generator[66771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:35 localhost systemd-rc-local-generator[66768]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:36 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Feb 1 03:10:36 localhost snmpd[66800]: Can't find directory of RPM packages
Feb 1 03:10:36 localhost snmpd[66800]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Feb 1 03:10:36 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Feb 1 03:10:36 localhost systemd[1]: Reloading.
Feb 1 03:10:36 localhost python3[66799]: ansible-ansible.legacy.async_status Invoked with jid=611317420877.65255 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:10:36 localhost systemd-sysv-generator[66827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:36 localhost systemd-rc-local-generator[66824]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:36 localhost systemd[1]: Reloading.
Feb 1 03:10:36 localhost systemd-sysv-generator[66866]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:36 localhost systemd-rc-local-generator[66863]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:36 localhost puppet-user[65266]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Feb 1 03:10:36 localhost puppet-user[65266]: Notice: Applied catalog in 17.44 seconds
Feb 1 03:10:36 localhost puppet-user[65266]: Application:
Feb 1 03:10:36 localhost puppet-user[65266]: Initial environment: production
Feb 1 03:10:36 localhost puppet-user[65266]: Converged environment: production
Feb 1 03:10:36 localhost puppet-user[65266]: Run mode: user
Feb 1 03:10:36 localhost puppet-user[65266]: Changes:
Feb 1 03:10:36 localhost puppet-user[65266]: Total: 8
Feb 1 03:10:36 localhost puppet-user[65266]: Events:
Feb 1 03:10:36 localhost puppet-user[65266]: Success: 8
Feb 1 03:10:36 localhost puppet-user[65266]: Total: 8
Feb 1 03:10:36 localhost puppet-user[65266]: Resources:
Feb 1 03:10:36 localhost puppet-user[65266]: Restarted: 1
Feb 1 03:10:36 localhost puppet-user[65266]: Changed: 8
Feb 1 03:10:36 localhost puppet-user[65266]: Out of sync: 8
Feb 1 03:10:36 localhost puppet-user[65266]: Total: 19
Feb 1 03:10:36 localhost puppet-user[65266]: Time:
Feb 1 03:10:36 localhost puppet-user[65266]: Filebucket: 0.00
Feb 1 03:10:36 localhost puppet-user[65266]: Schedule: 0.00
Feb 1 03:10:36 localhost puppet-user[65266]: Augeas: 0.01
Feb 1 03:10:36 localhost puppet-user[65266]: File: 0.09
Feb 1 03:10:36 localhost puppet-user[65266]: Config retrieval: 0.36
Feb 1 03:10:36 localhost puppet-user[65266]: Service: 1.20
Feb 1 03:10:36 localhost puppet-user[65266]: Package: 10.85
Feb 1 03:10:36 localhost puppet-user[65266]: Transaction evaluation: 17.43
Feb 1 03:10:36 localhost puppet-user[65266]: Catalog application: 17.44
Feb 1 03:10:36 localhost puppet-user[65266]: Last run: 1769933436
Feb 1 03:10:36 localhost puppet-user[65266]: Exec: 5.06
Feb 1 03:10:36 localhost puppet-user[65266]: Total: 17.45
Feb 1 03:10:36 localhost puppet-user[65266]: Version:
Feb 1 03:10:36 localhost puppet-user[65266]: Config: 1769933419
Feb 1 03:10:36 localhost puppet-user[65266]: Puppet: 7.10.0
Feb 1 03:10:36 localhost ansible-async_wrapper.py[65259]: Module complete (65259)
Feb 1 03:10:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:10:37 localhost podman[66873]: 2026-02-01 08:10:37.720677672 +0000 UTC m=+0.076265470 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, release=1766032510, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true)
Feb 1 03:10:37 localhost podman[66873]: 2026-02-01 08:10:37.964087447 +0000 UTC m=+0.319675235 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 1 03:10:37 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:10:40 localhost ansible-async_wrapper.py[65258]: Done in kid B.
Feb 1 03:10:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:10:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 5184 writes, 23K keys, 5184 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5184 writes, 559 syncs, 9.27 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 86 writes, 128 keys, 86 commit groups, 1.0 writes per commit group, ingest: 0.04 MB, 0.00 MB/s#012Interval WAL: 86 writes, 43 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:10:46 localhost python3[66918]: ansible-ansible.legacy.async_status Invoked with jid=611317420877.65255 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:10:47 localhost python3[66934]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 1 03:10:47 localhost python3[66950]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:10:48 localhost python3[67000]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:48 localhost python3[67018]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpmn9gu1hk recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 1 03:10:48 localhost python3[67048]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:10:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4440 writes, 20K keys, 4440 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4440 writes, 499 syncs, 8.90 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 28 writes, 55 keys, 28 commit groups, 1.0 writes per commit group, ingest: 0.02 MB, 0.00 MB/s#012Interval WAL: 28 writes, 14 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:10:49 localhost systemd[1]: tmp-crun.so513p.mount: Deactivated successfully.
Feb 1 03:10:49 localhost podman[67152]: 2026-02-01 08:10:49.987117429 +0000 UTC m=+0.086031640 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13)
Feb 1 03:10:50 localhost podman[67152]: 2026-02-01 08:10:50.002852439 +0000 UTC m=+0.101766610 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 1 03:10:50 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:10:50 localhost python3[67151]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 1 03:10:50 localhost systemd[1]: tmp-crun.xZTGEA.mount: Deactivated successfully.
Feb 1 03:10:50 localhost podman[67153]: 2026-02-01 08:10:50.083374767 +0000 UTC m=+0.179870909 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z)
Feb 1 03:10:50 localhost podman[67153]: 2026-02-01 08:10:50.098356122 +0000 UTC m=+0.194852254 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible)
Feb 1 03:10:50 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:10:50 localhost python3[67208]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:51 localhost python3[67240]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:10:52 localhost python3[67290]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:52 localhost python3[67308]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:53 localhost python3[67370]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:53 localhost python3[67388]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:53 localhost python3[67450]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:54 localhost python3[67468]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:54 localhost python3[67530]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:55 localhost python3[67548]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:55 localhost python3[67578]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:10:55 localhost systemd[1]: Reloading.
Feb 1 03:10:55 localhost systemd-sysv-generator[67605]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:55 localhost systemd-rc-local-generator[67599]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:56 localhost python3[67664]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:56 localhost python3[67682]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:57 localhost python3[67744]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:10:57 localhost python3[67762]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:10:58 localhost python3[67792]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:10:58 localhost systemd[1]: Reloading.
Feb 1 03:10:58 localhost systemd-sysv-generator[67819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:10:58 localhost systemd-rc-local-generator[67815]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:10:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:10:58 localhost systemd[1]: Starting Create netns directory...
Feb 1 03:10:58 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 03:10:58 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 03:10:58 localhost systemd[1]: Finished Create netns directory.
Feb 1 03:10:59 localhost python3[67848]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 1 03:11:00 localhost python3[67906]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 1 03:11:01 localhost podman[68063]: 2026-02-01 08:11:01.035633233 +0000 UTC m=+0.071569533 container create d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=configure_cms_options, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4)
Feb 1 03:11:01 localhost podman[68073]: 2026-02-01 08:11:01.066496561 +0000 UTC m=+0.092608888 container create 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible)
Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a.scope.
Feb 1 03:11:01 localhost podman[68092]: 2026-02-01 08:11:01.091709644 +0000 UTC m=+0.096678992 container create 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 1 03:11:01 localhost podman[68063]: 2026-02-01 08:11:00.993296845 +0000 UTC m=+0.029233175 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 1 03:11:01 localhost podman[68073]: 2026-02-01 08:11:01.006113359 +0000 UTC m=+0.032225726 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.scope.
Feb 1 03:11:01 localhost systemd[1]: Started libcrun container.
Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.scope. Feb 1 03:11:01 localhost systemd[1]: Started libcrun container. Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51d00c3910eefc6cf51faff5d5baa51463b7218ae9021f1d456ee4b881969174/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost podman[68097]: 2026-02-01 08:11:01.127454414 +0000 UTC m=+0.125974150 container create 7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_libvirt_init_secret, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public) Feb 1 03:11:01 localhost systemd[1]: Started libcrun container. 
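
Annotation: the recurring kernel lines "xfs filesystem being remounted at ... supports timestamps until 2038 (0x7fffffff)" are informational, not errors. They appear when the XFS filesystem backing /var/lib/containers/storage was created without the bigtime feature, capping inode timestamps at 2038-01-19. A quick check, sketched under the assumption that /var/lib/containers is its own XFS mount and xfsprogs is installed:

    # bigtime=0 -> timestamps capped at 2038 (matches the warnings above)
    # bigtime=1 -> large timestamps enabled; the warning would not be printed
    xfs_info /var/lib/containers | grep -o 'bigtime=[01]'
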
Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d93858986242c6937d720b37f9e41284a4bd20ca36f5e363c143d874dccc14fb/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost podman[68092]: 2026-02-01 08:11:01.038776806 +0000 UTC m=+0.043746154 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:11:01 localhost podman[68073]: 2026-02-01 08:11:01.151199178 +0000 UTC m=+0.177311515 container init 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:11:01 localhost podman[68097]: 2026-02-01 08:11:01.056780391 +0000 UTC m=+0.055300137 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 1 03:11:01 localhost podman[68112]: 2026-02-01 08:11:01.057848616 +0000 UTC m=+0.041745309 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 1 03:11:01 localhost podman[68112]: 2026-02-01 
08:11:01.157349101 +0000 UTC m=+0.141245784 container create 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, tcib_managed=true) Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. 
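
Annotation: each "Started /usr/bin/podman healthcheck run <id>" line is a transient systemd unit named after the full 64-character container ID; the later "<id>.service: Deactivated successfully." lines are those same units completing one probe. A way to inspect them while the node is up (a sketch; the glob is the container ID taken from the log, and the 30s default interval is an assumption since no --health-interval appears in the run commands):

    # list the transient healthcheck unit(s) for the logrotate_crond container
    systemctl list-units --all '93b98f49cf006ead*'
    # podman drives these via a matching transient timer; with no explicit
    # --health-interval the probe defaults to every 30 seconds
    systemctl status '93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service'
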
Feb 1 03:11:01 localhost podman[68092]: 2026-02-01 08:11:01.16064342 +0000 UTC m=+0.165612798 container init 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-type=git, release=1766032510, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:11:01 localhost podman[68073]: 2026-02-01 08:11:01.18488346 +0000 UTC m=+0.210995787 container start 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, release=1766032510) Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.scope. 
Feb 1 03:11:01 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=56f18c3ee04e8cd5761527c0820290d2 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. 
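
Annotation: the PODMAN-CONTAINER-DEBUG lines record the exact podman run invocation that the ansible-tripleo_container_manage module issued. Two follow-ups that use only names and paths already present in that command (a sketch, run as root on the node):

    # exit status 0 means the configured /openstack/healthcheck command passed
    podman healthcheck run ceilometer_agent_ipmi && echo healthy
    # container stdout/stderr land here, per --log-driver k8s-file --log-opt path=...
    tail -n 20 /var/log/containers/stdouts/ceilometer_agent_ipmi.log
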
Feb 1 03:11:01 localhost podman[68092]: 2026-02-01 08:11:01.201647734 +0000 UTC m=+0.206617092 container start 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:11:01 localhost systemd[1]: Started libcrun container. 
Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d44c44bcfe4704b931236c6c12b33c33d5721caf0abe0082648700db793c9ca/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b.scope. 
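
Annotation: logrotate_crond is started with --network none --pid host, so it has no network namespace of its own but shares the host PID namespace; cron inside it rotates the files bind-mounted from /var/log/containers. A quick liveness check, sketched on the assumption that pgrep is present in the image:

    # confirm crond is running inside the container
    podman exec logrotate_crond pgrep -a crond
    # or invoke the same script the healthcheck uses, per the config_data above
    podman exec logrotate_crond /usr/share/openstack-tripleo-common/healthcheck/cron
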
Feb 1 03:11:01 localhost podman[68063]: 2026-02-01 08:11:01.235934915 +0000 UTC m=+0.271871215 container init d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=configure_cms_options, release=1766032510, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:11:01 localhost systemd[1]: Started libcrun container. Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7bfc414090cc6479f36ddb76b3b2d4149b98ddd1907ff32d4ea19043e0378e9/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7bfc414090cc6479f36ddb76b3b2d4149b98ddd1907ff32d4ea19043e0378e9/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7bfc414090cc6479f36ddb76b3b2d4149b98ddd1907ff32d4ea19043e0378e9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. 
Feb 1 03:11:01 localhost podman[68097]: 2026-02-01 08:11:01.26090346 +0000 UTC m=+0.259423196 container init 7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_libvirt_init_secret, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5) Feb 1 03:11:01 localhost podman[68112]: 2026-02-01 08:11:01.26151785 +0000 UTC m=+0.245414553 container init 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510) Feb 1 03:11:01 localhost podman[68097]: 2026-02-01 08:11:01.26968936 +0000 UTC m=+0.268209096 container start 7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_libvirt_init_secret, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public) Feb 1 03:11:01 localhost podman[68097]: 2026-02-01 08:11:01.269861515 +0000 UTC m=+0.268381271 container attach 7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_libvirt_init_secret, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, tcib_managed=true, konflux.additional-tags=17.1.13 
17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z) Feb 1 03:11:01 localhost podman[68168]: 2026-02-01 08:11:01.270785855 +0000 UTC m=+0.081287124 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, release=1766032510) Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. 
Feb 1 03:11:01 localhost podman[68112]: 2026-02-01 08:11:01.282804982 +0000 UTC m=+0.266701665 container start 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 1 03:11:01 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=56f18c3ee04e8cd5761527c0820290d2 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 1 03:11:01 localhost podman[68063]: 2026-02-01 08:11:01.297199348 +0000 UTC m=+0.333135648 container start d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=configure_cms_options, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:11:01 localhost podman[68063]: 2026-02-01 08:11:01.298316494 +0000 UTC m=+0.334252804 container attach d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=configure_cms_options, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=) Feb 1 03:11:01 localhost podman[68177]: 2026-02-01 08:11:01.325044157 +0000 UTC m=+0.119302300 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, architecture=x86_64) Feb 1 03:11:01 localhost podman[68177]: 2026-02-01 08:11:01.331118287 +0000 UTC m=+0.125376440 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, distribution-scope=public, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git) Feb 1 03:11:01 localhost ovs-vsctl[68270]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . 
external_ids ovn-cms-options Feb 1 03:11:01 localhost podman[68168]: 2026-02-01 08:11:01.373861638 +0000 UTC m=+0.184362927 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, build-date=2026-01-12T23:07:30Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:11:01 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:11:01 localhost systemd[1]: libpod-d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a.scope: Deactivated successfully. 
Feb 1 03:11:01 localhost podman[68063]: 2026-02-01 08:11:01.38725323 +0000 UTC m=+0.423189540 container died d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=configure_cms_options, architecture=x86_64, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z) Feb 1 03:11:01 localhost systemd[1]: libpod-7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b.scope: Deactivated successfully. 
Feb 1 03:11:01 localhost podman[68097]: 2026-02-01 08:11:01.418183551 +0000 UTC m=+0.416703297 container died 7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, container_name=nova_libvirt_init_secret, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:11:01 localhost podman[68168]: unhealthy Feb 1 03:11:01 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:11:01 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Failed with result 'exit-code'. 
Feb 1 03:11:01 localhost podman[68303]: 2026-02-01 08:11:01.486776375 +0000 UTC m=+0.062530695 container cleanup 7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_libvirt_init_secret, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt) Feb 1 03:11:01 localhost systemd[1]: libpod-conmon-7d8f3718f079c175a62f575f93481ec62112a8187a039545bd93635b8d03161b.scope: Deactivated successfully. 
Feb 1 03:11:01 localhost podman[68237]: 2026-02-01 08:11:01.379275097 +0000 UTC m=+0.092682350 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:11:01 localhost podman[68282]: 2026-02-01 08:11:01.541678388 +0000 UTC m=+0.149635791 container cleanup d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 
'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z) Feb 1 03:11:01 localhost systemd[1]: libpod-conmon-d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a.scope: Deactivated successfully. 
Feb 1 03:11:01 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Feb 1 03:11:01 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi Feb 1 03:11:01 localhost podman[68237]: 2026-02-01 08:11:01.562421332 +0000 UTC m=+0.275828665 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:11:01 localhost podman[68237]: unhealthy Feb 1 03:11:01 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:11:01 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Failed with result 'exit-code'. 
Feb 1 03:11:01 localhost podman[68450]: 2026-02-01 08:11:01.783921404 +0000 UTC m=+0.049267876 container create d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.scope. Feb 1 03:11:01 localhost systemd[1]: Started libcrun container. Feb 1 03:11:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3725e54853595614926889eb99d8b5ab03502eb966cd4ee026013d34265250f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:11:01 localhost podman[68450]: 2026-02-01 08:11:01.759698115 +0000 UTC m=+0.025044597 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 03:11:01 localhost podman[68450]: 2026-02-01 08:11:01.860541424 +0000 UTC m=+0.125887896 container init d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=nova_migration_target, tcib_managed=true, release=1766032510, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 1 03:11:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:11:01 localhost podman[68474]: 2026-02-01 08:11:01.890512903 +0000 UTC m=+0.085910417 container create 8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, release=1766032510, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:11:01 localhost podman[68450]: 2026-02-01 08:11:01.899328305 +0000 UTC m=+0.164674777 container start d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:11:01 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user 
root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 03:11:01 localhost systemd[1]: Started libpod-conmon-8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569.scope. Feb 1 03:11:01 localhost podman[68474]: 2026-02-01 08:11:01.835812828 +0000 UTC m=+0.031210322 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 1 03:11:01 localhost systemd[1]: Started libcrun container. Feb 1 03:11:01 localhost podman[68474]: 2026-02-01 08:11:01.960039978 +0000 UTC m=+0.155437442 container init 8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=setup_ovs_manager, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:11:01 localhost podman[68474]: 2026-02-01 08:11:01.967280778 +0000 UTC m=+0.162678252 container start 8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-type=git, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 1 03:11:01 localhost podman[68474]: 2026-02-01 08:11:01.967614489 +0000 UTC m=+0.163011963 container attach 8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 1 03:11:01 localhost podman[68496]: 2026-02-01 08:11:01.967786034 +0000 UTC m=+0.070092315 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target) Feb 1 03:11:02 localhost systemd[1]: var-lib-containers-storage-overlay-dd1aa08156f9d09a864094f6714c2c9a2978317562a3f35bb8262e99f62ead42-merged.mount: Deactivated successfully. Feb 1 03:11:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d58d5c29b24f3581b71dc092398aee1b693dd20eb87bd1d7e388ca494c31886a-userdata-shm.mount: Deactivated successfully. 
Feb 1 03:11:02 localhost podman[68496]: 2026-02-01 08:11:02.256140833 +0000 UTC m=+0.358447114 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc.) Feb 1 03:11:02 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:11:02 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Feb 1 03:11:04 localhost ovs-vsctl[68678]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Feb 1 03:11:05 localhost systemd[1]: libpod-8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569.scope: Deactivated successfully. Feb 1 03:11:05 localhost systemd[1]: libpod-8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569.scope: Consumed 2.974s CPU time. 
Feb 1 03:11:05 localhost podman[68679]: 2026-02-01 08:11:05.137031767 +0000 UTC m=+0.056012771 container died 8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, container_name=setup_ovs_manager, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:11:05 localhost systemd[1]: tmp-crun.07Yzxi.mount: Deactivated successfully. Feb 1 03:11:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569-userdata-shm.mount: Deactivated successfully. Feb 1 03:11:05 localhost systemd[1]: var-lib-containers-storage-overlay-c775d2ec786da55abbb04060f9ea32c54b4c25bf6c727a9cf5516c9415a2732e-merged.mount: Deactivated successfully. 
Feb 1 03:11:05 localhost podman[68679]: 2026-02-01 08:11:05.193313036 +0000 UTC m=+0.112294030 container cleanup 8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, container_name=setup_ovs_manager) Feb 1 03:11:05 localhost systemd[1]: libpod-conmon-8b53f0cf543510c55093f5c46dd7a3236ec31b6351062cc417885eb2135ff569.scope: Deactivated successfully. 
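The entries above complete the one-shot lifecycle of setup_ovs_manager: podman reports died and cleanup to the journal, systemd unmounts the container's overlay and shm mounts, and the libpod-conmon scope is released once conmon exits. To replay just these lifecycle events for a single container, podman's event log can be filtered; a sketch (the time window is illustrative):

    # Print create/init/start/died/cleanup events for one container, then exit
    podman events --filter container=setup_ovs_manager --since 10m --stream=false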
Feb 1 03:11:05 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Feb 1 03:11:05 localhost podman[68790]: 2026-02-01 08:11:05.614532736 +0000 UTC m=+0.037033052 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 1 03:11:05 localhost podman[68790]: 2026-02-01 08:11:05.861910527 +0000 UTC m=+0.284410853 container create f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 1 03:11:05 localhost systemd[1]: Started libpod-conmon-f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.scope. Feb 1 03:11:05 localhost systemd[1]: Started libcrun container. 
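The PODMAN-CONTAINER-DEBUG entry above is tripleo_container_manage logging the literal podman run it executed. Note the k8s-file log driver: the container's stdout/stderr is captured in a host file rather than in the journal, at the path given in the command itself:

    # Container stdout/stderr as written by the k8s-file log driver (path from the podman run above)
    tail -n 50 /var/log/containers/stdouts/setup_ovs_manager.log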
Feb 1 03:11:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/055744583dc2f819841e567440534be33df8561cb3313cfcbdf707f0c0d0f4f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/055744583dc2f819841e567440534be33df8561cb3313cfcbdf707f0c0d0f4f2/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/055744583dc2f819841e567440534be33df8561cb3313cfcbdf707f0c0d0f4f2/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:05 localhost podman[68808]: 2026-02-01 08:11:05.92325958 +0000 UTC m=+0.066329706 container create f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:11:05 localhost systemd[1]: Started libpod-conmon-f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.scope. Feb 1 03:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. 
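The xfs warnings above are informational: the filesystem backing these overlay mounts was created without the bigtime feature, so its inode timestamps cap at 2038-01-19 (0x7fffffff). Whether bigtime is enabled can be checked per mount; a sketch, assuming an xfsprogs recent enough to report the flag:

    # bigtime=1 means 64-bit inode timestamps (no 2038 limit)
    xfs_info /var/lib/containers | grep -o 'bigtime=[01]'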
Feb 1 03:11:05 localhost podman[68790]: 2026-02-01 08:11:05.955828395 +0000 UTC m=+0.378328701 container init f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 1 03:11:05 localhost systemd[1]: Started libcrun container. 
Feb 1 03:11:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a18a8a420711bccaa970c77feff6f4a517526e7e59a2e2c9c151c919c25156/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a18a8a420711bccaa970c77feff6f4a517526e7e59a2e2c9c151c919c25156/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76a18a8a420711bccaa970c77feff6f4a517526e7e59a2e2c9c151c919c25156/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:11:05 localhost podman[68790]: 2026-02-01 08:11:05.980672773 +0000 UTC m=+0.403173079 container start f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, 
vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:11:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:11:05 localhost podman[68808]: 2026-02-01 08:11:05.985694136 +0000 UTC m=+0.128764242 container init f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:11:05 localhost podman[68808]: 2026-02-01 08:11:05.886960942 +0000 UTC m=+0.030031058 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 1 03:11:05 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=4dacb3799b36b0da29dc6587bf4940e2 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 1 03:11:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:11:06 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 1 03:11:06 localhost systemd[1]: Created slice User Slice of UID 0. Feb 1 03:11:06 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 1 03:11:06 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 1 03:11:06 localhost systemd[1]: Starting User Manager for UID 0... 
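This podman run starts ovn_metadata_agent detached, on the host network and PID namespaces, with --healthcheck-command /openstack/healthcheck, so podman tracks a health state for the container. The defined check can also be triggered by hand; a sketch using the container name from the command above:

    # Run the container's healthcheck once; exit 0 = healthy, nonzero = unhealthy
    podman healthcheck run ovn_metadata_agent; echo "healthcheck exit: $?"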
Feb 1 03:11:06 localhost podman[68808]: 2026-02-01 08:11:06.087176224 +0000 UTC m=+0.230246310 container start f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller) Feb 1 03:11:06 localhost python3[67906]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume 
/var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 1 03:11:06 localhost podman[68859]: 2026-02-01 08:11:06.110897469 +0000 UTC m=+0.071044370 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, batch=17.1_20260112.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller) Feb 1 03:11:06 localhost podman[68834]: 2026-02-01 08:11:06.077924012 +0000 UTC m=+0.091851545 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:11:06 localhost podman[68834]: 2026-02-01 08:11:06.160307938 +0000 UTC m=+0.174235411 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, 
distribution-scope=public, version=17.1.13, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:11:06 localhost podman[68834]: unhealthy Feb 1 03:11:06 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:11:06 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
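The "unhealthy" line is the output of the transient healthcheck unit (named after the container ID), and systemd duly records status=1/FAILURE for it. A probe failing seconds after container start usually means the agent has not finished initializing; whether it settles can be checked like this (container ID taken from the journal above):

    # Status column shows health in parentheses, e.g. "Up 2 minutes (unhealthy)"
    podman ps --all --filter name=ovn_metadata_agent --format '{{.Names}} {{.Status}}'
    # Recent probe output from the transient unit
    journalctl -u f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service -n 20 --no-pager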
Feb 1 03:11:06 localhost podman[68859]: 2026-02-01 08:11:06.214015487 +0000 UTC m=+0.174162348 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 1 03:11:06 localhost systemd[68876]: Queued start job for default target Main User Target. Feb 1 03:11:06 localhost systemd[68876]: Created slice User Application Slice. Feb 1 03:11:06 localhost systemd[68876]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 1 03:11:06 localhost systemd[68876]: Started Daily Cleanup of User's Temporary Directories. Feb 1 03:11:06 localhost systemd[68876]: Reached target Paths. Feb 1 03:11:06 localhost systemd[68876]: Reached target Timers. Feb 1 03:11:06 localhost systemd[68876]: Starting D-Bus User Message Bus Socket... Feb 1 03:11:06 localhost systemd[68876]: Starting Create User's Volatile Files and Directories... Feb 1 03:11:06 localhost podman[68859]: unhealthy Feb 1 03:11:06 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:11:06 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:11:06 localhost systemd[68876]: Finished Create User's Volatile Files and Directories. Feb 1 03:11:06 localhost systemd[68876]: Listening on D-Bus User Message Bus Socket. 
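ovn_controller's first probe fails the same way; per its config_data the check is '/openstack/healthcheck 6642', i.e. a probe of the connection to the OVN southbound database. A sketch to re-run the probe and inspect the socket directly:

    # Re-run the in-container check (script and port are from the config_data above)
    podman exec ovn_controller /openstack/healthcheck 6642
    # Look for TCP traffic toward the SB DB port from the host
    ss -tnp | grep 6642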
Feb 1 03:11:06 localhost systemd[68876]: Reached target Sockets. Feb 1 03:11:06 localhost systemd[68876]: Reached target Basic System. Feb 1 03:11:06 localhost systemd[68876]: Reached target Main User Target. Feb 1 03:11:06 localhost systemd[68876]: Startup finished in 148ms. Feb 1 03:11:06 localhost systemd[1]: Started User Manager for UID 0. Feb 1 03:11:06 localhost systemd[1]: Started Session c9 of User root. Feb 1 03:11:06 localhost systemd[1]: session-c9.scope: Deactivated successfully. Feb 1 03:11:06 localhost kernel: device br-int entered promiscuous mode Feb 1 03:11:06 localhost NetworkManager[5964]: [1769933466.3785] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Feb 1 03:11:06 localhost systemd-udevd[68947]: Network interface NamePolicy= disabled on kernel command line. Feb 1 03:11:06 localhost NetworkManager[5964]: [1769933466.4259] device (genev_sys_6081): carrier: link connected Feb 1 03:11:06 localhost NetworkManager[5964]: [1769933466.4264] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Feb 1 03:11:06 localhost kernel: device genev_sys_6081 entered promiscuous mode Feb 1 03:11:06 localhost python3[68971]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:07 localhost python3[68987]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:07 localhost python3[69003]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:07 localhost python3[69019]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:07 localhost python3[69035]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:11:08 localhost podman[69055]: 2026-02-01 08:11:08.119261892 +0000 UTC m=+0.078984742 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Feb 1 03:11:08 localhost python3[69054]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:08 localhost podman[69055]: 2026-02-01 08:11:08.347473299 +0000 UTC m=+0.307196219 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true) Feb 1 03:11:08 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
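For contrast, metrics_qdr shows the healthy path: health_status=healthy, then exec_died for the probe process, then the transient unit deactivates successfully instead of failing. Failed healthcheck units stand out because they are named after the 64-hex container ID; a rough sketch:

    # Transient healthcheck units currently in a failed state (unit name = container ID)
    systemctl list-units --state=failed --no-legend | grep -E '[0-9a-f]{64}\.service'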
Feb 1 03:11:08 localhost python3[69098]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:11:08 localhost python3[69117]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:11:08 localhost python3[69134]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:11:09 localhost python3[69152]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:11:09 localhost python3[69168]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:11:09 localhost python3[69184]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:11:10 localhost python3[69245]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933469.7822778-109166-77751296848137/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:10 localhost python3[69274]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933469.7822778-109166-77751296848137/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:11 localhost python3[69303]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933469.7822778-109166-77751296848137/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:11 localhost python3[69332]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933469.7822778-109166-77751296848137/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:12 localhost python3[69361]: 
ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933469.7822778-109166-77751296848137/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:12 localhost python3[69390]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933469.7822778-109166-77751296848137/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:11:13 localhost python3[69406]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 03:11:13 localhost systemd[1]: Reloading. Feb 1 03:11:13 localhost systemd-rc-local-generator[69434]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:11:13 localhost systemd-sysv-generator[69438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:11:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:11:14 localhost python3[69459]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:11:14 localhost systemd[1]: Reloading. Feb 1 03:11:14 localhost systemd-sysv-generator[69493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:11:14 localhost systemd-rc-local-generator[69489]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:11:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:11:14 localhost systemd[1]: Starting ceilometer_agent_compute container... Feb 1 03:11:14 localhost tripleo-start-podman-container[69500]: Creating additional drop-in dependency for "ceilometer_agent_compute" (857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233) Feb 1 03:11:14 localhost systemd[1]: Reloading. Feb 1 03:11:14 localhost systemd-rc-local-generator[69561]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:11:14 localhost systemd-sysv-generator[69565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
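The ansible-copy tasks above install one tripleo_<name>.service wrapper unit per container, followed by a daemon-reload; the rc.local, SysV 'network' generator, and MemoryLimit= messages are pre-existing warnings that every reload re-prints, not errors from this deployment step. The generated wrapper can be inspected directly; a sketch using one of the unit names copied above:

    # Show the wrapper unit (plus any drop-ins) that manages the container
    systemctl cat tripleo_ovn_controller.service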
Feb 1 03:11:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:15 localhost systemd[1]: Started ceilometer_agent_compute container.
Feb 1 03:11:15 localhost python3[69585]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:11:15 localhost systemd[1]: Reloading.
Feb 1 03:11:16 localhost systemd-rc-local-generator[69615]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:11:16 localhost systemd-sysv-generator[69618]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:11:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:16 localhost systemd[1]: Starting ceilometer_agent_ipmi container...
Feb 1 03:11:16 localhost systemd[1]: Stopping User Manager for UID 0...
Feb 1 03:11:16 localhost systemd[68876]: Activating special unit Exit the Session...
Feb 1 03:11:16 localhost systemd[68876]: Stopped target Main User Target.
Feb 1 03:11:16 localhost systemd[68876]: Stopped target Basic System.
Feb 1 03:11:16 localhost systemd[68876]: Stopped target Paths.
Feb 1 03:11:16 localhost systemd[68876]: Stopped target Sockets.
Feb 1 03:11:16 localhost systemd[68876]: Stopped target Timers.
Feb 1 03:11:16 localhost systemd[68876]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 1 03:11:16 localhost systemd[68876]: Closed D-Bus User Message Bus Socket.
Feb 1 03:11:16 localhost systemd[68876]: Stopped Create User's Volatile Files and Directories.
Feb 1 03:11:16 localhost systemd[68876]: Removed slice User Application Slice.
Feb 1 03:11:16 localhost systemd[68876]: Reached target Shutdown.
Feb 1 03:11:16 localhost systemd[68876]: Finished Exit the Session.
Feb 1 03:11:16 localhost systemd[68876]: Reached target Exit the Session.
Feb 1 03:11:16 localhost systemd[1]: user@0.service: Deactivated successfully.
Feb 1 03:11:16 localhost systemd[1]: Stopped User Manager for UID 0.
Feb 1 03:11:16 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 1 03:11:16 localhost systemd[1]: Started ceilometer_agent_ipmi container.
Feb 1 03:11:16 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 1 03:11:16 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 1 03:11:16 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 1 03:11:16 localhost systemd[1]: Removed slice User Slice of UID 0.
Feb 1 03:11:16 localhost python3[69652]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:11:17 localhost systemd[1]: Reloading.
Feb 1 03:11:17 localhost systemd-sysv-generator[69680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:11:17 localhost systemd-rc-local-generator[69675]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:11:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:17 localhost systemd[1]: Starting logrotate_crond container...
Feb 1 03:11:17 localhost systemd[1]: Started logrotate_crond container.
Feb 1 03:11:18 localhost python3[69721]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:11:18 localhost systemd[1]: Reloading.
Feb 1 03:11:18 localhost systemd-rc-local-generator[69750]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:11:18 localhost systemd-sysv-generator[69753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:11:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:18 localhost systemd[1]: Starting nova_migration_target container...
Feb 1 03:11:18 localhost systemd[1]: Started nova_migration_target container.
Feb 1 03:11:19 localhost python3[69789]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:11:19 localhost systemd[1]: Reloading.
Feb 1 03:11:19 localhost systemd-rc-local-generator[69815]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:11:19 localhost systemd-sysv-generator[69819]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:11:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:19 localhost systemd[1]: Starting ovn_controller container...
Feb 1 03:11:19 localhost tripleo-start-podman-container[69829]: Creating additional drop-in dependency for "ovn_controller" (f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446)
Feb 1 03:11:19 localhost systemd[1]: Reloading.
Feb 1 03:11:19 localhost systemd-rc-local-generator[69883]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:11:19 localhost systemd-sysv-generator[69886]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:11:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:20 localhost systemd[1]: Started ovn_controller container.
Feb 1 03:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:11:20 localhost systemd[1]: tmp-crun.LRQdiR.mount: Deactivated successfully.
Feb 1 03:11:20 localhost podman[69899]: 2026-02-01 08:11:20.237434897 +0000 UTC m=+0.094365192 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Feb 1 03:11:20 localhost podman[69899]: 2026-02-01 08:11:20.249601128 +0000 UTC m=+0.106531413 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=collectd, distribution-scope=public)
Feb 1 03:11:20 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:11:20 localhost podman[69900]: 2026-02-01 08:11:20.316838741 +0000 UTC m=+0.174417166 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:11:20 localhost podman[69900]: 2026-02-01 08:11:20.35544676 +0000 UTC m=+0.213025215 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Feb 1 03:11:20 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:11:20 localhost python3[69953]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:11:20 localhost systemd[1]: Reloading.
Feb 1 03:11:21 localhost systemd-rc-local-generator[69982]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:11:21 localhost systemd-sysv-generator[69988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:11:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:11:21 localhost systemd[1]: Starting ovn_metadata_agent container...
Feb 1 03:11:21 localhost systemd[1]: Started ovn_metadata_agent container.
Feb 1 03:11:21 localhost python3[70037]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:11:23 localhost python3[70158]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005604212 step=4 update_config_hash_only=False
Feb 1 03:11:23 localhost python3[70174]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:11:24 localhost python3[70190]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 1 03:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:11:31 localhost podman[70194]: 2026-02-01 08:11:31.734463967 +0000 UTC m=+0.083638675 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, release=1766032510)
Feb 1 03:11:31 localhost systemd[1]: tmp-crun.KIJEjw.mount: Deactivated successfully.
Feb 1 03:11:31 localhost podman[70193]: 2026-02-01 08:11:31.788447885 +0000 UTC m=+0.139121259 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 1 03:11:31 localhost podman[70192]: 2026-02-01 08:11:31.840257836 +0000 UTC m=+0.193616982 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond)
Feb 1 03:11:31 localhost podman[70192]: 2026-02-01 08:11:31.847371103 +0000 UTC m=+0.200730259 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 1 03:11:31 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:11:31 localhost podman[70194]: 2026-02-01 08:11:31.892304115 +0000 UTC m=+0.241478843 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 1 03:11:31 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:11:31 localhost podman[70193]: 2026-02-01 08:11:31.952574836 +0000 UTC m=+0.303248230 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 1 03:11:31 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:11:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:11:32 localhost podman[70264]: 2026-02-01 08:11:32.728589736 +0000 UTC m=+0.085753949 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 1 03:11:33 localhost podman[70264]: 2026-02-01 08:11:33.111344871 +0000 UTC m=+0.468509004 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1)
Feb 1 03:11:33 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:11:36 localhost snmpd[66800]: empty variable list in _query
Feb 1 03:11:36 localhost snmpd[66800]: empty variable list in _query
Feb 1 03:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:11:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:11:36 localhost podman[70389]: 2026-02-01 08:11:36.729353766 +0000 UTC m=+0.086158872 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:11:36 localhost systemd[1]: tmp-crun.bWM6pO.mount: Deactivated successfully.
Feb 1 03:11:36 localhost podman[70392]: 2026-02-01 08:11:36.786415117 +0000 UTC m=+0.138177779 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 1 03:11:36 localhost podman[70389]: 2026-02-01 08:11:36.816777594 +0000 UTC m=+0.173582730 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:11:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:11:36 localhost podman[70392]: 2026-02-01 08:11:36.838488177 +0000 UTC m=+0.190250819 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:11:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:11:37 localhost podman[70518]:
Feb 1 03:11:37 localhost podman[70518]: 2026-02-01 08:11:37.726628111 +0000 UTC m=+0.083243782 container create bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_antonelli, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 03:11:37 localhost systemd[1]: Started libpod-conmon-bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e.scope.
Feb 1 03:11:37 localhost podman[70518]: 2026-02-01 08:11:37.693622204 +0000 UTC m=+0.050237925 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 03:11:37 localhost systemd[1]: Started libcrun container.
Feb 1 03:11:37 localhost podman[70518]: 2026-02-01 08:11:37.836442524 +0000 UTC m=+0.193058185 container init bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_antonelli, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, GIT_CLEAN=True, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 03:11:37 localhost podman[70518]: 2026-02-01 08:11:37.848610215 +0000 UTC m=+0.205225846 container start bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_antonelli, release=1764794109, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , version=7)
Feb 1 03:11:37 localhost podman[70518]: 2026-02-01 08:11:37.848856422 +0000 UTC m=+0.205472133 container attach bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_antonelli, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1764794109, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 03:11:37 localhost peaceful_antonelli[70533]: 167 167
Feb 1 03:11:37 localhost systemd[1]: libpod-bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e.scope: Deactivated successfully.
Feb 1 03:11:37 localhost podman[70538]: 2026-02-01 08:11:37.902897993 +0000 UTC m=+0.038263540 container died bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_antonelli, vendor=Red Hat, Inc., ceph=True, name=rhceph, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Feb 1 03:11:37 localhost podman[70538]: 2026-02-01 08:11:37.932726193 +0000 UTC m=+0.068091720 container remove bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_antonelli, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, version=7, build-date=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.expose-services=, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4)
Feb 1 03:11:37 localhost systemd[1]: libpod-conmon-bca03cf0669487029905c59762f53ac00abb96f39720318861a318c021bcea9e.scope: Deactivated successfully.
Feb 1 03:11:38 localhost podman[70559]:
Feb 1 03:11:38 localhost podman[70559]: 2026-02-01 08:11:38.174073651 +0000 UTC m=+0.089281886 container create 0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_einstein, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, version=7, release=1764794109, name=rhceph, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main)
Feb 1 03:11:38 localhost systemd[1]: Started libpod-conmon-0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c.scope.
Feb 1 03:11:38 localhost systemd[1]: Started libcrun container.
Feb 1 03:11:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c80d3ba3f5312f7cf2e915da86e074e456fed22272a2b51df129587b38ec16/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c80d3ba3f5312f7cf2e915da86e074e456fed22272a2b51df129587b38ec16/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4c80d3ba3f5312f7cf2e915da86e074e456fed22272a2b51df129587b38ec16/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 03:11:38 localhost podman[70559]: 2026-02-01 08:11:38.142089825 +0000 UTC m=+0.057298060 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 03:11:38 localhost podman[70559]: 2026-02-01 08:11:38.244023366 +0000 UTC m=+0.159231571 container init 0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_einstein, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, architecture=x86_64, GIT_CLEAN=True, release=1764794109, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:11:38 localhost podman[70559]: 2026-02-01 08:11:38.255715374 +0000 UTC m=+0.170923579 container start 0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_einstein, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, 
url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7) Feb 1 03:11:38 localhost podman[70559]: 2026-02-01 08:11:38.256094166 +0000 UTC m=+0.171302421 container attach 0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_einstein, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, build-date=2025-12-08T17:28:53Z, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=) Feb 1 03:11:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:11:38 localhost systemd[1]: var-lib-containers-storage-overlay-baaf6f5625496abcf666811dd17053707cc6d9135a943092eaf057b4b3d034a3-merged.mount: Deactivated successfully. 
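The kernel's "xfs filesystem being remounted ... supports timestamps until 2038" messages above typically mean the filesystem backing the container overlay was formatted without the XFS bigtime feature, so inode timestamps are capped at 0x7fffffff. A minimal sketch of how one might verify this, assuming xfsprogs is new enough for xfs_info to print a bigtime=0/1 flag; the mount point is illustrative:

#!/usr/bin/env python3
"""Check whether an XFS filesystem was formatted with the bigtime feature."""
import re
import subprocess

def xfs_has_bigtime(mountpoint: str) -> bool:
    # xfs_info prints the superblock geometry; recent xfsprogs includes
    # "bigtime=0" or "bigtime=1" in the meta-data section.
    out = subprocess.run(
        ["xfs_info", mountpoint], capture_output=True, text=True, check=True
    ).stdout
    match = re.search(r"bigtime=(\d)", out)
    # Older xfsprogs omits the flag entirely; treat that as False/unknown.
    return bool(match) and match.group(1) == "1"

if __name__ == "__main__":
    # Hypothetical mount point, chosen to match the overlay paths in the log.
    print(xfs_has_bigtime("/var/lib/containers"))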
Feb 1 03:11:38 localhost podman[70590]: 2026-02-01 08:11:38.74168471 +0000 UTC m=+0.092849626 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:11:38 localhost podman[70590]: 2026-02-01 08:11:38.960614763 +0000 UTC m=+0.311779669 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64) Feb 1 03:11:38 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
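The "Started /usr/bin/podman healthcheck run <id>" entries above are transient systemd units that periodically execute each container's configured healthcheck (here the TripleO /openstack/healthcheck script), after which the <id>.service unit deactivates. The same check can be run on demand; a minimal sketch, using the container name from the log and relying on podman healthcheck run signalling the result through its exit status:

#!/usr/bin/env python3
"""Run a container's healthcheck on demand, as the transient units above do."""
import subprocess

def is_healthy(container: str) -> bool:
    # Executes the healthcheck command defined for the container and
    # reports the outcome via the exit status (0 = healthy).
    result = subprocess.run(
        ["podman", "healthcheck", "run", container],
        capture_output=True, text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    print(is_healthy("metrics_qdr"))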
Feb 1 03:11:39 localhost funny_einstein[70576]: [
Feb 1 03:11:39 localhost funny_einstein[70576]: {
Feb 1 03:11:39 localhost funny_einstein[70576]: "available": false,
Feb 1 03:11:39 localhost funny_einstein[70576]: "ceph_device": false,
Feb 1 03:11:39 localhost funny_einstein[70576]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 1 03:11:39 localhost funny_einstein[70576]: "lsm_data": {},
Feb 1 03:11:39 localhost funny_einstein[70576]: "lvs": [],
Feb 1 03:11:39 localhost funny_einstein[70576]: "path": "/dev/sr0",
Feb 1 03:11:39 localhost funny_einstein[70576]: "rejected_reasons": [
Feb 1 03:11:39 localhost funny_einstein[70576]: "Has a FileSystem",
Feb 1 03:11:39 localhost funny_einstein[70576]: "Insufficient space (<5GB)"
Feb 1 03:11:39 localhost funny_einstein[70576]: ],
Feb 1 03:11:39 localhost funny_einstein[70576]: "sys_api": {
Feb 1 03:11:39 localhost funny_einstein[70576]: "actuators": null,
Feb 1 03:11:39 localhost funny_einstein[70576]: "device_nodes": "sr0",
Feb 1 03:11:39 localhost funny_einstein[70576]: "human_readable_size": "482.00 KB",
Feb 1 03:11:39 localhost funny_einstein[70576]: "id_bus": "ata",
Feb 1 03:11:39 localhost funny_einstein[70576]: "model": "QEMU DVD-ROM",
Feb 1 03:11:39 localhost funny_einstein[70576]: "nr_requests": "2",
Feb 1 03:11:39 localhost funny_einstein[70576]: "partitions": {},
Feb 1 03:11:39 localhost funny_einstein[70576]: "path": "/dev/sr0",
Feb 1 03:11:39 localhost funny_einstein[70576]: "removable": "1",
Feb 1 03:11:39 localhost funny_einstein[70576]: "rev": "2.5+",
Feb 1 03:11:39 localhost funny_einstein[70576]: "ro": "0",
Feb 1 03:11:39 localhost funny_einstein[70576]: "rotational": "1",
Feb 1 03:11:39 localhost funny_einstein[70576]: "sas_address": "",
Feb 1 03:11:39 localhost funny_einstein[70576]: "sas_device_handle": "",
Feb 1 03:11:39 localhost funny_einstein[70576]: "scheduler_mode": "mq-deadline",
Feb 1 03:11:39 localhost funny_einstein[70576]: "sectors": 0,
Feb 1 03:11:39 localhost funny_einstein[70576]: "sectorsize": "2048",
Feb 1 03:11:39 localhost funny_einstein[70576]: "size": 493568.0,
Feb 1 03:11:39 localhost funny_einstein[70576]: "support_discard": "0",
Feb 1 03:11:39 localhost funny_einstein[70576]: "type": "disk",
Feb 1 03:11:39 localhost funny_einstein[70576]: "vendor": "QEMU"
Feb 1 03:11:39 localhost funny_einstein[70576]: }
Feb 1 03:11:39 localhost funny_einstein[70576]: }
Feb 1 03:11:39 localhost funny_einstein[70576]: ]
Feb 1 03:11:39 localhost systemd[1]: libpod-0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c.scope: Deactivated successfully.
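The JSON emitted by the funny_einstein container above is a ceph-volume device inventory report (presumably run with a JSON format option by cephadm): /dev/sr0 is rejected as an OSD candidate because it has a filesystem and is under the 5 GB minimum. A minimal sketch of filtering such a report, assuming the output was captured to a file; the schema (available, path, rejected_reasons, sys_api) is exactly what the log shows:

#!/usr/bin/env python3
"""Summarize a ceph-volume inventory report like the one logged above."""
import json

def usable_devices(report):
    # Devices ceph-volume considers eligible for OSD deployment.
    return [d["path"] for d in report if d.get("available")]

def rejection_summary(report):
    # Map each rejected device path to the reasons ceph-volume gave.
    return {d["path"]: d.get("rejected_reasons", [])
            for d in report if not d.get("available")}

if __name__ == "__main__":
    with open("inventory.json") as fh:  # hypothetical capture of the output above
        devices = json.load(fh)
    print("usable:", usable_devices(devices))
    print("rejected:", rejection_summary(devices))
    # For the report above this prints:
    # rejected: {'/dev/sr0': ['Has a FileSystem', 'Insufficient space (<5GB)']}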
Feb 1 03:11:39 localhost podman[70559]: 2026-02-01 08:11:39.236196177 +0000 UTC m=+1.151404372 container died 0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_einstein, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, ceph=True, name=rhceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 03:11:39 localhost systemd[1]: var-lib-containers-storage-overlay-a4c80d3ba3f5312f7cf2e915da86e074e456fed22272a2b51df129587b38ec16-merged.mount: Deactivated successfully. Feb 1 03:11:39 localhost podman[72622]: 2026-02-01 08:11:39.327306378 +0000 UTC m=+0.080572951 container remove 0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_einstein, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, release=1764794109, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public) Feb 1 03:11:39 localhost systemd[1]: libpod-conmon-0a92d1987e5251d3f62350aee5165c3d2f08ce299c0d5952815f447fb6b3f77c.scope: Deactivated successfully. Feb 1 03:11:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:11:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:11:50 localhost podman[72651]: 2026-02-01 08:11:50.718627123 +0000 UTC m=+0.076240308 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.buildah.version=1.41.5, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 1 03:11:50 localhost podman[72651]: 2026-02-01 08:11:50.729591578 +0000 UTC m=+0.087204753 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z) Feb 1 03:11:50 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
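The config_data label dumped with each healthcheck event above carries the full tripleo_ansible container definition (image, volumes, healthcheck test, start order). Note the value is a Python-style literal with single quotes, not JSON. A minimal sketch of recovering it from a running container such as iscsid, assuming podman's Go-template index function for reading a label by key:

#!/usr/bin/env python3
"""Recover the TripleO config_data label from a running container."""
import ast
import subprocess

def tripleo_config(container: str) -> dict:
    raw = subprocess.run(
        ["podman", "inspect", container, "--format",
         '{{index .Config.Labels "config_data"}}'],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    # The label is a Python literal (True/False, single quotes), so
    # ast.literal_eval is used instead of json.loads.
    return ast.literal_eval(raw)

if __name__ == "__main__":
    cfg = tripleo_config("iscsid")
    print(cfg["healthcheck"]["test"])  # -> /openstack/healthcheck
    print(cfg["volumes"][:3])          # first few bind mounts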
Feb 1 03:11:50 localhost podman[72650]: 2026-02-01 08:11:50.821933747 +0000 UTC m=+0.179169861 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:11:50 localhost podman[72650]: 2026-02-01 08:11:50.833657015 +0000 UTC m=+0.190893099 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, 
com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, container_name=collectd, batch=17.1_20260112.1, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vcs-type=git) Feb 1 03:11:50 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:11:58 localhost sshd[72688]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:12:02 localhost podman[72692]: 2026-02-01 08:12:02.70611239 +0000 UTC m=+0.062482119 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Feb 1 03:12:02 localhost podman[72692]: 2026-02-01 08:12:02.735104885 +0000 UTC m=+0.091474624 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:12:02 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:12:02 localhost podman[72691]: 2026-02-01 08:12:02.828068183 +0000 UTC m=+0.185875055 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-cron-container, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc.) 
Feb 1 03:12:02 localhost podman[72691]: 2026-02-01 08:12:02.86565174 +0000 UTC m=+0.223458602 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:12:02 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
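After a sweep like the one logged here (metrics_qdr, iscsid, collectd, ceilometer_agent_ipmi, logrotate_crond all reporting health_status=healthy), the aggregate state can be read back without grepping the journal. A minimal sketch, assuming podman ps supports the health filter:

#!/usr/bin/env python3
"""List container names by health state after a healthcheck sweep."""
import subprocess

def by_health(state: str):
    # state is "healthy" or "unhealthy"; returns matching container names.
    out = subprocess.run(
        ["podman", "ps", "--filter", f"health={state}",
         "--format", "{{.Names}}"],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.split()

if __name__ == "__main__":
    print("healthy:", by_health("healthy"))
    print("unhealthy:", by_health("unhealthy"))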
Feb 1 03:12:02 localhost podman[72690]: 2026-02-01 08:12:02.884847536 +0000 UTC m=+0.245013800 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, container_name=ceilometer_agent_compute) Feb 1 03:12:02 localhost podman[72690]: 2026-02-01 08:12:02.941967731 +0000 UTC m=+0.302133985 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, 
build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:12:02 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:12:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:12:03 localhost podman[72759]: 2026-02-01 08:12:03.718565039 +0000 UTC m=+0.078035883 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Feb 1 03:12:04 localhost podman[72759]: 2026-02-01 08:12:04.10053656 +0000 UTC m=+0.460007474 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=nova_migration_target, batch=17.1_20260112.1) Feb 1 03:12:04 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:12:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:12:07 localhost podman[72782]: 2026-02-01 08:12:07.727941551 +0000 UTC m=+0.084796459 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, batch=17.1_20260112.1)
Feb 1 03:12:07 localhost podman[72783]: 2026-02-01 08:12:07.780810265 +0000 UTC m=+0.135036443 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1)
Feb 1 03:12:07 localhost podman[72782]: 2026-02-01 08:12:07.794426191 +0000 UTC m=+0.151281119 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 1 03:12:07 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:12:07 localhost podman[72783]: 2026-02-01 08:12:07.828976446 +0000 UTC m=+0.183202614 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 1 03:12:07 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:12:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:12:09 localhost systemd[1]: tmp-crun.qTTefu.mount: Deactivated successfully.
Feb 1 03:12:09 localhost podman[72830]: 2026-02-01 08:12:09.737417927 +0000 UTC m=+0.098372354 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step1, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 1 03:12:09 localhost podman[72830]: 2026-02-01 08:12:09.936435213 +0000 UTC m=+0.297389580 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team)
Feb 1 03:12:09 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:12:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:12:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:12:21 localhost podman[72860]: 2026-02-01 08:12:21.735194596 +0000 UTC m=+0.089028369 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 1 03:12:21 localhost podman[72860]: 2026-02-01 08:12:21.774930719 +0000 UTC m=+0.128764452 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible)
Feb 1 03:12:21 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:12:21 localhost podman[72859]: 2026-02-01 08:12:21.778572861 +0000 UTC m=+0.136357834 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container)
Feb 1 03:12:21 localhost podman[72859]: 2026-02-01 08:12:21.864526584 +0000 UTC m=+0.222311527 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13)
Feb 1 03:12:21 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:12:33 localhost podman[72900]: 2026-02-01 08:12:33.740365371 +0000 UTC m=+0.092147014 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 1 03:12:33 localhost podman[72899]: 2026-02-01 08:12:33.712402568 +0000 UTC m=+0.069991278 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, vcs-type=git, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:12:33 localhost podman[72900]: 2026-02-01 08:12:33.768472589 +0000 UTC m=+0.120254242 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 1 03:12:33 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:12:33 localhost podman[72899]: 2026-02-01 08:12:33.793012318 +0000 UTC m=+0.150600958 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 1 03:12:33 localhost podman[72898]: 2026-02-01 08:12:33.830697229 +0000 UTC m=+0.190987461 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, release=1766032510, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 1 03:12:33 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:12:33 localhost podman[72898]: 2026-02-01 08:12:33.881643364 +0000 UTC m=+0.241933676 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc.)
Feb 1 03:12:33 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:12:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:12:34 localhost podman[72967]: 2026-02-01 08:12:34.741411632 +0000 UTC m=+0.073668800 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 1 03:12:35 localhost podman[72967]: 2026-02-01 08:12:35.111036366 +0000 UTC m=+0.443293484 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, container_name=nova_migration_target, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 1 03:12:35 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:12:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:12:38 localhost systemd[1]: tmp-crun.uBnPJ5.mount: Deactivated successfully.
Feb 1 03:12:38 localhost podman[72991]: 2026-02-01 08:12:38.726517592 +0000 UTC m=+0.084891122 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 1 03:12:38 localhost podman[72992]: 2026-02-01 08:12:38.781906654 +0000 UTC m=+0.137430057 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, batch=17.1_20260112.1, container_name=ovn_controller, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible)
Feb 1 03:12:38 localhost podman[72991]: 2026-02-01 08:12:38.787631848 +0000 UTC m=+0.146005428 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 1 03:12:38 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:12:38 localhost podman[72992]: 2026-02-01 08:12:38.845283909 +0000 UTC m=+0.200807332 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1766032510, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=)
Feb 1 03:12:38 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:12:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:12:40 localhost podman[73054]: 2026-02-01 08:12:40.225832146 +0000 UTC m=+0.083924134 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, config_id=tripleo_step1)
Feb 1 03:12:40 localhost podman[73054]: 2026-02-01 08:12:40.400856308 +0000 UTC m=+0.258948256 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 1 03:12:40 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:12:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:12:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:12:52 localhost podman[73145]: 2026-02-01 08:12:52.7314705 +0000 UTC m=+0.088684909 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:12:52 localhost podman[73145]: 2026-02-01 08:12:52.779360302 +0000 UTC m=+0.136574661 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat 
OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5) Feb 1 03:12:52 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:12:52 localhost podman[73144]: 2026-02-01 08:12:52.783332592 +0000 UTC m=+0.143989086 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=) Feb 1 03:12:52 localhost podman[73144]: 2026-02-01 08:12:52.869607066 +0000 UTC m=+0.230263540 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:12:52 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:13:04 localhost systemd[1]: tmp-crun.xGvEyr.mount: Deactivated successfully. 
Feb 1 03:13:04 localhost podman[73182]: 2026-02-01 08:13:04.712704614 +0000 UTC m=+0.066148021 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 1 03:13:04 localhost podman[73182]: 2026-02-01 08:13:04.71846478 +0000 UTC m=+0.071908257 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, distribution-scope=public, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 1 03:13:04 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:13:04 localhost podman[73183]: 2026-02-01 08:13:04.780490813 +0000 UTC m=+0.130163265 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, architecture=x86_64, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:13:04 localhost podman[73181]: 2026-02-01 08:13:04.827025374 +0000 UTC m=+0.182566275 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:13:04 localhost podman[73183]: 2026-02-01 08:13:04.839205026 +0000 UTC m=+0.188877448 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, build-date=2026-01-12T23:07:30Z, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:13:04 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:13:04 localhost podman[73181]: 2026-02-01 08:13:04.882365273 +0000 UTC m=+0.237906194 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:13:04 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:13:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:13:05 localhost systemd[1]: tmp-crun.2rHLTz.mount: Deactivated successfully. 
Feb 1 03:13:05 localhost podman[73254]: 2026-02-01 08:13:05.72733528 +0000 UTC m=+0.087862934 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64) Feb 1 03:13:06 localhost podman[73254]: 2026-02-01 08:13:06.082459961 +0000 UTC m=+0.442987575 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:13:06 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:13:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:13:09 localhost podman[73279]: 2026-02-01 08:13:09.726021725 +0000 UTC m=+0.082507380 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64) Feb 1 03:13:09 localhost podman[73279]: 2026-02-01 08:13:09.770602946 +0000 UTC m=+0.127088551 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 
17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:13:09 localhost systemd[1]: tmp-crun.NM6YN7.mount: Deactivated successfully. Feb 1 03:13:09 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:13:09 localhost podman[73278]: 2026-02-01 08:13:09.785117169 +0000 UTC m=+0.141606394 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent) Feb 1 03:13:09 localhost podman[73278]: 2026-02-01 08:13:09.859560202 +0000 UTC m=+0.216049437 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:13:09 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:13:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:13:10 localhost podman[73326]: 2026-02-01 08:13:10.726826788 +0000 UTC m=+0.086562213 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:13:10 localhost podman[73326]: 2026-02-01 08:13:10.957472889 +0000 UTC m=+0.317208284 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, 
build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.13, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, release=1766032510) Feb 1 03:13:10 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:13:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:13:23 localhost recover_tripleo_nova_virtqemud[73367]: 61284 Feb 1 03:13:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:13:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:13:23 localhost podman[73356]: 2026-02-01 08:13:23.743755352 +0000 UTC m=+0.095670112 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, version=17.1.13, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:13:23 localhost systemd[1]: tmp-crun.5WP7sf.mount: Deactivated successfully. 
Feb 1 03:13:23 localhost podman[73357]: 2026-02-01 08:13:23.790923702 +0000 UTC m=+0.139136219 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5) Feb 1 03:13:23 localhost podman[73357]: 2026-02-01 08:13:23.805486187 +0000 UTC m=+0.153698764 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, tcib_managed=true, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid) Feb 1 03:13:23 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:13:23 localhost podman[73356]: 2026-02-01 08:13:23.861232559 +0000 UTC m=+0.213147279 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, release=1766032510) Feb 1 03:13:23 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:13:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:13:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:13:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:13:35 localhost podman[73399]: 2026-02-01 08:13:35.73589368 +0000 UTC m=+0.094063023 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) 
Feb 1 03:13:35 localhost podman[73399]: 2026-02-01 08:13:35.768730442 +0000 UTC m=+0.126899795 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1) Feb 1 03:13:35 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:13:35 localhost podman[73400]: 2026-02-01 08:13:35.789037952 +0000 UTC m=+0.144397959 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510) Feb 1 03:13:35 localhost podman[73400]: 2026-02-01 08:13:35.823919767 +0000 UTC m=+0.179279794 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, config_id=tripleo_step4) Feb 1 03:13:35 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:13:35 localhost podman[73401]: 2026-02-01 08:13:35.848378334 +0000 UTC m=+0.199481861 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:13:35 localhost podman[73401]: 2026-02-01 08:13:35.878886775 +0000 UTC m=+0.229990332 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:13:35 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:13:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:13:36 localhost podman[73471]: 2026-02-01 08:13:36.712948538 +0000 UTC m=+0.076711113 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, release=1766032510, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team) Feb 1 03:13:36 localhost systemd[1]: tmp-crun.2k741C.mount: Deactivated successfully. 
Feb 1 03:13:37 localhost podman[73471]: 2026-02-01 08:13:37.088299257 +0000 UTC m=+0.452061832 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com) Feb 1 03:13:37 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:13:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:13:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:13:40 localhost podman[73495]: 2026-02-01 08:13:40.737562454 +0000 UTC m=+0.095662741 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:13:40 localhost systemd[1]: tmp-crun.khDGTs.mount: Deactivated successfully. 
Feb 1 03:13:40 localhost podman[73496]: 2026-02-01 08:13:40.790310575 +0000 UTC m=+0.142735419 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:13:40 localhost podman[73495]: 2026-02-01 08:13:40.812803591 +0000 UTC m=+0.170903858 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5) Feb 1 03:13:40 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:13:40 localhost podman[73496]: 2026-02-01 08:13:40.84254025 +0000 UTC m=+0.194965104 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, distribution-scope=public, container_name=ovn_controller, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:13:40 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:13:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:13:41 localhost podman[73545]: 2026-02-01 08:13:41.718588524 +0000 UTC m=+0.080667463 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:13:41 localhost podman[73545]: 2026-02-01 08:13:41.916188427 +0000 UTC m=+0.278267326 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1) Feb 1 03:13:41 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 03:13:43 localhost podman[73676]: 2026-02-01 08:13:43.494982816 +0000 UTC m=+0.069487623 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1764794109, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 03:13:43 localhost podman[73676]: 2026-02-01 08:13:43.585848569 +0000 UTC m=+0.160353366 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1764794109, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Feb 1 03:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:13:54 localhost podman[73822]: 2026-02-01 08:13:54.746134661 +0000 UTC m=+0.093641403 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid)
Feb 1 03:13:54 localhost podman[73821]: 2026-02-01 08:13:54.79306081 +0000 UTC m=+0.141273073 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 1 03:13:54 localhost podman[73821]: 2026-02-01 08:13:54.80631517 +0000 UTC m=+0.154527423 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, container_name=collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 1 03:13:54 localhost podman[73822]: 2026-02-01 08:13:54.813416915 +0000 UTC m=+0.160923687 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 1 03:13:54 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:13:54 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:13:58 localhost python3[73908]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:13:58 localhost python3[73953]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933637.849954-112802-223274332008152/source _original_basename=tmp_5slvc61 follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:13:59 localhost python3[73983]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 1 03:14:01 localhost ansible-async_wrapper.py[74155]: Invoked with 424317264613 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933640.6032262-112934-16713074881262/AnsiballZ_command.py _
Feb 1 03:14:01 localhost ansible-async_wrapper.py[74158]: Starting module and watcher
Feb 1 03:14:01 localhost ansible-async_wrapper.py[74158]: Start watching 74159 (3600)
Feb 1 03:14:01 localhost ansible-async_wrapper.py[74159]: Start module (74159)
Feb 1 03:14:01 localhost ansible-async_wrapper.py[74155]: Return async_wrapper task started.
Feb 1 03:14:01 localhost python3[74179]: ansible-ansible.legacy.async_status Invoked with jid=424317264613.74155 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:14:04 localhost puppet-user[74163]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 1 03:14:04 localhost puppet-user[74163]: (file: /etc/puppet/hiera.yaml)
Feb 1 03:14:04 localhost puppet-user[74163]: Warning: Undefined variable '::deploy_config_name';
Feb 1 03:14:04 localhost puppet-user[74163]: (file & line not available)
Feb 1 03:14:04 localhost puppet-user[74163]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 1 03:14:04 localhost puppet-user[74163]: (file & line not available)
Feb 1 03:14:04 localhost puppet-user[74163]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:14:05 localhost puppet-user[74163]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:14:05 localhost puppet-user[74163]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:14:05 localhost puppet-user[74163]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:14:05 localhost puppet-user[74163]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:14:05 localhost puppet-user[74163]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:14:05 localhost puppet-user[74163]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:14:05 localhost puppet-user[74163]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:14:05 localhost puppet-user[74163]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:14:05 localhost puppet-user[74163]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 1 03:14:05 localhost puppet-user[74163]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 1 03:14:05 localhost puppet-user[74163]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 1 03:14:05 localhost puppet-user[74163]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 1 03:14:05 localhost puppet-user[74163]: Notice: Compiled catalog for np0005604212.localdomain in environment production in 0.21 seconds
Feb 1 03:14:05 localhost puppet-user[74163]: Notice: Applied catalog in 0.33 seconds
Feb 1 03:14:05 localhost puppet-user[74163]: Application:
Feb 1 03:14:05 localhost puppet-user[74163]: Initial environment: production
Feb 1 03:14:05 localhost puppet-user[74163]: Converged environment: production
Feb 1 03:14:05 localhost puppet-user[74163]: Run mode: user
Feb 1 03:14:05 localhost puppet-user[74163]: Changes:
Feb 1 03:14:05 localhost puppet-user[74163]: Events:
Feb 1 03:14:05 localhost puppet-user[74163]: Resources:
Feb 1 03:14:05 localhost puppet-user[74163]: Total: 19
Feb 1 03:14:05 localhost puppet-user[74163]: Time:
Feb 1 03:14:05 localhost puppet-user[74163]: Filebucket: 0.00
Feb 1 03:14:05 localhost puppet-user[74163]: Package: 0.00
Feb 1 03:14:05 localhost puppet-user[74163]: Schedule: 0.00
Feb 1 03:14:05 localhost puppet-user[74163]: Exec: 0.01
Feb 1 03:14:05 localhost puppet-user[74163]: Augeas: 0.01
Feb 1 03:14:05 localhost puppet-user[74163]: File: 0.02
Feb 1 03:14:05 localhost puppet-user[74163]: Service: 0.08
Feb 1 03:14:05 localhost puppet-user[74163]: Config retrieval: 0.27
Feb 1 03:14:05 localhost puppet-user[74163]: Transaction evaluation: 0.32
Feb 1 03:14:05 localhost puppet-user[74163]: Catalog application: 0.33
Feb 1 03:14:05 localhost puppet-user[74163]: Last run: 1769933645
Feb 1 03:14:05 localhost puppet-user[74163]: Total: 0.34
Feb 1 03:14:05 localhost puppet-user[74163]: Version:
Feb 1 03:14:05 localhost puppet-user[74163]: Config: 1769933644
Feb 1 03:14:05 localhost puppet-user[74163]: Puppet: 7.10.0
Feb 1 03:14:05 localhost ansible-async_wrapper.py[74159]: Module complete (74159)
Feb 1 03:14:06 localhost ansible-async_wrapper.py[74158]: Done in kid B.
Feb 1 03:14:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:14:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:14:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
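The puppet-user block above ends with a fixed-shape run report: bare section headers (Application, Changes, Events, Resources, Time, Version) followed by Key: value rows. A sketch that folds those journal lines back into a dict; the prefix regex matches the entries above, the rest is a best-effort assumption, and the caller is expected to filter out the Notice/Warning lines first:

    import re

    # Journal prefix of the report rows, e.g.
    # "Feb 1 03:14:05 localhost puppet-user[74163]: Config retrieval: 0.27"
    PUPPET_MSG = re.compile(r"puppet-user\[\d+\]:\s*(.*)$")

    def fold_puppet_report(lines):
        """Fold 'Section:' headers and 'Key: value' rows into a dict."""
        report, section = {}, None
        for line in lines:
            m = PUPPET_MSG.search(line)
            if m is None:
                continue
            key, _, value = m.group(1).partition(":")
            key, value = key.strip(), value.strip()
            if value == "":            # bare header, e.g. "Time:"
                section = key
                report.setdefault(section, {})
            elif section is not None:  # row, e.g. "Catalog application: 0.33"
                report[section][key] = value
        return report

    # fold_puppet_report(rows)["Time"]["Catalog application"] -> "0.33"
    # fold_puppet_report(rows)["Resources"]["Total"]          -> "19"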
Feb 1 03:14:06 localhost podman[74304]: 2026-02-01 08:14:06.385507519 +0000 UTC m=+0.090168977 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 1 03:14:06 localhost podman[74303]: 2026-02-01 08:14:06.428327033 +0000 UTC m=+0.133365773 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 1 03:14:06 localhost podman[74303]: 2026-02-01 08:14:06.440490651 +0000 UTC m=+0.145529441 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vcs-type=git, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 1 03:14:06 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:14:06 localhost podman[74304]: 2026-02-01 08:14:06.494733421 +0000 UTC m=+0.199394869 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13)
Feb 1 03:14:06 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:14:06 localhost podman[74302]: 2026-02-01 08:14:06.495670399 +0000 UTC m=+0.199394909 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4)
Feb 1 03:14:06 localhost podman[74302]: 2026-02-01 08:14:06.579580736 +0000 UTC m=+0.283305296 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, url=https://www.redhat.com, managed_by=tripleo_ansible)
Feb 1 03:14:06 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:14:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
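Each "Started /usr/bin/podman healthcheck run <id>" unit above is a one-shot transient service: podman re-runs the check baked into the container's config_data (here {'test': '/openstack/healthcheck'}), records a health_status event, and the unit deactivates. Since podman healthcheck run exits 0 only when the check passes, an ad-hoc probe reduces to an exit-code test; a minimal sketch (container names are taken from the entries above, the loop itself is illustrative):

    import subprocess

    def container_healthy(name):
        # Same invocation the transient units above run; exit code 0
        # means the configured check (e.g. /openstack/healthcheck) passed.
        return subprocess.run(
            ["podman", "healthcheck", "run", name],
            capture_output=True,
        ).returncode == 0

    for name in ("ceilometer_agent_compute", "ceilometer_agent_ipmi",
                 "logrotate_crond", "nova_migration_target"):
        print(name, "healthy" if container_healthy(name) else "unhealthy")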
Feb 1 03:14:07 localhost podman[74375]: 2026-02-01 08:14:07.728846291 +0000 UTC m=+0.087972710 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 1 03:14:08 localhost podman[74375]: 2026-02-01 08:14:08.076314006 +0000 UTC m=+0.435440345 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:14:08 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:14:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:14:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:14:11 localhost systemd[1]: tmp-crun.DUJdDA.mount: Deactivated successfully.
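The config_data=... payload embedded in these label dumps is a Python dict literal (single-quoted strings, True/False), so it round-trips through ast.literal_eval once it is cut out of the line. A sketch that extracts it by brace counting; this assumes no brace characters inside the quoted strings, which holds for the entries above:

    import ast

    def extract_config_data(labels):
        """Pull the config_data={...} literal out of a podman label dump."""
        start = labels.index("config_data=") + len("config_data=")
        depth = 0
        for i, ch in enumerate(labels[start:], start):
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:
                    return ast.literal_eval(labels[start:i + 1])
        raise ValueError("unbalanced config_data braces")

    # e.g. cfg = extract_config_data(label_dump)
    #      cfg["healthcheck"]["test"]  -> '/openstack/healthcheck'
    #      cfg["volumes"]              -> the bind-mount list shown above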
Feb 1 03:14:11 localhost podman[74398]: 2026-02-01 08:14:11.727371658 +0000 UTC m=+0.080642019 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 1 03:14:11 localhost podman[74397]: 2026-02-01 08:14:11.704105864 +0000 UTC m=+0.065862822 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:14:11 localhost podman[74397]: 2026-02-01 08:14:11.790357992 +0000 UTC m=+0.152114880 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, architecture=x86_64, release=1766032510, version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Feb 1 03:14:11 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:14:11 localhost podman[74398]: 2026-02-01 08:14:11.808696947 +0000 UTC m=+0.161967308 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 1 03:14:11 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:14:11 localhost python3[74460]: ansible-ansible.legacy.async_status Invoked with jid=424317264613.74155 mode=status _async_dir=/tmp/.ansible_async
Feb 1 03:14:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:14:12 localhost podman[74477]: 2026-02-01 08:14:12.564136345 +0000 UTC m=+0.099427937 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible)
Feb 1 03:14:12 localhost python3[74476]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 1 03:14:12 localhost podman[74477]: 2026-02-01 08:14:12.773666319 +0000 UTC m=+0.308957911 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:14:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
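The async_wrapper/async_status pairs above are Ansible's async machinery: async_wrapper forks the real module (here the puppet apply, jid 424317264613.74155, timeout 3600s) and the controller polls async_status until the job reports finished. The status lives in a JSON file under the _async_dir logged above; a polling sketch, assuming the <async_dir>/<jid> layout implied by those invocations:

    import json
    from pathlib import Path

    def async_job_status(jid, async_dir="/tmp/.ansible_async"):
        """Read the wrapper's status file; {} until the job file appears.

        Layout assumed from the jid=424317264613.74155 and
        _async_dir=/tmp/.ansible_async values logged above; the file
        carries JSON with at least 'started'/'finished' markers while
        the wrapped module runs.
        """
        path = Path(async_dir) / jid
        if not path.exists():
            return {}
        return json.loads(path.read_text())

    # e.g. async_job_status("424317264613.74155").get("finished")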
Feb 1 03:14:12 localhost python3[74521]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:14:13 localhost python3[74571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:14:13 localhost python3[74589]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpeqazx5mq recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 1 03:14:14 localhost python3[74619]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:14:15 localhost python3[74724]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 1 03:14:16 localhost python3[74743]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:14:17 localhost python3[74775]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:14:17 localhost python3[74825]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 1 03:14:17 localhost python3[74843]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:14:18 localhost python3[74905]: ansible-ansible.legacy.stat Invoked with 
path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:14:18 localhost python3[74923]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:14:19 localhost python3[74985]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:14:19 localhost python3[75003]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:14:20 localhost python3[75065]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:14:20 localhost python3[75083]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:14:20 localhost python3[75113]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:14:20 localhost systemd[1]: Reloading.
Feb 1 03:14:21 localhost systemd-rc-local-generator[75134]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:14:21 localhost systemd-sysv-generator[75137]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:14:21 localhost sshd[75147]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:14:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:14:21 localhost python3[75199]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:14:22 localhost python3[75217]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:14:22 localhost python3[75279]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 1 03:14:22 localhost python3[75297]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 03:14:23 localhost python3[75327]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:14:23 localhost systemd[1]: Reloading.
Feb 1 03:14:23 localhost systemd-sysv-generator[75353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:14:23 localhost systemd-rc-local-generator[75350]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:14:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:14:23 localhost systemd[1]: Starting Create netns directory...
Feb 1 03:14:23 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 03:14:23 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 03:14:23 localhost systemd[1]: Finished Create netns directory.
Feb 1 03:14:24 localhost python3[75384]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 1 03:14:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:14:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
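The two unit-install sequences above (tripleo-container-shutdown, then netns-placeholder) follow one pattern: copy a unit file into /usr/lib/systemd/system, drop a matching preset into /usr/lib/systemd/system-preset, then invoke the ansible systemd module with daemon_reload=True, enabled=True, state=started; the "Reloading." and generator messages are the daemon-reload side effects. A minimal Python sketch of the equivalent manual steps; only the unit names come from the log, the rest is generic illustration:

    import subprocess

    def enable_and_start(unit: str) -> None:
        # daemon_reload=True: re-read unit files so the freshly copied
        # .service and .preset files are picked up.
        subprocess.run(["systemctl", "daemon-reload"], check=True)
        # enabled=True + state=started collapse into `enable --now`.
        subprocess.run(["systemctl", "enable", "--now", unit], check=True)

    for unit in ("tripleo-container-shutdown", "netns-placeholder"):
        enable_and_start(unit)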
Feb 1 03:14:25 localhost podman[75428]: 2026-02-01 08:14:25.734674843 +0000 UTC m=+0.084737644 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd) Feb 1 03:14:25 localhost podman[75428]: 2026-02-01 08:14:25.748317955 +0000 UTC m=+0.098380766 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:14:25 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:14:25 localhost podman[75429]: 2026-02-01 08:14:25.843586526 +0000 UTC m=+0.191725538 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:14:25 localhost podman[75429]: 2026-02-01 08:14:25.860487927 +0000 UTC m=+0.208626929 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:14:25 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
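Every container event in this log carries a config_id label recording which deployment step started it: tripleo_step1 for metrics_qdr, tripleo_step3 for collectd and iscsid above, tripleo_step4 and tripleo_step5 below. To list what a given step is running, podman can filter on that label; a hedged sketch (the label name is taken from the log, the filter and JSON output are standard podman):

    import json
    import subprocess

    def containers_for_step(step: str) -> list[str]:
        out = subprocess.run(
            ["podman", "ps", "--all", "--filter", f"label=config_id={step}",
             "--format", "json"],
            check=True, capture_output=True, text=True,
        ).stdout
        # podman's JSON output lists each container's names under "Names".
        return [name for c in json.loads(out) for name in c["Names"]]

    print(containers_for_step("tripleo_step3"))  # expected here: collectd, iscsid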
Feb 1 03:14:26 localhost python3[75482]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 1 03:14:26 localhost podman[75519]: 2026-02-01 08:14:26.701464831 +0000 UTC m=+0.089917289 container create 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
io.openshift.expose-services=)
Feb 1 03:14:26 localhost systemd[1]: Started libpod-conmon-5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.scope.
Feb 1 03:14:26 localhost podman[75519]: 2026-02-01 08:14:26.656539213 +0000 UTC m=+0.044991741 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 1 03:14:26 localhost systemd[1]: Started libcrun container.
Feb 1 03:14:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44357a368e781a4d23f5b66e408c1324409c46bcb32da06b3ba5ae9fe4a403b2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 1 03:14:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44357a368e781a4d23f5b66e408c1324409c46bcb32da06b3ba5ae9fe4a403b2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 1 03:14:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44357a368e781a4d23f5b66e408c1324409c46bcb32da06b3ba5ae9fe4a403b2/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 1 03:14:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44357a368e781a4d23f5b66e408c1324409c46bcb32da06b3ba5ae9fe4a403b2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 1 03:14:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44357a368e781a4d23f5b66e408c1324409c46bcb32da06b3ba5ae9fe4a403b2/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 03:14:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:14:26 localhost podman[75519]: 2026-02-01 08:14:26.794154313 +0000 UTC m=+0.182606751 container init 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro',
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, container_name=nova_compute) Feb 1 03:14:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:14:26 localhost podman[75519]: 2026-02-01 08:14:26.824332756 +0000 UTC m=+0.212785174 container start 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, architecture=x86_64, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute) Feb 1 03:14:26 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 1 03:14:26 localhost python3[75482]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 03:14:26 localhost systemd[1]: Created slice User Slice of UID 0. Feb 1 03:14:26 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 1 03:14:26 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 1 03:14:26 localhost systemd[1]: Starting User Manager for UID 0... 
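The PODMAN-CONTAINER-DEBUG entry above shows the podman run command that ansible-tripleo_container_manage derived from the container's config_data. The mapping is mechanical: environment entries become --env, healthcheck.test becomes --healthcheck-command, net becomes --network, each ulimit becomes --ulimit, and each volume becomes --volume. A rough illustrative sketch of that translation, not the module's actual code (bookkeeping flags such as --conmon-pidfile, --label and --log-driver are omitted):

    def podman_run_args(name: str, cfg: dict) -> list[str]:
        # Rebuild `podman run` flags from a config_data dict, following the
        # flag mapping visible in the PODMAN-CONTAINER-DEBUG entry above.
        args = ["podman", "run", "--name", name, "--detach=True"]
        for key, value in cfg.get("environment", {}).items():
            args += ["--env", f"{key}={value}"]
        if "healthcheck" in cfg:
            args += ["--healthcheck-command", cfg["healthcheck"]["test"]]
        if "ipc" in cfg:
            args += ["--ipc", cfg["ipc"]]
        if "net" in cfg:
            args += ["--network", cfg["net"]]
        if cfg.get("privileged"):
            args += ["--privileged=True"]
        for limit in cfg.get("ulimit", []):
            args += ["--ulimit", limit]
        if "user" in cfg:
            args += ["--user", cfg["user"]]
        for volume in cfg.get("volumes", []):
            args += ["--volume", volume]
        return args + [cfg["image"]]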
Feb 1 03:14:26 localhost podman[75541]: 2026-02-01 08:14:26.93065437 +0000 UTC m=+0.095491828 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:14:26 localhost podman[75541]: 2026-02-01 08:14:26.990824729 +0000 UTC m=+0.155662197 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vendor=Red Hat, Inc., architecture=x86_64)
Feb 1 03:14:27 localhost podman[75541]: unhealthy
Feb 1 03:14:27 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:14:27 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'.
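Here the first health probe of nova_compute ('/openstack/healthcheck 5672') reports unhealthy, and the transient .service unit that ran it exits with status 1; since the container started only a second earlier, a failing first probe is plausible while the service is still coming up. The current state can be read back from podman inspect; a small sketch (the "Health" key is what current podman emits, the "Healthcheck" fallback is an assumption for older releases):

    import json
    import subprocess

    def health_status(container: str) -> str:
        out = subprocess.run(["podman", "inspect", container],
                             check=True, capture_output=True, text=True).stdout
        state = json.loads(out)[0]["State"]
        # Key name varies across podman versions (assumption; hence fallback).
        health = state.get("Health") or state.get("Healthcheck") or {}
        return health.get("Status", "none")

    print(health_status("nova_compute"))  # "starting", "healthy" or "unhealthy"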
Feb 1 03:14:27 localhost systemd[75555]: Queued start job for default target Main User Target.
Feb 1 03:14:27 localhost systemd[75555]: Created slice User Application Slice.
Feb 1 03:14:27 localhost systemd[75555]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 1 03:14:27 localhost systemd[75555]: Started Daily Cleanup of User's Temporary Directories.
Feb 1 03:14:27 localhost systemd[75555]: Reached target Paths.
Feb 1 03:14:27 localhost systemd[75555]: Reached target Timers.
Feb 1 03:14:27 localhost systemd[75555]: Starting D-Bus User Message Bus Socket...
Feb 1 03:14:27 localhost systemd[75555]: Starting Create User's Volatile Files and Directories...
Feb 1 03:14:27 localhost systemd[75555]: Finished Create User's Volatile Files and Directories.
Feb 1 03:14:27 localhost systemd[75555]: Listening on D-Bus User Message Bus Socket.
Feb 1 03:14:27 localhost systemd[75555]: Reached target Sockets.
Feb 1 03:14:27 localhost systemd[75555]: Reached target Basic System.
Feb 1 03:14:27 localhost systemd[75555]: Reached target Main User Target.
Feb 1 03:14:27 localhost systemd[75555]: Startup finished in 148ms.
Feb 1 03:14:27 localhost systemd[1]: Started User Manager for UID 0.
Feb 1 03:14:27 localhost systemd[1]: Started Session c10 of User root.
Feb 1 03:14:27 localhost systemd[1]: session-c10.scope: Deactivated successfully.
Feb 1 03:14:27 localhost podman[75642]: 2026-02-01 08:14:27.331105187 +0000 UTC m=+0.076349280 container create 75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_wait_for_compute_service, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro',
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:14:27 localhost systemd[1]: Started libpod-conmon-75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea.scope. Feb 1 03:14:27 localhost podman[75642]: 2026-02-01 08:14:27.291946753 +0000 UTC m=+0.037190936 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 03:14:27 localhost systemd[1]: Started libcrun container. Feb 1 03:14:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7924ed6840b9bc63ce56434aa13c39d9012b17069c1e374088a78eb283caf4a/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 1 03:14:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7924ed6840b9bc63ce56434aa13c39d9012b17069c1e374088a78eb283caf4a/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 1 03:14:27 localhost podman[75642]: 2026-02-01 08:14:27.429096909 +0000 UTC m=+0.174341002 container init 75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:14:27 localhost podman[75642]: 2026-02-01 08:14:27.441538295 +0000 UTC m=+0.186782428 container start 75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_wait_for_compute_service, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13) Feb 1 03:14:27 localhost podman[75642]: 2026-02-01 08:14:27.442009699 +0000 UTC m=+0.187253852 container attach 75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_id=tripleo_step5, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510) Feb 1 03:14:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:14:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:14:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:14:36 localhost systemd[1]: tmp-crun.qpFrym.mount: Deactivated successfully. 
Feb 1 03:14:36 localhost podman[75666]: 2026-02-01 08:14:36.73571732 +0000 UTC m=+0.097292053 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 1 03:14:36 localhost systemd[1]: tmp-crun.GgfVLR.mount: Deactivated successfully. 
Feb 1 03:14:36 localhost podman[75665]: 2026-02-01 08:14:36.786044681 +0000 UTC m=+0.146874401 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, container_name=ceilometer_agent_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510) Feb 1 03:14:36 localhost podman[75665]: 2026-02-01 08:14:36.816356597 +0000 UTC m=+0.177186367 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, maintainer=OpenStack TripleO Team) Feb 1 03:14:36 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:14:36 localhost podman[75667]: 2026-02-01 08:14:36.832745973 +0000 UTC m=+0.189378266 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.13) Feb 1 03:14:36 localhost podman[75666]: 2026-02-01 08:14:36.852727077 +0000 UTC m=+0.214301770 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Feb 1 03:14:36 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:14:36 localhost podman[75667]: 2026-02-01 08:14:36.86437393 +0000 UTC m=+0.221006183 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 1 03:14:36 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:14:37 localhost systemd[1]: Stopping User Manager for UID 0... Feb 1 03:14:37 localhost systemd[75555]: Activating special unit Exit the Session... Feb 1 03:14:37 localhost systemd[75555]: Stopped target Main User Target. Feb 1 03:14:37 localhost systemd[75555]: Stopped target Basic System. Feb 1 03:14:37 localhost systemd[75555]: Stopped target Paths. Feb 1 03:14:37 localhost systemd[75555]: Stopped target Sockets. Feb 1 03:14:37 localhost systemd[75555]: Stopped target Timers. Feb 1 03:14:37 localhost systemd[75555]: Stopped Daily Cleanup of User's Temporary Directories. Feb 1 03:14:37 localhost systemd[75555]: Closed D-Bus User Message Bus Socket. Feb 1 03:14:37 localhost systemd[75555]: Stopped Create User's Volatile Files and Directories. 
Feb 1 03:14:37 localhost systemd[75555]: Removed slice User Application Slice. Feb 1 03:14:37 localhost systemd[75555]: Reached target Shutdown. Feb 1 03:14:37 localhost systemd[75555]: Finished Exit the Session. Feb 1 03:14:37 localhost systemd[75555]: Reached target Exit the Session. Feb 1 03:14:37 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 1 03:14:37 localhost systemd[1]: Stopped User Manager for UID 0. Feb 1 03:14:37 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 1 03:14:37 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 1 03:14:37 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 1 03:14:37 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 1 03:14:37 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 1 03:14:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:14:38 localhost podman[75738]: 2026-02-01 08:14:38.724676161 +0000 UTC m=+0.081077172 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1766032510, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container) Feb 1 03:14:39 localhost podman[75738]: 2026-02-01 08:14:39.088443558 +0000 UTC m=+0.444844539 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:14:39 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:14:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:14:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:14:42 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:14:42 localhost recover_tripleo_nova_virtqemud[75774]: 61284 Feb 1 03:14:42 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 1 03:14:42 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:14:42 localhost podman[75761]: 2026-02-01 08:14:42.724785085 +0000 UTC m=+0.083939769 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13) Feb 1 03:14:42 localhost podman[75762]: 2026-02-01 08:14:42.781735857 +0000 UTC m=+0.138355085 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:14:42 localhost podman[75761]: 2026-02-01 08:14:42.781636904 +0000 UTC m=+0.140791558 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, container_name=ovn_metadata_agent, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:14:42 localhost podman[75762]: 2026-02-01 08:14:42.806435114 +0000 UTC m=+0.163054412 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510) Feb 1 03:14:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:14:42 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:14:42 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:14:42 localhost podman[75811]: 2026-02-01 08:14:42.931421862 +0000 UTC m=+0.092567750 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, distribution-scope=public) Feb 1 03:14:43 localhost podman[75811]: 2026-02-01 08:14:43.155717783 +0000 UTC m=+0.316863701 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:14:43 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:14:54 localhost sshd[75916]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:14:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:14:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:14:56 localhost systemd[1]: tmp-crun.sa9uOH.mount: Deactivated successfully. 
Feb 1 03:14:56 localhost podman[75919]: 2026-02-01 08:14:56.724976688 +0000 UTC m=+0.080876116 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid) Feb 1 03:14:56 localhost systemd[1]: tmp-crun.LmJ0zu.mount: Deactivated successfully. 
Feb 1 03:14:56 localhost podman[75919]: 2026-02-01 08:14:56.741112336 +0000 UTC m=+0.097011754 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, com.redhat.component=openstack-iscsid-container) Feb 1 03:14:56 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:14:56 localhost podman[75918]: 2026-02-01 08:14:56.751970284 +0000 UTC m=+0.107770529 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, architecture=x86_64) Feb 1 03:14:56 localhost podman[75918]: 2026-02-01 08:14:56.835712086 +0000 UTC m=+0.191512321 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:14:56 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:14:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
Feb 1 03:14:57 localhost podman[75956]: 2026-02-01 08:14:57.721968799 +0000 UTC m=+0.083152525 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, container_name=nova_compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container) Feb 1 03:14:57 localhost podman[75956]: 2026-02-01 08:14:57.787333905 +0000 UTC m=+0.148517581 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:14:57 localhost podman[75956]: unhealthy Feb 1 03:14:57 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:14:57 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
Feb 1 03:15:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:15:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:15:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:15:07 localhost podman[75981]: 2026-02-01 08:15:07.736145764 +0000 UTC m=+0.088223539 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:15:07 localhost podman[75981]: 2026-02-01 08:15:07.770393819 +0000 UTC m=+0.122471544 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 1 03:15:07 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:15:07 localhost podman[75979]: 2026-02-01 08:15:07.784801745 +0000 UTC m=+0.140131378 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:15:07 localhost podman[75980]: 2026-02-01 08:15:07.833413264 +0000 UTC m=+0.186782227 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, batch=17.1_20260112.1) Feb 1 03:15:07 localhost podman[75979]: 2026-02-01 08:15:07.842351415 +0000 UTC m=+0.197681018 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:15:07 localhost podman[75980]: 2026-02-01 08:15:07.84454091 +0000 UTC m=+0.197909883 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:15:07 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:15:07 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:15:08 localhost systemd[1]: tmp-crun.wxg0VV.mount: Deactivated successfully. Feb 1 03:15:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:15:09 localhost podman[76047]: 2026-02-01 08:15:09.737308574 +0000 UTC m=+0.094618192 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git) Feb 1 03:15:10 localhost podman[76047]: 2026-02-01 08:15:10.11985998 +0000 UTC m=+0.477169558 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:15:10 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:15:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:15:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:15:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:15:13 localhost podman[76072]: 2026-02-01 08:15:13.713574997 +0000 UTC m=+0.066715287 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, architecture=x86_64, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:15:13 localhost podman[76072]: 2026-02-01 08:15:13.74177682 +0000 UTC m=+0.094917160 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:15:13 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:15:13 localhost systemd[1]: tmp-crun.plJaql.mount: Deactivated successfully. Feb 1 03:15:13 localhost podman[76070]: 2026-02-01 08:15:13.842905898 +0000 UTC m=+0.200631057 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true) Feb 1 03:15:13 localhost systemd[1]: tmp-crun.QEUeDz.mount: Deactivated successfully. Feb 1 03:15:13 localhost podman[76071]: 2026-02-01 08:15:13.893294451 +0000 UTC m=+0.245869985 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:15:13 localhost podman[76071]: 2026-02-01 08:15:13.950405827 +0000 UTC m=+0.302981691 container exec_died 
f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 1 03:15:13 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:15:14 localhost podman[76070]: 2026-02-01 08:15:14.035331784 +0000 UTC m=+0.393056923 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 1 03:15:14 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:15:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:15:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:15:27 localhost systemd[1]: tmp-crun.re25Jv.mount: Deactivated successfully. 
Feb 1 03:15:27 localhost podman[76147]: 2026-02-01 08:15:27.782764166 +0000 UTC m=+0.139293332 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, tcib_managed=true, container_name=iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:15:27 localhost podman[76147]: 2026-02-01 08:15:27.795563453 +0000 UTC m=+0.152092639 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:15:27 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:15:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
Feb 1 03:15:27 localhost podman[76146]: 2026-02-01 08:15:27.746872221 +0000 UTC m=+0.104068707 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13) Feb 1 03:15:27 localhost podman[76146]: 2026-02-01 08:15:27.880587404 +0000 UTC m=+0.237783850 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat 
OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, container_name=collectd, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:15:27 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:15:27 localhost podman[76184]: 2026-02-01 08:15:27.931673138 +0000 UTC m=+0.086030932 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., distribution-scope=public) Feb 1 03:15:27 localhost podman[76184]: 2026-02-01 08:15:27.968403278 +0000 UTC m=+0.122761082 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true) Feb 1 03:15:27 localhost podman[76184]: unhealthy Feb 1 03:15:27 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:15:27 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
Feb 1 03:15:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:15:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:15:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:15:38 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:15:38 localhost recover_tripleo_nova_virtqemud[76227]: 61284 Feb 1 03:15:38 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:15:38 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:15:38 localhost systemd[1]: tmp-crun.tvhV7x.mount: Deactivated successfully. Feb 1 03:15:38 localhost podman[76207]: 2026-02-01 08:15:38.712121647 +0000 UTC m=+0.069452271 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, distribution-scope=public, io.openshift.expose-services=) Feb 1 03:15:38 localhost podman[76211]: 2026-02-01 08:15:38.72774718 +0000 UTC m=+0.076363931 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team) Feb 1 03:15:38 localhost podman[76211]: 2026-02-01 08:15:38.762247632 +0000 UTC m=+0.110864403 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 1 03:15:38 localhost podman[76208]: 2026-02-01 08:15:38.770633996 +0000 UTC m=+0.123935518 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 
17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:15:38 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:15:38 localhost podman[76208]: 2026-02-01 08:15:38.782371651 +0000 UTC m=+0.135673163 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:15:38 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:15:38 localhost podman[76207]: 2026-02-01 08:15:38.812746749 +0000 UTC m=+0.170077423 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510) Feb 1 03:15:38 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:15:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:15:40 localhost podman[76279]: 2026-02-01 08:15:40.711346458 +0000 UTC m=+0.073135662 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:15:41 localhost podman[76279]: 2026-02-01 08:15:41.095786631 +0000 UTC m=+0.457575835 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 1 03:15:41 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:15:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
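Each of these events also embeds the container's full TripleO-rendered runtime configuration in the config_data label, serialized as a Python dict literal (single-quoted strings, True/False booleans, nested dicts and lists). A sketch, assuming the label layout shown in these lines, that recovers the dict and reads the healthcheck command back out of it:

import ast
import re

# config_data={...} is a Python dict literal inside the label dump. Inner
# dicts close with "}, '" (a quoted key follows), so the first "}" that is
# followed by a bare key=value label or ")" ends the config_data value.
CONFIG = re.compile(r"config_data=(\{.*?\})(?=, [A-Za-z_][\w.-]*=|\))")

def container_config(line):
    # Return the config_data dict from one podman event line, or None.
    m = CONFIG.search(line)
    return ast.literal_eval(m.group(1)) if m else None

# Hypothetical usage against the nova_compute event above:
#   cfg = container_config(line)
#   cfg['healthcheck']['test']   -> '/openstack/healthcheck 5672'
#   cfg['restart']               -> 'always'
#   len(cfg['volumes'])          -> number of bind mounts

This is how the failing check can be read straight off the log: nova_compute's test is '/openstack/healthcheck 5672', i.e. it probes the AMQP port, whereas the containers reporting healthy run the plain '/openstack/healthcheck'.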
Feb 1 03:15:44 localhost podman[76303]: 2026-02-01 08:15:44.711143231 +0000 UTC m=+0.068044449 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, tcib_managed=true, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4) Feb 1 03:15:44 localhost podman[76302]: 2026-02-01 08:15:44.765347329 +0000 UTC m=+0.124921648 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, 
build-date=2026-01-12T22:10:14Z, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) 
Feb 1 03:15:44 localhost podman[76303]: 2026-02-01 08:15:44.779204748 +0000 UTC m=+0.136105926 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public) Feb 1 03:15:44 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:15:44 localhost podman[76304]: 2026-02-01 08:15:44.834549382 +0000 UTC m=+0.187672745 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:15:44 localhost podman[76304]: 2026-02-01 08:15:44.859400872 +0000 UTC m=+0.212524275 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:15:44 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:15:44 localhost podman[76302]: 2026-02-01 08:15:44.962765507 +0000 UTC m=+0.322339826 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, 
container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:15:44 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:15:50 localhost systemd[1]: session-27.scope: Deactivated successfully. Feb 1 03:15:50 localhost systemd[1]: session-27.scope: Consumed 2.994s CPU time. Feb 1 03:15:50 localhost systemd-logind[759]: Session 27 logged out. Waiting for processes to exit. Feb 1 03:15:50 localhost systemd-logind[759]: Removed session 27. Feb 1 03:15:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:15:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:15:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:15:58 localhost systemd[1]: tmp-crun.g4tA2R.mount: Deactivated successfully. Feb 1 03:15:58 localhost podman[76456]: 2026-02-01 08:15:58.742134779 +0000 UTC m=+0.093716343 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:15:58 localhost podman[76455]: 2026-02-01 08:15:58.756909429 +0000 UTC m=+0.110196195 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:15:58 localhost podman[76455]: 2026-02-01 08:15:58.795466773 +0000 UTC m=+0.148753539 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5) Feb 1 03:15:58 localhost systemd[1]: tmp-crun.0wwiXi.mount: Deactivated successfully. Feb 1 03:15:58 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:15:58 localhost podman[76457]: 2026-02-01 08:15:58.817635888 +0000 UTC m=+0.167721037 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, config_id=tripleo_step3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 1 03:15:58 localhost podman[76456]: 2026-02-01 08:15:58.818445092 +0000 UTC m=+0.170026706 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=nova_compute, vcs-type=git) Feb 1 03:15:58 localhost podman[76457]: 2026-02-01 08:15:58.829257012 +0000 UTC m=+0.179342171 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5) Feb 1 03:15:58 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:15:58 localhost podman[76456]: unhealthy Feb 1 03:15:58 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:15:58 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. Feb 1 03:16:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:16:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:16:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:16:09 localhost systemd[1]: tmp-crun.4LhNxq.mount: Deactivated successfully. 
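nova_compute is checked at 08:15:27.93 and again at 08:15:58.74; the ceilometer, cron, and ipmi checks land at 08:15:38 and then 08:16:09. The transient units therefore recur roughly every 31 seconds per container, which would be consistent with podman's default 30-second health interval plus the run time of the check itself. A sketch that measures that spacing from the podman-side timestamps, under the same line-layout assumptions as above and with "messages.log" again hypothetical:

import re
from datetime import datetime

EVENT = re.compile(
    r"podman\[\d+\]: (?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
    r"\+0000 UTC .*? container health_status [0-9a-f]{64} "
    r"\(image=[^,]+, name=(?P<name>[^,)]+)"
)

def check_intervals(lines):
    # Yield (container_name, seconds_since_previous_check).
    last = {}
    for line in lines:
        m = EVENT.search(line)
        if not m:
            continue
        # Trim podman's nanoseconds to microseconds so fromisoformat accepts it.
        ts = datetime.fromisoformat(m.group("ts")[:26])
        name = m.group("name")
        if name in last:
            yield name, (ts - last[name]).total_seconds()
        last[name] = ts

# Hypothetical usage; expect values a little over 30 s for these containers:
#   for name, dt in check_intervals(open("messages.log")):
#       print(f"{name}: {dt:.1f} s since previous check")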
Feb 1 03:16:09 localhost podman[76515]: 2026-02-01 08:16:09.738602226 +0000 UTC m=+0.086647368 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.13, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:16:09 localhost podman[76515]: 2026-02-01 08:16:09.747974092 +0000 UTC m=+0.096019194 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 1 03:16:09 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:16:09 localhost podman[76514]: 2026-02-01 08:16:09.788470564 +0000 UTC m=+0.139546449 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Feb 1 03:16:09 localhost podman[76514]: 2026-02-01 08:16:09.846762499 +0000 UTC m=+0.197838404 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:47Z, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:16:09 localhost podman[76516]: 2026-02-01 08:16:09.854117223 +0000 UTC m=+0.195614796 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, architecture=x86_64, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible) Feb 1 03:16:09 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:16:09 localhost podman[76516]: 2026-02-01 08:16:09.910565512 +0000 UTC m=+0.252063105 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true) Feb 1 03:16:09 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:16:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:16:11 localhost podman[76585]: 2026-02-01 08:16:11.71484492 +0000 UTC m=+0.073347834 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:16:12 localhost podman[76585]: 2026-02-01 08:16:12.027435306 +0000 UTC m=+0.385938220 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, maintainer=OpenStack TripleO Team) Feb 1 03:16:12 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:16:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:16:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:16:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:16:15 localhost systemd[1]: tmp-crun.aqq5aW.mount: Deactivated successfully. 
Feb 1 03:16:15 localhost podman[76609]: 2026-02-01 08:16:15.739532063 +0000 UTC m=+0.085249037 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z) Feb 1 03:16:15 localhost podman[76610]: 2026-02-01 08:16:15.726070273 +0000 UTC m=+0.067620749 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, tcib_managed=true, vcs-type=git) Feb 1 03:16:15 localhost systemd[1]: tmp-crun.cbBQrD.mount: Deactivated successfully. 
Feb 1 03:16:15 localhost podman[76608]: 2026-02-01 08:16:15.784178332 +0000 UTC m=+0.135281480 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:16:15 localhost podman[76609]: 2026-02-01 08:16:15.792302409 +0000 UTC m=+0.138019403 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, batch=17.1_20260112.1, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5) Feb 1 03:16:15 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:16:15 localhost podman[76610]: 2026-02-01 08:16:15.815328661 +0000 UTC m=+0.156879137 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:16:15 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:16:15 localhost podman[76608]: 2026-02-01 08:16:15.96740032 +0000 UTC m=+0.318503558 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, container_name=metrics_qdr, release=1766032510, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:16:15 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:16:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:16:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:16:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:16:29 localhost podman[76681]: 2026-02-01 08:16:29.735323055 +0000 UTC m=+0.089504636 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public) Feb 1 03:16:29 localhost podman[76681]: 2026-02-01 08:16:29.779695405 +0000 UTC m=+0.133876986 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:16:29 localhost podman[76681]: unhealthy Feb 1 03:16:29 localhost podman[76682]: 2026-02-01 08:16:29.78807957 +0000 UTC m=+0.138872448 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step3, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:16:29 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:16:29 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
Feb 1 03:16:29 localhost podman[76682]: 2026-02-01 08:16:29.802379627 +0000 UTC m=+0.153172525 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:16:29 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:16:29 localhost podman[76680]: 2026-02-01 08:16:29.71644522 +0000 UTC m=+0.074217011 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:16:29 localhost podman[76680]: 2026-02-01 08:16:29.845827119 +0000 UTC m=+0.203598950 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com) Feb 1 03:16:29 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:16:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:16:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:16:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:16:40 localhost systemd[1]: tmp-crun.jmPKp4.mount: Deactivated successfully. 
Feb 1 03:16:40 localhost podman[76742]: 2026-02-01 08:16:40.732461609 +0000 UTC m=+0.093335382 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Feb 1 03:16:40 localhost systemd[1]: tmp-crun.LQjbQ2.mount: Deactivated successfully. 
Feb 1 03:16:40 localhost podman[76742]: 2026-02-01 08:16:40.779284605 +0000 UTC m=+0.140158338 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 1 03:16:40 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:16:40 localhost podman[76741]: 2026-02-01 08:16:40.831531766 +0000 UTC m=+0.191786700 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron) Feb 1 03:16:40 localhost podman[76741]: 2026-02-01 08:16:40.865543891 +0000 UTC m=+0.225798885 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:16:40 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:16:40 localhost podman[76740]: 2026-02-01 08:16:40.782041559 +0000 UTC m=+0.141791778 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, vcs-type=git, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:16:40 localhost podman[76740]: 2026-02-01 08:16:40.914549493 +0000 UTC m=+0.274299662 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:07:47Z) Feb 1 03:16:40 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:16:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:16:42 localhost podman[76810]: 2026-02-01 08:16:42.723575085 +0000 UTC m=+0.084960497 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:16:43 localhost podman[76810]: 2026-02-01 08:16:43.041774652 +0000 UTC m=+0.403160084 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 1 03:16:43 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:16:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:16:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:16:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:16:46 localhost podman[76834]: 2026-02-01 08:16:46.734452238 +0000 UTC m=+0.090621210 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:16:46 localhost podman[76835]: 2026-02-01 08:16:46.719788301 +0000 UTC m=+0.077988234 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, 
container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:16:46 localhost systemd[1]: tmp-crun.zeERDq.mount: Deactivated successfully. 
Feb 1 03:16:46 localhost podman[76833]: 2026-02-01 08:16:46.796368993 +0000 UTC m=+0.155642320 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container) Feb 1 03:16:46 localhost podman[76835]: 2026-02-01 08:16:46.804428359 +0000 UTC m=+0.162628302 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Feb 1 03:16:46 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:16:46 localhost podman[76834]: 2026-02-01 08:16:46.821484488 +0000 UTC m=+0.177653440 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, architecture=x86_64) Feb 1 03:16:46 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:16:47 localhost podman[76833]: 2026-02-01 08:16:47.019500825 +0000 UTC m=+0.378774102 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1) Feb 1 03:16:47 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:16:58 localhost sshd[76989]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:17:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:17:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:17:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:17:00 localhost systemd[1]: tmp-crun.ZzAUgr.mount: Deactivated successfully. Feb 1 03:17:00 localhost podman[76993]: 2026-02-01 08:17:00.742443202 +0000 UTC m=+0.085825724 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 1 03:17:00 localhost podman[76993]: 2026-02-01 08:17:00.77588648 +0000 UTC m=+0.119269022 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git) Feb 1 03:17:00 localhost podman[76991]: 2026-02-01 08:17:00.780409788 +0000 UTC m=+0.127030448 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, container_name=collectd, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 1 03:17:00 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:17:00 localhost podman[76992]: 2026-02-01 08:17:00.843130267 +0000 UTC m=+0.189536491 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:17:00 localhost podman[76991]: 2026-02-01 08:17:00.865355804 +0000 UTC m=+0.211976484 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:17:00 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:17:00 localhost podman[76992]: 2026-02-01 08:17:00.904587678 +0000 UTC m=+0.250993912 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step5) Feb 1 03:17:00 localhost podman[76992]: unhealthy Feb 1 03:17:00 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:17:00 localhost systemd[1]: 
5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. Feb 1 03:17:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:17:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:17:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:17:11 localhost systemd[1]: tmp-crun.s5VT4L.mount: Deactivated successfully. Feb 1 03:17:11 localhost podman[77054]: 2026-02-01 08:17:11.74717302 +0000 UTC m=+0.104248265 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.13, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:17:11 localhost podman[77054]: 2026-02-01 08:17:11.752643366 +0000 UTC m=+0.109718621 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, container_name=logrotate_crond, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, architecture=x86_64) Feb 1 03:17:11 localhost podman[77060]: 2026-02-01 08:17:11.763667422 +0000 UTC m=+0.122044816 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git) Feb 1 03:17:11 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:17:11 localhost podman[77060]: 2026-02-01 08:17:11.786375783 +0000 UTC m=+0.144753207 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git) Feb 1 03:17:11 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:17:11 localhost podman[77053]: 2026-02-01 08:17:11.702270083 +0000 UTC m=+0.067688492 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:17:11 localhost podman[77053]: 2026-02-01 08:17:11.833032854 +0000 
UTC m=+0.198451213 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible) Feb 1 03:17:11 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:17:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
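The nova_compute sequence above is the failure path of that lifecycle: the check script exits non-zero, podman prints the verdict ("unhealthy") and exits 1, and systemd marks the container's transient unit Failed with result 'exit-code'. A minimal sketch, assuming journal lines shaped like these, that pairs the two signals when scanning a dump:

    import re
    import sys

    # Patterns match the line shapes in this log; the 64-hex service name is
    # the full container id systemd uses for the transient healthcheck unit.
    VERDICT = re.compile(r"podman\[(\d+)\]: (healthy|unhealthy)\b")
    FAILED = re.compile(
        r"systemd\[1\]: ([0-9a-f]{64})\.service: Failed with result 'exit-code'"
    )

    for line in sys.stdin:
        m = VERDICT.search(line)
        if m and m.group(2) == "unhealthy":
            print("unhealthy verdict from podman pid", m.group(1))
        m = FAILED.search(line)
        if m:
            print("healthcheck unit failed for container", m.group(1)[:12])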
Feb 1 03:17:13 localhost podman[77126]: 2026-02-01 08:17:13.721122833 +0000 UTC m=+0.082821113 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public) Feb 1 03:17:14 localhost podman[77126]: 2026-02-01 08:17:14.100525363 +0000 UTC m=+0.462223603 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:17:14 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:17:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:17:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:17:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:17:17 localhost podman[77151]: 2026-02-01 08:17:17.715127003 +0000 UTC m=+0.072769146 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:17:17 localhost podman[77150]: 2026-02-01 08:17:17.787440475 +0000 UTC m=+0.148572455 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:17:17 localhost podman[77151]: 2026-02-01 08:17:17.78993564 +0000 UTC m=+0.147577783 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-type=git) Feb 1 03:17:17 localhost podman[77152]: 2026-02-01 08:17:17.830604058 +0000 UTC m=+0.185129027 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, 
vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 1 03:17:17 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:17:17 localhost podman[77152]: 2026-02-01 08:17:17.853133324 +0000 UTC m=+0.207658363 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com) Feb 1 03:17:17 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
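Every podman event above carries a config_data label: the container definition rendered by tripleo_ansible (image, healthcheck test, bind mounts, privileges). It is printed as a Python literal (single quotes, bare True), not JSON, so ast.literal_eval recovers it for auditing. A sketch using an abbreviated fragment of the ovn_controller entry above:

    import ast

    # Abbreviated from the ovn_controller config_data label above; the label
    # is a Python dict literal, so json.loads would reject it.
    config_data = ("{'depends_on': ['openvswitch.service'], "
                   "'healthcheck': {'test': '/openstack/healthcheck 6642'}, "
                   "'net': 'host', 'privileged': True, 'user': 'root'}")
    cfg = ast.literal_eval(config_data)
    print(cfg["healthcheck"]["test"])   # /openstack/healthcheck 6642
    print(cfg["privileged"])            # True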
Feb 1 03:17:18 localhost podman[77150]: 2026-02-01 08:17:18.026510842 +0000 UTC m=+0.387642792 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.expose-services=, version=17.1.13) Feb 1 03:17:18 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:17:18 localhost systemd[1]: tmp-crun.QXyNRM.mount: Deactivated successfully. Feb 1 03:17:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:17:29 localhost recover_tripleo_nova_virtqemud[77228]: 61284 Feb 1 03:17:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:17:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:17:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:17:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
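The nova_compute container (5995785e…) was last checked at 08:17:00.9 and is re-checked here at 08:17:31.7, consistent with podman's default 30-second healthcheck interval. A quick check of that cadence from the two event timestamps (truncated from nanoseconds to microseconds for fromisoformat):

    from datetime import datetime

    # Timestamps taken from the two nova_compute healthcheck events above.
    t1 = datetime.fromisoformat("2026-02-01 08:17:00.904587")
    t2 = datetime.fromisoformat("2026-02-01 08:17:31.748521")
    print((t2 - t1).total_seconds())  # ~30.8 s, consistent with a 30 s interval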
Feb 1 03:17:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:17:31 localhost podman[77297]: 2026-02-01 08:17:31.776866142 +0000 UTC m=+0.126682597 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, architecture=x86_64, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 1 03:17:31 localhost podman[77295]: 2026-02-01 08:17:31.782898766 +0000 UTC m=+0.138457756 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:17:31 localhost podman[77295]: 2026-02-01 08:17:31.790825407 +0000 UTC m=+0.146384387 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:17:31 localhost podman[77296]: 2026-02-01 08:17:31.748521469 +0000 UTC m=+0.098302503 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 1 03:17:31 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:17:31 localhost podman[77297]: 2026-02-01 08:17:31.81126757 +0000 UTC m=+0.161084035 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, config_id=tripleo_step3, container_name=iscsid, build-date=2026-01-12T22:34:43Z, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 1 03:17:31 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:17:31 localhost podman[77296]: 2026-02-01 08:17:31.828732331 +0000 UTC m=+0.178513355 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, 
Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:17:31 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:17:40 localhost systemd[1]: libpod-75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea.scope: Deactivated successfully. Feb 1 03:17:40 localhost podman[77385]: 2026-02-01 08:17:40.186457035 +0000 UTC m=+0.060624176 container died 75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 1 03:17:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea-userdata-shm.mount: Deactivated successfully. 
Feb 1 03:17:40 localhost systemd[1]: var-lib-containers-storage-overlay-f7924ed6840b9bc63ce56434aa13c39d9012b17069c1e374088a78eb283caf4a-merged.mount: Deactivated successfully. Feb 1 03:17:40 localhost podman[77385]: 2026-02-01 08:17:40.221928795 +0000 UTC m=+0.096095886 container cleanup 75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, container_name=nova_wait_for_compute_service, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:17:40 localhost systemd[1]: libpod-conmon-75ccf15597b9e81f11e0faab362c9b50b3d503956b00d24165dc31e9163d3eea.scope: Deactivated successfully. 
Feb 1 03:17:40 localhost python3[75482]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=9ec539c069b98a16ced7663e9b12641d --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 1 03:17:40 localhost python3[77435]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:17:41 localhost python3[77451]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 1 03:17:41 localhost python3[77512]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933861.1532013-117810-55102427376802/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False 
_original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:17:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:17:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:17:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:17:41 localhost podman[77530]: 2026-02-01 08:17:41.954274463 +0000 UTC m=+0.084036350 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=) Feb 1 03:17:41 localhost podman[77530]: 2026-02-01 08:17:41.98636871 +0000 UTC m=+0.116130537 container exec_died 
857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team) Feb 1 03:17:41 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:17:42 localhost systemd[1]: tmp-crun.KeSZy4.mount: Deactivated successfully. 
Feb 1 03:17:42 localhost podman[77528]: 2026-02-01 08:17:42.047549753 +0000 UTC m=+0.181928390 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64) Feb 1 03:17:42 localhost podman[77528]: 2026-02-01 08:17:42.064087795 +0000 UTC m=+0.198466442 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510) Feb 1 03:17:42 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:17:42 localhost podman[77529]: 2026-02-01 08:17:42.066963543 +0000 UTC m=+0.196009668 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi) Feb 1 03:17:42 localhost python3[77531]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 03:17:42 localhost systemd[1]: Reloading. Feb 1 03:17:42 localhost podman[77529]: 2026-02-01 08:17:42.150507096 +0000 UTC m=+0.279553251 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:17:42 localhost systemd-rc-local-generator[77622]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:17:42 localhost systemd-sysv-generator[77628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:17:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:17:42 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:17:43 localhost python3[77649]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:17:43 localhost systemd[1]: Reloading. Feb 1 03:17:43 localhost systemd-rc-local-generator[77676]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:17:43 localhost systemd-sysv-generator[77679]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:17:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:17:43 localhost systemd[1]: Starting nova_compute container... Feb 1 03:17:43 localhost tripleo-start-podman-container[77689]: Creating additional drop-in dependency for "nova_compute" (5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032) Feb 1 03:17:43 localhost systemd[1]: Reloading. Feb 1 03:17:44 localhost systemd-rc-local-generator[77744]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:17:44 localhost systemd-sysv-generator[77750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:17:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:17:44 localhost systemd[1]: Started nova_compute container. Feb 1 03:17:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:17:44 localhost podman[77757]: 2026-02-01 08:17:44.467696018 +0000 UTC m=+0.063946467 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Feb 1 03:17:44 localhost python3[77809]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:17:44 localhost podman[77757]: 2026-02-01 08:17:44.817441475 +0000 UTC m=+0.413691984 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_migration_target, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:17:44 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. 
Feb 1 03:17:46 localhost python3[77930]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005604212 step=5 update_config_hash_only=False Feb 1 03:17:46 localhost python3[77946]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 03:17:47 localhost python3[77962]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 1 03:17:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:17:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:17:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:17:48 localhost podman[77964]: 2026-02-01 08:17:48.725719505 +0000 UTC m=+0.082016138 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13) Feb 1 03:17:48 localhost systemd[1]: tmp-crun.0SmAFX.mount: Deactivated successfully. Feb 1 03:17:48 localhost podman[77965]: 2026-02-01 08:17:48.794122048 +0000 UTC m=+0.144411838 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public) Feb 1 03:17:48 localhost podman[77964]: 2026-02-01 08:17:48.818954394 +0000 UTC m=+0.175251047 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:17:48 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:17:48 localhost podman[77963]: 2026-02-01 08:17:48.842017516 +0000 UTC m=+0.202717972 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:14Z, release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Feb 1 03:17:48 localhost podman[77965]: 2026-02-01 08:17:48.847445641 +0000 UTC m=+0.197735441 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:17:48 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:17:49 localhost podman[77963]: 2026-02-01 08:17:49.050739119 +0000 UTC m=+0.411439545 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, 
maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:17:49 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:17:51 localhost sshd[78117]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:18:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:18:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:18:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:18:02 localhost podman[78119]: 2026-02-01 08:18:02.731599075 +0000 UTC m=+0.091032081 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
release=1766032510, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 1 03:18:02 localhost podman[78119]: 2026-02-01 08:18:02.766010793 +0000 UTC m=+0.125443809 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd) Feb 1 03:18:02 localhost podman[78121]: 2026-02-01 08:18:02.774345006 +0000 UTC m=+0.130427451 container health_status 
b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 1 03:18:02 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:18:02 localhost podman[78121]: 2026-02-01 08:18:02.785044322 +0000 UTC m=+0.141126677 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:18:02 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:18:02 localhost systemd[1]: tmp-crun.QFy2yY.mount: Deactivated successfully. 
Feb 1 03:18:02 localhost podman[78120]: 2026-02-01 08:18:02.842745429 +0000 UTC m=+0.198261577 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, container_name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:18:02 localhost podman[78120]: 2026-02-01 08:18:02.897489315 +0000 UTC m=+0.253005513 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:18:02 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:18:12 localhost sshd[78184]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:18:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. 
Feb 1 03:18:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:18:12 localhost systemd-logind[759]: New session 33 of user zuul. Feb 1 03:18:12 localhost systemd[1]: Started Session 33 of User zuul. Feb 1 03:18:12 localhost systemd[1]: tmp-crun.NcC76T.mount: Deactivated successfully. Feb 1 03:18:12 localhost podman[78186]: 2026-02-01 08:18:12.384922314 +0000 UTC m=+0.115679134 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public) Feb 1 03:18:12 localhost systemd[1]: tmp-crun.HjiaVP.mount: Deactivated successfully. 
Feb 1 03:18:12 localhost podman[78187]: 2026-02-01 08:18:12.477955134 +0000 UTC m=+0.207104316 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron) Feb 1 03:18:12 localhost podman[78187]: 2026-02-01 08:18:12.484825882 +0000 UTC m=+0.213975044 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 1 03:18:12 localhost podman[78186]: 2026-02-01 08:18:12.494779241 +0000 UTC m=+0.225536001 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:18:12 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:18:12 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:18:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:18:12 localhost podman[78274]: 2026-02-01 08:18:12.586562054 +0000 UTC m=+0.056825391 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:18:12 localhost podman[78274]: 2026-02-01 08:18:12.611411713 +0000 UTC m=+0.081675060 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:18:12 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:18:13 localhost python3[78365]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 03:18:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:18:15 localhost systemd[1]: tmp-crun.U2XV0p.mount: Deactivated successfully. 
Feb 1 03:18:15 localhost podman[78503]: 2026-02-01 08:18:15.730742711 +0000 UTC m=+0.091271948 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target) Feb 1 03:18:16 localhost podman[78503]: 2026-02-01 08:18:16.106878956 +0000 UTC m=+0.467408193 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, container_name=nova_migration_target, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 1 03:18:16 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:18:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:18:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:18:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:18:19 localhost podman[78576]: 2026-02-01 08:18:19.73163936 +0000 UTC m=+0.076740381 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public) Feb 1 03:18:19 localhost systemd[1]: tmp-crun.5SeijZ.mount: Deactivated successfully. 
Feb 1 03:18:19 localhost podman[78575]: 2026-02-01 08:18:19.796667489 +0000 UTC m=+0.142060968 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=) Feb 1 03:18:19 localhost podman[78576]: 2026-02-01 08:18:19.80170243 +0000 UTC m=+0.146803481 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, 
io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc.) Feb 1 03:18:19 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:18:19 localhost podman[78577]: 2026-02-01 08:18:19.712461874 +0000 UTC m=+0.058728669 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 1 03:18:19 localhost podman[78577]: 2026-02-01 08:18:19.845305953 +0000 UTC m=+0.191572688 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=ovn_controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T22:36:40Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:18:19 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:18:20 localhost podman[78575]: 2026-02-01 08:18:20.023382264 +0000 UTC m=+0.368775743 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc.) Feb 1 03:18:20 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:18:20 localhost python3[78728]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Feb 1 03:18:27 localhost python3[78821]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Feb 1 03:18:27 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Feb 1 03:18:27 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. Feb 1 03:18:27 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 03:18:27 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 03:18:27 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 03:18:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:18:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:18:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:18:33 localhost podman[78890]: 2026-02-01 08:18:33.716244083 +0000 UTC m=+0.074042729 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, config_id=tripleo_step5, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com) Feb 1 03:18:33 localhost podman[78891]: 2026-02-01 08:18:33.73306859 +0000 UTC m=+0.082871345 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, architecture=x86_64, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:18:33 localhost podman[78891]: 2026-02-01 08:18:33.769317641 +0000 UTC m=+0.119120386 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:18:33 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:18:33 localhost podman[78890]: 2026-02-01 08:18:33.801099879 +0000 UTC m=+0.158898485 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_id=tripleo_step5, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z) Feb 1 03:18:33 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:18:33 localhost podman[78889]: 2026-02-01 08:18:33.775534299 +0000 UTC m=+0.131086827 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, container_name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z) Feb 1 03:18:33 localhost podman[78889]: 2026-02-01 08:18:33.859685233 +0000 UTC m=+0.215237771 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=collectd, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Feb 1 03:18:33 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:18:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:18:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:18:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:18:42 localhost podman[78952]: 2026-02-01 08:18:42.738104401 +0000 UTC m=+0.090437163 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
managed_by=tripleo_ansible) Feb 1 03:18:42 localhost podman[78954]: 2026-02-01 08:18:42.782082355 +0000 UTC m=+0.129990375 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git) Feb 1 03:18:42 localhost podman[78952]: 2026-02-01 08:18:42.797397646 +0000 UTC m=+0.149730368 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:18:42 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
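[editor's note] Every health_status/exec_died event in this section carries the container's full label set as one `key=value, ...` blob. Values such as `vendor=Red Hat, Inc.` and the embedded config_data dict contain commas themselves, so a naive split on `, ` mangles them. The lookahead-based splitter below is a heuristic of my own, not anything podman provides:

```python
import re

def parse_podman_labels(blob: str) -> dict[str, str]:
    """Split a podman event label blob into a dict.

    Splits only on ', ' immediately followed by something that looks
    like a new 'key=' token, so commas inside values (e.g.
    'vendor=Red Hat, Inc.' or the config_data dict) are left alone.
    Heuristic: a value that itself contained ', somekey=' would still
    be split incorrectly.
    """
    fields = re.split(r", (?=[\w.\-]+=)", blob)
    labels = {}
    for field in fields:
        key, _, value = field.partition("=")
        labels[key] = value
    return labels

# Example with fragments taken from the ceilometer_agent_compute event above:
blob = ("name=ceilometer_agent_compute, vendor=Red Hat, Inc., "
        "version=17.1.13, config_id=tripleo_step4")
print(parse_podman_labels(blob)["vendor"])  # -> 'Red Hat, Inc.'
```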
Feb 1 03:18:42 localhost podman[78954]: 2026-02-01 08:18:42.811518151 +0000 UTC m=+0.159426161 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, url=https://www.redhat.com) Feb 1 03:18:42 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:18:42 localhost podman[78953]: 2026-02-01 08:18:42.892573691 +0000 UTC m=+0.240395868 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, release=1766032510, version=17.1.13, distribution-scope=public, container_name=logrotate_crond, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:18:42 localhost podman[78953]: 2026-02-01 08:18:42.92541642 +0000 UTC m=+0.273238567 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true) Feb 1 03:18:42 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:18:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
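[editor's note] Unlike the flat labels, the config_data field in these events is printed as a Python dict literal (the container definition TripleO rendered), so it round-trips with ast.literal_eval once sliced out of the blob. A minimal sketch using a trimmed copy of the logrotate_crond definition above:

```python
import ast

# config_data is a valid Python literal: single-quoted strings, True/False,
# nested dicts and lists. This is a shortened excerpt of the real value.
config_data = ast.literal_eval(
    "{'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, "
    "'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', "
    "'net': 'none', 'pid': 'host', 'privileged': True, 'user': 'root'}"
)
print(config_data["image"], config_data["net"])  # -> ...openstack-cron:17.1 none
```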
Feb 1 03:18:46 localhost podman[79023]: 2026-02-01 08:18:46.724903965 +0000 UTC m=+0.088353810 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1766032510) Feb 1 03:18:47 localhost podman[79023]: 2026-02-01 08:18:47.065091697 +0000 UTC m=+0.428541472 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.13, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:18:47 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:18:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
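[editor's note] Each "Started /usr/bin/podman healthcheck run <ID>" line is systemd firing a transient per-container unit named after the full container ID; on RHEL 9 podman drives healthchecks through transient systemd timer/service pairs rather than an internal scheduler, which is also why every run ends with "<ID>.service: Deactivated successfully." A sketch for listing those timers on such a host (the 64-hex-digit filter is just a heuristic for container-ID units):

```python
import re
import subprocess

# List systemd timers whose unit name looks like a full container ID,
# which is how podman names its transient healthcheck timers.
out = subprocess.run(
    ["systemctl", "list-timers", "--all", "--no-pager"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if re.search(r"\b[0-9a-f]{64}\.timer\b", line):
        print(line)
```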
Feb 1 03:18:50 localhost podman[79047]: 2026-02-01 08:18:50.729980489 +0000 UTC m=+0.080634148 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:18:50 localhost podman[79046]: 2026-02-01 08:18:50.784599073 +0000 UTC m=+0.137926613 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, 
url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510) Feb 1 03:18:50 localhost podman[79047]: 2026-02-01 08:18:50.808569075 +0000 UTC m=+0.159222754 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 1 03:18:50 localhost podman[79048]: 2026-02-01 08:18:50.838057853 +0000 UTC m=+0.182271599 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, container_name=ovn_controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:18:50 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:18:50 localhost podman[79048]: 2026-02-01 08:18:50.887380968 +0000 UTC m=+0.231594654 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:18:50 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
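[editor's note] Note that some healthcheck tests take a port argument: ovn_controller runs '/openstack/healthcheck 6642' (the OVN southbound DB port) and nova_compute '/openstack/healthcheck 5672' (AMQP). As I read it, the TripleO healthcheck script uses that port to verify the service's connection; a rough standalone analogue is a plain TCP probe. Host, port, and timeout below are illustrative assumptions, not what the script literally does:

```python
import socket

def tcp_probe(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# 6642 is the port the ovn_controller healthcheck references; probing
# localhost is an assumption made for illustration.
print(tcp_probe("127.0.0.1", 6642))
```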
Feb 1 03:18:50 localhost podman[79046]: 2026-02-01 08:18:50.977593454 +0000 UTC m=+0.330921014 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git) Feb 1 03:18:50 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:18:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:18:59 localhost recover_tripleo_nova_virtqemud[79199]: 61284 Feb 1 03:18:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:18:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:19:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:19:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:19:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
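[editor's note] The tripleo_nova_virtqemud_recover oneshot above logs a bare number (61284 here), which I take to be the PID its check found for virtqemud; the unit name suggests it restarts the daemon when that check fails. A sketch of one way to validate such a PID against the expected process name via /proc — both the PID interpretation and the 'virtqemud' name check are my assumptions:

```python
from pathlib import Path

def pid_matches(pid: int, expected: str) -> bool:
    """Check whether /proc/<pid> exists and its comm matches expected."""
    comm = Path(f"/proc/{pid}/comm")
    try:
        return comm.read_text().strip() == expected
    except FileNotFoundError:
        return False

# 61284 is the number logged by recover_tripleo_nova_virtqemud above.
print(pid_matches(61284, "virtqemud"))
```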
Feb 1 03:19:04 localhost podman[79202]: 2026-02-01 08:19:04.706673673 +0000 UTC m=+0.062214733 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 1 03:19:04 localhost systemd[1]: tmp-crun.igWmSq.mount: Deactivated successfully. 
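[editor's note] The same three healthcheck units (collectd, nova_compute, iscsid) that ran at 03:18:33 are started again at 03:19:04, roughly 31 s later, consistent with podman's default 30 s healthcheck interval plus scheduling jitter. A small sketch checking this from the date-time portion of the two iscsid podman timestamps above:

```python
from datetime import datetime

def parse_podman_ts(ts: str) -> datetime:
    # podman prints up to nanosecond precision; strptime's %f only takes
    # microseconds, so trim the fractional part to six digits first.
    head, frac = ts.split(".")
    return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f")

# iscsid health_status timestamps from the two rounds in this section
t1 = parse_podman_ts("2026-02-01 08:18:33.73306859")
t2 = parse_podman_ts("2026-02-01 08:19:04.706673673")
print((t2 - t1).total_seconds())  # ~31.0 s: a ~30 s interval plus jitter
```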
Feb 1 03:19:04 localhost podman[79201]: 2026-02-01 08:19:04.721419997 +0000 UTC m=+0.076039840 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, container_name=nova_compute, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 1 03:19:04 localhost podman[79202]: 2026-02-01 08:19:04.743258895 +0000 UTC m=+0.098799955 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 1 03:19:04 localhost podman[79201]: 2026-02-01 08:19:04.749377909 +0000 UTC m=+0.103997772 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 1 03:19:04 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:19:04 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:19:04 localhost systemd[1]: tmp-crun.FGlnY7.mount: Deactivated successfully.
Feb 1 03:19:04 localhost podman[79200]: 2026-02-01 08:19:04.82181708 +0000 UTC m=+0.176523645 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 1 03:19:04 localhost podman[79200]: 2026-02-01 08:19:04.858554116 +0000 UTC m=+0.213260731 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step3)
Feb 1 03:19:04 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:19:13 localhost sshd[79265]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:19:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:19:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:19:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:19:13 localhost podman[79268]: 2026-02-01 08:19:13.726162119 +0000 UTC m=+0.077928037 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, version=17.1.13)
Feb 1 03:19:13 localhost podman[79267]: 2026-02-01 08:19:13.79129209 +0000 UTC m=+0.146673227 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 1 03:19:13 localhost podman[79268]: 2026-02-01 08:19:13.812128077 +0000 UTC m=+0.163893975 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510)
Feb 1 03:19:13 localhost podman[79269]: 2026-02-01 08:19:13.835871622 +0000 UTC m=+0.185435473 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64)
Feb 1 03:19:13 localhost podman[79267]: 2026-02-01 08:19:13.844958365 +0000 UTC m=+0.200339462 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 1 03:19:13 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:19:13 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:19:13 localhost podman[79269]: 2026-02-01 08:19:13.889107075 +0000 UTC m=+0.238670876 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 1 03:19:13 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:19:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:19:17 localhost podman[79342]: 2026-02-01 08:19:17.721998104 +0000 UTC m=+0.080817264 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64)
Feb 1 03:19:18 localhost podman[79342]: 2026-02-01 08:19:18.099569091 +0000 UTC m=+0.458388301 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_migration_target, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:19:18 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:19:21 localhost podman[79366]: 2026-02-01 08:19:21.732161733 +0000 UTC m=+0.086117243 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, release=1766032510, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 1 03:19:21 localhost podman[79366]: 2026-02-01 08:19:21.785040105 +0000 UTC m=+0.138995645 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:19:21 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:19:21 localhost podman[79365]: 2026-02-01 08:19:21.786814549 +0000 UTC m=+0.145015357 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 1 03:19:21 localhost systemd[1]: tmp-crun.mGc8qG.mount: Deactivated successfully.
Feb 1 03:19:21 localhost podman[79367]: 2026-02-01 08:19:21.847816036 +0000 UTC m=+0.198465036 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:19:21 localhost podman[79367]: 2026-02-01 08:19:21.899432059 +0000 UTC m=+0.250081059 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510) Feb 1 03:19:21 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:19:21 localhost podman[79365]: 2026-02-01 08:19:21.999175063 +0000 UTC m=+0.357375871 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 1 03:19:22 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:19:27 localhost systemd[1]: session-33.scope: Deactivated successfully. Feb 1 03:19:27 localhost systemd[1]: session-33.scope: Consumed 6.245s CPU time. Feb 1 03:19:27 localhost systemd-logind[759]: Session 33 logged out. Waiting for processes to exit. Feb 1 03:19:27 localhost systemd-logind[759]: Removed session 33. Feb 1 03:19:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:19:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:19:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:19:35 localhost systemd[1]: tmp-crun.Dflwqx.mount: Deactivated successfully. Feb 1 03:19:35 localhost podman[79483]: 2026-02-01 08:19:35.735762317 +0000 UTC m=+0.082529165 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step5, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510) Feb 1 03:19:35 localhost systemd[1]: tmp-crun.40F5SY.mount: Deactivated successfully. Feb 1 03:19:35 localhost podman[79482]: 2026-02-01 08:19:35.784869186 +0000 UTC m=+0.134260523 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:19:35 localhost podman[79482]: 2026-02-01 08:19:35.797518736 +0000 UTC m=+0.146910113 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd) Feb 1 03:19:35 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
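The block above shows the recurring healthcheck pattern on this node: systemd starts a transient /usr/bin/podman healthcheck run unit, podman reports a container health_status event (healthy here), the exec session exits (exec_died), and the transient service deactivates. A minimal sketch for pulling those results out of a log shaped like this one; the regex is written against the exact field order shown here (timestamp, +0000 UTC offset, monotonic m= stamp, event, container ID, then image/name/health_status labels):

    import re

    # Pull container name and health result out of the podman
    # health_status records above. Written against the exact field
    # order shown in this log; adjust for other layouts.
    HEALTH_RE = re.compile(
        r"podman\[\d+\]: (?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
        r"\+0000 UTC m=\S+ container health_status (?P<cid>[0-9a-f]{64}) "
        r"\(image=[^,]+, name=(?P<name>[^,]+), health_status=(?P<status>[^,)]+)"
    )

    def parse_health_events(lines):
        for line in lines:
            m = HEALTH_RE.search(line)
            if m:
                yield m.group("ts"), m.group("name"), m.group("status")

    # Usage:
    # with open("/var/log/messages") as f:
    #     for ts, name, status in parse_health_events(f):
    #         print(ts, name, status)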
Feb 1 03:19:35 localhost podman[79484]: 2026-02-01 08:19:35.836615503 +0000 UTC m=+0.177241056 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team) Feb 1 03:19:35 localhost podman[79483]: 2026-02-01 08:19:35.862563125 +0000 UTC m=+0.209329963 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Feb 1 03:19:35 localhost podman[79484]: 2026-02-01 08:19:35.873236236 +0000 UTC m=+0.213861769 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3) Feb 1 03:19:35 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:19:35 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:19:36 localhost sshd[79546]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:19:36 localhost systemd-logind[759]: New session 34 of user zuul. Feb 1 03:19:36 localhost systemd[1]: Started Session 34 of User zuul. Feb 1 03:19:36 localhost python3[79565]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 1 03:19:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:19:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:19:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:19:44 localhost systemd[1]: tmp-crun.XMA0AJ.mount: Deactivated successfully. 
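The ansible-ansible.legacy.dnf record above is Ansible logging the fully resolved arguments of a dnf task; with state=present the module only installs when the package is missing. A rough out-of-band equivalent, assuming rpm and dnf are on the host (the package name is taken from the record; the approach is an illustration, not how the Ansible module is implemented):

    import subprocess

    # Sketch of the logged task's effect (state=present): install
    # systemd-container only if it is not already on the system.
    def ensure_installed(pkg):
        present = subprocess.run(
            ["rpm", "-q", pkg], stdout=subprocess.DEVNULL
        ).returncode == 0  # rpm -q exits non-zero when missing
        if not present:
            subprocess.run(["dnf", "-y", "install", pkg], check=True)

    ensure_installed("systemd-container")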
Feb 1 03:19:44 localhost podman[79568]: 2026-02-01 08:19:44.745530691 +0000 UTC m=+0.096258989 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:19:44 localhost podman[79569]: 2026-02-01 08:19:44.775151563 +0000 UTC m=+0.123764157 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64) Feb 1 03:19:44 localhost podman[79568]: 2026-02-01 08:19:44.778213295 +0000 UTC m=+0.128941593 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:19:44 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:19:44 localhost podman[79569]: 2026-02-01 08:19:44.834315264 +0000 UTC m=+0.182927858 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:19:44 localhost systemd[1]: 
96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:19:44 localhost podman[79567]: 2026-02-01 08:19:44.83850336 +0000 UTC m=+0.192189227 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute) Feb 1 03:19:44 localhost podman[79567]: 2026-02-01 08:19:44.922594242 +0000 UTC m=+0.276280169 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, release=1766032510, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, distribution-scope=public, 
name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:19:44 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:19:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:19:48 localhost systemd[1]: tmp-crun.Cr82zS.mount: Deactivated successfully. 
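Besides health_status, every event line dumps the container's image labels (version, vcs-ref, build-date, config_id and so on) plus the config_data it was created from. The same metadata can be read back from podman directly instead of being scraped from the journal; a sketch, assuming the container names used above are still running on this host:

    import json
    import subprocess

    # Read a container's labels straight from podman. `podman inspect`
    # prints a JSON array with one entry per container; labels live
    # under Config.Labels.
    def container_labels(name):
        out = subprocess.run(
            ["podman", "inspect", name],
            capture_output=True, check=True, text=True,
        ).stdout
        return json.loads(out)[0]["Config"]["Labels"]

    labels = container_labels("ceilometer_agent_compute")
    print(labels.get("config_id"), labels.get("build-date"))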
Feb 1 03:19:48 localhost podman[79641]: 2026-02-01 08:19:48.719517201 +0000 UTC m=+0.082344910 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:19:49 localhost podman[79641]: 2026-02-01 08:19:49.095233912 +0000 UTC m=+0.458061621 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, 
name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:19:49 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:19:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:19:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:19:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
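Each "Started /usr/bin/podman healthcheck run <id>" line is a systemd transient unit, and the matching "<id>.service: Deactivated successfully." line closes it; the test command itself comes from the container's config_data healthcheck entry (for example '/openstack/healthcheck 5672' for nova_compute). Since those units are named after the full 64-hex container ID, a small sketch, assuming podman is available on the host, to translate the bare IDs back into container names:

    import json
    import subprocess

    # Build an ID -> name map so the `<64-hex>.service` systemd lines
    # above become readable. `podman ps --format json` lists running
    # containers with their full Id and Names.
    out = subprocess.run(
        ["podman", "ps", "--format", "json"],
        capture_output=True, check=True, text=True,
    ).stdout
    id_to_name = {c["Id"]: c["Names"][0] for c in json.loads(out)}

    print(id_to_name.get(
        "5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7"))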
Feb 1 03:19:52 localhost podman[79664]: 2026-02-01 08:19:52.734070501 +0000 UTC m=+0.093099013 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:19:52 localhost podman[79666]: 2026-02-01 08:19:52.783917211 +0000 UTC m=+0.143385996 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, 
container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:19:52 localhost podman[79666]: 2026-02-01 08:19:52.818955316 +0000 UTC m=+0.178424101 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
config_id=tripleo_step4, managed_by=tripleo_ansible) Feb 1 03:19:52 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:19:52 localhost podman[79665]: 2026-02-01 08:19:52.845946388 +0000 UTC m=+0.204792745 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Feb 1 03:19:52 localhost podman[79665]: 2026-02-01 08:19:52.889627514 +0000 UTC m=+0.248473871 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:19:52 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
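The volume lists in config_data use podman's host:container[:options] syntax: ro mounts read-only, z asks podman to SELinux-relabel the content for sharing, and shared sets bind-mount propagation, hence combinations like shared,z on /run/openvswitch above. A tiny parser for specs in exactly that shape:

    # config_data volume entries are "host:container[:options]" strings;
    # options are comma-joined (ro, z, shared, rw, ...). No path in this
    # log contains a colon, which this simple split assumes.
    def parse_volume(spec):
        parts = spec.split(":")
        opts = parts[2].split(",") if len(parts) > 2 else []
        return parts[0], parts[1], opts

    print(parse_volume("/run/openvswitch:/run/openvswitch:shared,z"))
    # ('/run/openvswitch', '/run/openvswitch', ['shared', 'z'])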
Feb 1 03:19:52 localhost podman[79664]: 2026-02-01 08:19:52.951530538 +0000 UTC m=+0.310559080 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-type=git, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 1 03:19:52 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:19:53 localhost systemd[1]: tmp-crun.YmPFBC.mount: Deactivated successfully. 
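The TRIPLEO_CONFIG_HASH value in each config_data environment is how the deployment tooling detects configuration drift: when the hash of the generated config changes, the container is recreated (nova_compute even carries two hashes joined with a dash, one per config source). A minimal sketch of that comparison, using the metrics_qdr hash from the records above:

    # Compare the TRIPLEO_CONFIG_HASH carried in a container's
    # config_data environment across two snapshots; a changed hash is
    # the usual "recreate this container" signal.
    def config_hash(config_data):
        return config_data.get("environment", {}).get("TRIPLEO_CONFIG_HASH")

    before = {"environment": {
        "TRIPLEO_CONFIG_HASH": "60dc9caaeb1b9ec4a6ba094f0fd24dbd"}}
    after = dict(before)  # e.g. re-read after a redeploy
    print("recreate needed:", config_hash(before) != config_hash(after))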
Feb 1 03:20:04 localhost python3[79829]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 1 03:20:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:20:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:20:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:20:06 localhost podman[79832]: 2026-02-01 08:20:06.712426917 +0000 UTC m=+0.064278276 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 1 03:20:06 localhost podman[79833]: 2026-02-01 08:20:06.767171434 +0000 UTC m=+0.116057694 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64)
Feb 1 03:20:06 localhost podman[79833]: 2026-02-01 08:20:06.781257358 +0000 UTC m=+0.130143608 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container)
Feb 1 03:20:06 localhost podman[79832]: 2026-02-01 08:20:06.787656642 +0000 UTC m=+0.139507971 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, tcib_managed=true)
Feb 1 03:20:06 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:20:06 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
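
The 03:20:04 entry above records ansible's dnf module being invoked with name=['sos'] state=latest, i.e. the deploy step ensuring the sos package is current; the journal echoes every module parameter as a key=value pair. A minimal sketch of recovering that parameter set as a dict, assuming (as holds for this line) that no value contains a space; the values are Python literals, so ast.literal_eval decodes lists, booleans, numbers and None, and bare words like latest fall through as strings. LINE below is an abridged copy of the real entry:

    #!/usr/bin/env python3
    # Minimal sketch: parse an ansible "Invoked with" journal line into a dict.
    import ast

    LINE = ("ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest "
            "allow_downgrade=False lock_timeout=30 use_backend=auto conf_file=None")

    def parse_invoked_with(line: str) -> dict:
        args = {}
        _, _, tail = line.partition(" Invoked with ")
        for token in tail.split():           # safe here: no spaces inside values
            key, _, raw = token.partition("=")
            try:
                args[key] = ast.literal_eval(raw)   # ['sos'] -> list, False -> bool
            except (ValueError, SyntaxError):
                args[key] = raw                     # bare words like 'latest'
        return args

    print(parse_invoked_with(LINE))
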
Feb 1 03:20:06 localhost podman[79831]: 2026-02-01 08:20:06.888936531 +0000 UTC m=+0.239655657 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) 
Feb 1 03:20:06 localhost podman[79831]: 2026-02-01 08:20:06.923085749 +0000 UTC m=+0.273804915 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, version=17.1.13, vcs-type=git, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Feb 1 03:20:06 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:20:07 localhost systemd[1]: tmp-crun.hK635S.mount: Deactivated successfully.
Feb 1 03:20:07 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 03:20:07 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 1 03:20:07 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 03:20:08 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 1 03:20:08 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 1 03:20:08 localhost systemd[1]: run-r97f5b63dffdc4405b720dc1d7ce39476.service: Deactivated successfully.
Feb 1 03:20:08 localhost systemd[1]: run-rb649c1b8444f4a549b443c35504c8b47.service: Deactivated successfully.
Feb 1 03:20:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:20:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:20:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:20:15 localhost systemd[1]: tmp-crun.KoMS72.mount: Deactivated successfully.
Feb 1 03:20:15 localhost podman[80047]: 2026-02-01 08:20:15.747608176 +0000 UTC m=+0.098784206 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 1 03:20:15 localhost podman[80048]: 2026-02-01 08:20:15.789398315 +0000 UTC m=+0.136437089 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 1 03:20:15 localhost podman[80049]: 2026-02-01 08:20:15.751393521 +0000 UTC m=+0.095813244 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 1 03:20:15 localhost podman[80048]: 2026-02-01 08:20:15.826519242 +0000 UTC m=+0.173557966 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 1 03:20:15 localhost podman[80049]: 2026-02-01 08:20:15.838552411 +0000 UTC m=+0.182972104 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64)
Feb 1 03:20:15 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
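
Each podman event above embeds the TripleO-generated container definition as a config_data=... field among the image labels, written as a Python-literal dict (single-quoted strings, True/False). A minimal sketch of pulling that dict back out of one event line, assuming (as holds for these entries) that no braces occur inside the quoted values; the sample string below is an abridged, hypothetical stand-in for a real journal line, which in practice would come from journalctl output:

    #!/usr/bin/env python3
    # Minimal sketch: extract the embedded config_data dict from a podman
    # container event line using a brace counter plus ast.literal_eval.
    import ast

    def extract_config_data(event_line: str) -> dict:
        start = event_line.index("config_data=") + len("config_data=")
        depth = 0
        for i, ch in enumerate(event_line[start:], start):
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:
                    return ast.literal_eval(event_line[start:i + 1])
        raise ValueError("unbalanced config_data braces")

    sample = ("container exec_died 93b98f49... (image=..., "
              "config_data={'healthcheck': {'test': '/openstack/healthcheck'}, "
              "'net': 'none', 'privileged': True}, vcs-type=git)")
    print(extract_config_data(sample)["healthcheck"]["test"])
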
Feb 1 03:20:15 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:20:15 localhost podman[80047]: 2026-02-01 08:20:15.882468635 +0000 UTC m=+0.233644715 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 1 03:20:15 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:20:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:20:19 localhost systemd[1]: tmp-crun.0NKhfW.mount: Deactivated successfully.
Feb 1 03:20:19 localhost podman[80118]: 2026-02-01 08:20:19.723629751 +0000 UTC m=+0.086814290 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4)
Feb 1 03:20:20 localhost podman[80118]: 2026-02-01 08:20:20.108060573 +0000 UTC m=+0.471245122 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 1 03:20:20 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:20:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:20:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:20:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:20:23 localhost podman[80141]: 2026-02-01 08:20:23.713301695 +0000 UTC m=+0.072176081 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:20:23 localhost systemd[1]: tmp-crun.mYg9TD.mount: Deactivated successfully.
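
Consecutive healthcheck events for the same container land roughly 31 seconds apart in this excerpt (metrics_qdr at 08:19:52.95 and 08:20:23.71 UTC; nova_compute at 08:20:06.71 and 08:20:37.80), which is consistent with podman's default 30-second healthcheck interval plus scheduling jitter. A minimal sketch of computing that cadence from the podman UTC timestamps visible above, truncated here to microsecond precision for strptime:

    #!/usr/bin/env python3
    # Minimal sketch: measure healthcheck cadence per container from the
    # (timestamp, name) pairs taken from the events in this excerpt.
    from datetime import datetime

    EVENTS = [
        ("2026-02-01 08:19:52.951530", "metrics_qdr"),
        ("2026-02-01 08:20:23.713301", "metrics_qdr"),
        ("2026-02-01 08:20:06.712426", "nova_compute"),
        ("2026-02-01 08:20:37.800435", "nova_compute"),
    ]

    def intervals(events):
        last = {}
        for ts, name in sorted(events):
            t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f")
            if name in last:
                yield name, (t - last[name]).total_seconds()
            last[name] = t

    for name, dt in intervals(EVENTS):
        print(f"{name}: {dt:.1f}s between healthchecks")
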
Feb 1 03:20:23 localhost podman[80142]: 2026-02-01 08:20:23.76835434 +0000 UTC m=+0.122769670 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4)
Feb 1 03:20:23 localhost podman[80142]: 2026-02-01 08:20:23.820171828 +0000 UTC m=+0.174587148 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:20:23 localhost podman[80143]: 2026-02-01 08:20:23.826937014 +0000 UTC m=+0.177656900 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 1 03:20:23 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:20:23 localhost podman[80143]: 2026-02-01 08:20:23.846078351 +0000 UTC m=+0.196798257 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, managed_by=tripleo_ansible, container_name=ovn_controller, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4)
Feb 1 03:20:23 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:20:23 localhost podman[80141]: 2026-02-01 08:20:23.902505988 +0000 UTC m=+0.261380364 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, managed_by=tripleo_ansible)
Feb 1 03:20:23 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:20:24 localhost systemd[1]: tmp-crun.x4YvZn.mount: Deactivated successfully.
Feb 1 03:20:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:20:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
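
Every health_status event in this excerpt reports healthy. A minimal sketch of tallying that across a journal dump, assuming plain-text input such as the output of journalctl piped on stdin; it relies only on the name= and health_status= fields that appear inside the parenthesised label dump of each event, in that order, as they do in all of the entries above:

    #!/usr/bin/env python3
    # Minimal sketch: count health_status results per container across a
    # journal excerpt like this one (read from stdin).
    import re
    import sys
    from collections import Counter

    EVENT = re.compile(
        r"container health_status .*?name=([\w.-]+).*?health_status=(\w+)")

    def tally(lines):
        counts = Counter()
        for line in lines:
            m = EVENT.search(line)
            if m:
                counts[(m.group(1), m.group(2))] += 1
        return counts

    if __name__ == "__main__":
        for (name, status), n in sorted(tally(sys.stdin).items()):
            print(f"{name:25s} {status:10s} x{n}")
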
Feb 1 03:20:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:20:37 localhost podman[80262]: 2026-02-01 08:20:37.726691292 +0000 UTC m=+0.078769912 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 1 03:20:37 localhost podman[80262]: 2026-02-01 08:20:37.734975906 +0000 UTC m=+0.087054526 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 1 03:20:37 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:20:37 localhost systemd[1]: tmp-crun.lTq7Sy.mount: Deactivated successfully.
Feb 1 03:20:37 localhost podman[80260]: 2026-02-01 08:20:37.842979163 +0000 UTC m=+0.202100740 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, architecture=x86_64) Feb 1 03:20:37 localhost podman[80261]: 2026-02-01 08:20:37.80043502 +0000 UTC m=+0.157117752 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container) Feb 1 03:20:37 localhost podman[80260]: 2026-02-01 08:20:37.855376743 +0000 UTC m=+0.214498320 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:20:37 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:20:37 localhost podman[80261]: 2026-02-01 08:20:37.883442532 +0000 UTC m=+0.240125254 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:20:37 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:20:43 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 1 03:20:43 localhost recover_tripleo_nova_virtqemud[80343]: 61284 Feb 1 03:20:43 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:20:43 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:20:43 localhost python3[80341]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 03:20:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 03:20:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 5184 writes, 23K keys, 5184 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5184 writes, 559 syncs, 9.27 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:20:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:20:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:20:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:20:46 localhost systemd[1]: tmp-crun.MMThQB.mount: Deactivated successfully.
Feb 1 03:20:46 localhost podman[80467]: 2026-02-01 08:20:46.751492839 +0000 UTC m=+0.089324126 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:20:46 localhost podman[80465]: 2026-02-01 08:20:46.808036561 +0000 UTC m=+0.148129657 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:20:46 localhost podman[80466]: 2026-02-01 08:20:46.773255565 +0000 UTC m=+0.114439285 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible) Feb 1 03:20:46 localhost podman[80467]: 2026-02-01 08:20:46.835673896 +0000 UTC m=+0.173505173 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:20:46 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:20:46 localhost podman[80466]: 2026-02-01 08:20:46.858450785 +0000 UTC m=+0.199634545 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:20:46 localhost podman[80465]: 2026-02-01 08:20:46.8654866 +0000 UTC m=+0.205579696 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:20:46 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:20:46 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:20:47 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 03:20:47 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 03:20:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 03:20:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.1 total, 600.0 interval
Cumulative writes: 4440 writes, 20K keys, 4440 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4440 writes, 499 syncs, 8.90 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:20:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:20:50 localhost systemd[1]: tmp-crun.RduaRH.mount: Deactivated successfully.
Feb 1 03:20:50 localhost podman[80604]: 2026-02-01 08:20:50.712820544 +0000 UTC m=+0.069950563 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:20:50 localhost sshd[80623]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:20:51 localhost podman[80604]: 2026-02-01 08:20:51.115719121 +0000 UTC m=+0.472849210 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:20:51 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:20:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:20:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:20:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:20:54 localhost systemd[1]: tmp-crun.WaGHad.mount: Deactivated successfully. 
Feb 1 03:20:54 localhost podman[80630]: 2026-02-01 08:20:54.719359786 +0000 UTC m=+0.080031222 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1766032510, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git) Feb 1 03:20:54 localhost podman[80632]: 2026-02-01 08:20:54.724844344 +0000 UTC m=+0.077128524 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, container_name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller) Feb 1 03:20:54 localhost podman[80631]: 2026-02-01 08:20:54.768290983 +0000 UTC m=+0.124664057 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=) Feb 1 03:20:54 localhost podman[80632]: 2026-02-01 08:20:54.798476048 +0000 UTC m=+0.150760298 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=ovn_controller, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:20:54 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:20:54 localhost podman[80631]: 2026-02-01 08:20:54.812643452 +0000 UTC m=+0.169016566 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_metadata_agent, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:20:54 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:20:54 localhost podman[80630]: 2026-02-01 08:20:54.911785727 +0000 UTC m=+0.272457193 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:20:54 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:21:05 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:e4:df:f1 MACPROTO=0800 SRC=82.147.84.55 DST=38.102.83.179 LEN=40 TOS=0x08 PREC=0x20 TTL=242 ID=48772 PROTO=TCP SPT=53998 DPT=9090 SEQ=3459370677 ACK=0 WINDOW=1024 RES=0x00 SYN URGP=0 Feb 1 03:21:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:21:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:21:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:21:08 localhost systemd[1]: tmp-crun.YYN7H4.mount: Deactivated successfully. Feb 1 03:21:08 localhost systemd[1]: tmp-crun.L4TORN.mount: Deactivated successfully. Feb 1 03:21:08 localhost podman[80784]: 2026-02-01 08:21:08.785890522 +0000 UTC m=+0.139578725 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, vcs-type=git, com.redhat.component=openstack-collectd-container, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:21:08 localhost podman[80786]: 2026-02-01 08:21:08.747362272 +0000 UTC m=+0.095169995 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, release=1766032510, architecture=x86_64, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vendor=Red Hat, Inc.) 
Feb 1 03:21:08 localhost podman[80784]: 2026-02-01 08:21:08.819232283 +0000 UTC m=+0.172920416 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, config_id=tripleo_step3) Feb 1 03:21:08 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:21:08 localhost podman[80786]: 2026-02-01 08:21:08.831938182 +0000 UTC m=+0.179745895 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 1 03:21:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:21:08 localhost podman[80785]: 2026-02-01 08:21:08.8743355 +0000 UTC m=+0.226281239 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_id=tripleo_step5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, tcib_managed=true, container_name=nova_compute) Feb 1 03:21:08 localhost podman[80785]: 2026-02-01 08:21:08.929455328 +0000 UTC m=+0.281401007 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:21:08 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:21:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:21:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. 
Feb 1 03:21:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:21:17 localhost podman[80851]: 2026-02-01 08:21:17.742358137 +0000 UTC m=+0.096159575 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc.) 
Feb 1 03:21:17 localhost podman[80851]: 2026-02-01 08:21:17.773368996 +0000 UTC m=+0.127170414 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:47Z) Feb 1 03:21:17 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:21:17 localhost podman[80852]: 2026-02-01 08:21:17.78915356 +0000 UTC m=+0.142564987 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13) Feb 1 03:21:17 localhost podman[80853]: 2026-02-01 08:21:17.826897635 +0000 UTC m=+0.177148425 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi) Feb 1 03:21:17 localhost podman[80852]: 2026-02-01 08:21:17.879894388 +0000 UTC m=+0.233305885 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 1 03:21:17 localhost podman[80853]: 2026-02-01 08:21:17.888684597 +0000 UTC m=+0.238935397 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.expose-services=) Feb 1 03:21:17 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:21:17 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:21:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:21:21 localhost systemd[1]: tmp-crun.OwwmPa.mount: Deactivated successfully. Feb 1 03:21:21 localhost podman[80922]: 2026-02-01 08:21:21.694981133 +0000 UTC m=+0.058802442 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:21:22 localhost podman[80922]: 2026-02-01 08:21:22.010568072 +0000 UTC m=+0.374389371 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:21:22 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:21:25 localhost sshd[80946]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:21:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:21:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:21:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:21:25 localhost podman[80950]: 2026-02-01 08:21:25.72677247 +0000 UTC m=+0.078603714 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc.) 
Feb 1 03:21:25 localhost podman[80950]: 2026-02-01 08:21:25.757300595 +0000 UTC m=+0.109131779 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, container_name=ovn_controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:21:25 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:21:25 localhost systemd[1]: tmp-crun.t9bNhm.mount: Deactivated successfully. 
Feb 1 03:21:25 localhost podman[80948]: 2026-02-01 08:21:25.778321036 +0000 UTC m=+0.132001908 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:21:25 localhost podman[80949]: 2026-02-01 08:21:25.846133905 +0000 UTC m=+0.198827145 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Feb 1 03:21:25 localhost podman[80949]: 2026-02-01 08:21:25.898522417 +0000 UTC m=+0.251215677 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:21:25 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:21:25 localhost podman[80948]: 2026-02-01 08:21:25.998172552 +0000 UTC m=+0.351853374 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1) Feb 1 03:21:26 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:21:26 localhost systemd[1]: tmp-crun.EItBkt.mount: Deactivated successfully. Feb 1 03:21:36 localhost python3[81085]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 03:21:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:21:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
Feb 1 03:21:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:21:39 localhost podman[81207]: 2026-02-01 08:21:39.74483227 +0000 UTC m=+0.095313751 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, release=1766032510, io.openshift.expose-services=) Feb 1 03:21:39 localhost systemd[1]: tmp-crun.O5EBz1.mount: Deactivated successfully. 
Feb 1 03:21:39 localhost podman[81208]: 2026-02-01 08:21:39.79814032 +0000 UTC m=+0.148067855 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:21:39 localhost podman[81207]: 2026-02-01 08:21:39.82980878 +0000 UTC m=+0.180290251 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, 
name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 1 03:21:39 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:21:39 localhost podman[81208]: 2026-02-01 08:21:39.849548172 +0000 UTC m=+0.199475667 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.buildah.version=1.41.5) Feb 1 03:21:39 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:21:39 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 03:21:39 localhost podman[81209]: 2026-02-01 08:21:39.93638989 +0000 UTC m=+0.282593229 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, release=1766032510) Feb 1 03:21:39 localhost podman[81209]: 2026-02-01 08:21:39.974555951 +0000 UTC m=+0.320759350 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, batch=17.1_20260112.1, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:21:39 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:21:39 localhost rhsm-service[6579]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 1 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:21:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:21:48 localhost systemd[1]: tmp-crun.mXORoK.mount: Deactivated successfully. 
Feb 1 03:21:48 localhost podman[81338]: 2026-02-01 08:21:48.73993346 +0000 UTC m=+0.090998258 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com) Feb 1 03:21:48 localhost podman[81336]: 2026-02-01 08:21:48.713251074 +0000 UTC m=+0.073414574 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.expose-services=, release=1766032510, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute) Feb 1 03:21:48 localhost podman[81337]: 2026-02-01 08:21:48.76385364 +0000 UTC m=+0.121624846 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 1 03:21:48 localhost podman[81337]: 2026-02-01 08:21:48.77062675 +0000 UTC m=+0.128397966 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z) Feb 1 03:21:48 
localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:21:48 localhost podman[81336]: 2026-02-01 08:21:48.844029042 +0000 UTC m=+0.204192552 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Feb 1 03:21:48 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:21:48 localhost podman[81338]: 2026-02-01 08:21:48.867473008 +0000 UTC m=+0.218537836 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510) Feb 1 03:21:48 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:21:49 localhost systemd[1]: tmp-crun.Z6przy.mount: Deactivated successfully. Feb 1 03:21:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:21:52 localhost podman[81409]: 2026-02-01 08:21:52.717867239 +0000 UTC m=+0.078147490 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5) Feb 1 03:21:53 localhost podman[81409]: 2026-02-01 08:21:53.11911203 +0000 UTC m=+0.479392271 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, container_name=nova_migration_target, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
version=17.1.13, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:21:53 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:21:56 localhost podman[81432]: 2026-02-01 08:21:56.714692814 +0000 UTC m=+0.073354151 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc.) 
Feb 1 03:21:56 localhost podman[81433]: 2026-02-01 08:21:56.774460404 +0000 UTC m=+0.131022287 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc.) 
Feb 1 03:21:56 localhost podman[81433]: 2026-02-01 08:21:56.823030377 +0000 UTC m=+0.179592210 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Feb 1 03:21:56 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:21:56 localhost podman[81434]: 2026-02-01 08:21:56.832934034 +0000 UTC m=+0.184430580 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:36:40Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5) Feb 1 03:21:56 localhost podman[81432]: 2026-02-01 08:21:56.89447731 +0000 UTC m=+0.253138657 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true) Feb 1 03:21:56 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:21:56 localhost podman[81434]: 2026-02-01 08:21:56.916542973 +0000 UTC m=+0.268039529 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, 
io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64) Feb 1 03:21:56 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:22:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:22:10 localhost systemd[1]: tmp-crun.FiLq4T.mount: Deactivated successfully. Feb 1 03:22:10 localhost podman[81635]: 2026-02-01 08:22:10.738965565 +0000 UTC m=+0.093912278 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, config_id=tripleo_step3, 
com.redhat.component=openstack-collectd-container, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:22:10 localhost podman[81635]: 2026-02-01 08:22:10.750054539 +0000 UTC m=+0.105001312 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3) Feb 1 03:22:10 localhost podman[81637]: 2026-02-01 08:22:10.787540319 +0000 UTC m=+0.137655352 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z) Feb 1 03:22:10 localhost podman[81637]: 2026-02-01 08:22:10.795720012 +0000 UTC m=+0.145835055 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:22:10 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:22:10 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:22:10 localhost podman[81636]: 2026-02-01 08:22:10.938607955 +0000 UTC m=+0.291703661 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13) Feb 1 03:22:10 localhost podman[81636]: 2026-02-01 08:22:10.99464618 +0000 UTC m=+0.347741856 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, release=1766032510, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, maintainer=OpenStack TripleO Team) Feb 1 03:22:11 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:22:11 localhost python3[81715]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Feb 1 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:22:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:22:19 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:22:19 localhost recover_tripleo_nova_virtqemud[81731]: 61284 Feb 1 03:22:19 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:22:19 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:22:19 localhost systemd[1]: tmp-crun.HpxUBY.mount: Deactivated successfully. 
Feb 1 03:22:19 localhost podman[81718]: 2026-02-01 08:22:19.716607597 +0000 UTC m=+0.076264083 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Feb 1 03:22:19 localhost podman[81718]: 2026-02-01 08:22:19.757619325 +0000 UTC m=+0.117275821 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, architecture=x86_64) Feb 1 03:22:19 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:22:19 localhost podman[81719]: 2026-02-01 08:22:19.772249519 +0000 UTC m=+0.125109894 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1) Feb 1 03:22:19 localhost podman[81717]: 2026-02-01 08:22:19.827505299 +0000 UTC m=+0.186550365 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, 
build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:22:19 localhost podman[81717]: 2026-02-01 08:22:19.861267944 +0000 UTC m=+0.220312960 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:22:19 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:22:19 localhost podman[81719]: 2026-02-01 08:22:19.883211613 +0000 UTC m=+0.236071958 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=) Feb 1 03:22:19 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:22:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:22:23 localhost podman[81794]: 2026-02-01 08:22:23.736211136 +0000 UTC m=+0.094336561 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:22:24 localhost podman[81794]: 2026-02-01 08:22:24.127660842 +0000 UTC m=+0.485786267 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git) Feb 1 03:22:24 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:22:27 localhost podman[81819]: 2026-02-01 08:22:27.719511445 +0000 UTC m=+0.069974427 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:22:27 localhost podman[81819]: 2026-02-01 08:22:27.740821418 +0000 UTC m=+0.091284400 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, 
build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 1 03:22:27 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:22:27 localhost podman[81817]: 2026-02-01 08:22:27.781051029 +0000 UTC m=+0.136293119 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 1 03:22:27 localhost podman[81818]: 2026-02-01 08:22:27.826020307 +0000 UTC m=+0.179650247 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64) Feb 1 03:22:27 localhost podman[81818]: 2026-02-01 08:22:27.86307499 +0000 UTC m=+0.216704270 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Feb 1 03:22:27 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:22:27 localhost podman[81817]: 2026-02-01 08:22:27.974482174 +0000 UTC m=+0.329724284 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:22:27 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:22:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:22:41 localhost podman[81936]: 2026-02-01 08:22:41.737525955 +0000 UTC m=+0.089029369 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-collectd, container_name=collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 1 03:22:41 localhost podman[81936]: 2026-02-01 08:22:41.774452334 +0000 UTC m=+0.125955728 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd, distribution-scope=public) Feb 1 03:22:41 localhost systemd[1]: tmp-crun.2EOsY4.mount: Deactivated successfully. 
Feb 1 03:22:41 localhost podman[81938]: 2026-02-01 08:22:41.790833313 +0000 UTC m=+0.135444863 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Feb 1 03:22:41 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:22:41 localhost podman[81938]: 2026-02-01 08:22:41.802346532 +0000 UTC m=+0.146958122 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 1 03:22:41 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:22:41 localhost podman[81937]: 2026-02-01 08:22:41.891050779 +0000 UTC m=+0.238162237 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, container_name=nova_compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:22:41 localhost podman[81937]: 2026-02-01 08:22:41.945526184 +0000 UTC m=+0.292637742 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:22:41 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:22:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:22:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. 
Feb 1 03:22:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:22:50 localhost podman[82003]: 2026-02-01 08:22:50.719877794 +0000 UTC m=+0.075546820 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:22:50 localhost podman[82001]: 2026-02-01 08:22:50.766204764 +0000 UTC m=+0.127835495 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 1 03:22:50 localhost podman[82003]: 2026-02-01 08:22:50.793163353 +0000 UTC m=+0.148832419 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:22:50 localhost podman[82002]: 2026-02-01 08:22:50.833520817 +0000 UTC m=+0.189159232 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510) Feb 1 03:22:50 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:22:50 localhost podman[82002]: 2026-02-01 08:22:50.871693015 +0000 UTC m=+0.227331450 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Feb 1 03:22:50 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:22:50 localhost podman[82001]: 2026-02-01 08:22:50.899314334 +0000 UTC m=+0.260945065 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:22:50 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:22:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:22:54 localhost systemd[1]: tmp-crun.ajW1Su.mount: Deactivated successfully. 
Feb 1 03:22:54 localhost podman[82075]: 2026-02-01 08:22:54.731609924 +0000 UTC m=+0.093278032 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:22:55 localhost podman[82075]: 2026-02-01 08:22:55.10451192 +0000 UTC m=+0.466180018 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=) Feb 1 03:22:55 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:22:58 localhost podman[82096]: 2026-02-01 08:22:58.791721419 +0000 UTC m=+0.149863011 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:22:58 localhost podman[82098]: 2026-02-01 08:22:58.751557531 +0000 UTC m=+0.102412327 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510) Feb 1 03:22:58 localhost podman[82097]: 2026-02-01 08:22:58.716739007 +0000 UTC m=+0.073980071 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 1 03:22:58 localhost podman[82098]: 2026-02-01 08:22:58.832432666 +0000 UTC m=+0.183287422 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc.) Feb 1 03:22:58 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:22:58 localhost podman[82097]: 2026-02-01 08:22:58.85252849 +0000 UTC m=+0.209769514 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, url=https://www.redhat.com) Feb 1 03:22:58 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:22:58 localhost podman[82096]: 2026-02-01 08:22:58.9962711 +0000 UTC m=+0.354412722 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13) Feb 1 03:22:59 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:23:11 localhost systemd[1]: session-34.scope: Deactivated successfully. Feb 1 03:23:11 localhost systemd[1]: session-34.scope: Consumed 20.883s CPU time. Feb 1 03:23:11 localhost systemd-logind[759]: Session 34 logged out. Waiting for processes to exit. Feb 1 03:23:11 localhost systemd-logind[759]: Removed session 34. Feb 1 03:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:23:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:23:12 localhost podman[82249]: 2026-02-01 08:23:12.74356366 +0000 UTC m=+0.091788994 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vendor=Red Hat, Inc.) 
Feb 1 03:23:12 localhost podman[82248]: 2026-02-01 08:23:12.79563144 +0000 UTC m=+0.142750100 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 1 03:23:12 localhost podman[82249]: 2026-02-01 08:23:12.803588327 +0000 UTC m=+0.151813721 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:23:12 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:23:12 localhost podman[82248]: 2026-02-01 08:23:12.859281159 +0000 UTC m=+0.206399559 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z)
Feb 1 03:23:12 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:23:12 localhost podman[82250]: 2026-02-01 08:23:12.951223189 +0000 UTC m=+0.295635165 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 1 03:23:12 localhost podman[82250]: 2026-02-01 08:23:12.985582577 +0000 UTC m=+0.329994563 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 1 03:23:12 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:23:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:23:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:23:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:23:21 localhost systemd[1]: tmp-crun.h7AC7j.mount: Deactivated successfully.
Feb 1 03:23:21 localhost podman[82313]: 2026-02-01 08:23:21.743138905 +0000 UTC m=+0.099703432 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Feb 1 03:23:21 localhost podman[82313]: 2026-02-01 08:23:21.778470604 +0000 UTC m=+0.135035121 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Feb 1 03:23:21 localhost systemd[1]: tmp-crun.ylrNoB.mount: Deactivated successfully.
Feb 1 03:23:21 localhost podman[82314]: 2026-02-01 08:23:21.790734865 +0000 UTC m=+0.143383850 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 1 03:23:21 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:23:21 localhost podman[82314]: 2026-02-01 08:23:21.800308152 +0000 UTC m=+0.152957177 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, distribution-scope=public)
Feb 1 03:23:21 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:23:21 localhost podman[82315]: 2026-02-01 08:23:21.853727973 +0000 UTC m=+0.200213476 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, container_name=ceilometer_agent_ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 1 03:23:21 localhost podman[82315]: 2026-02-01 08:23:21.907721223 +0000 UTC m=+0.254206786 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 1 03:23:21 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:23:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:23:25 localhost podman[82386]: 2026-02-01 08:23:25.703399664 +0000 UTC m=+0.069607565 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 1 03:23:26 localhost podman[82386]: 2026-02-01 08:23:26.095660902 +0000 UTC m=+0.461868763 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.buildah.version=1.41.5)
Feb 1 03:23:26 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:23:28 localhost sshd[82410]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:23:29 localhost podman[82414]: 2026-02-01 08:23:29.735345365 +0000 UTC m=+0.085975790 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true)
Feb 1 03:23:29 localhost podman[82413]: 2026-02-01 08:23:29.779488575 +0000 UTC m=+0.131909875 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:23:29 localhost podman[82414]: 2026-02-01 08:23:29.78324409 +0000 UTC m=+0.133874485 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 1 03:23:29 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:23:29 localhost podman[82413]: 2026-02-01 08:23:29.829555277 +0000 UTC m=+0.181976607 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com)
Feb 1 03:23:29 localhost podman[82412]: 2026-02-01 08:23:29.843121394 +0000 UTC m=+0.195651497 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team)
Feb 1 03:23:29 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:23:30 localhost podman[82412]: 2026-02-01 08:23:30.010112928 +0000 UTC m=+0.362642961 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, config_id=tripleo_step1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, container_name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 1 03:23:30 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:23:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:23:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:23:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:23:43 localhost podman[82535]: 2026-02-01 08:23:43.723817146 +0000 UTC m=+0.079209411 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:23:43 localhost systemd[1]: tmp-crun.1grDQ2.mount: Deactivated successfully. 
Feb 1 03:23:43 localhost podman[82535]: 2026-02-01 08:23:43.764482939 +0000 UTC m=+0.119875254 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:23:43 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
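The config_data label printed in each event above is a Python-literal dict (single quotes, capitalized True/False), so once the substring is isolated it parses with ast.literal_eval. A sketch using a trimmed excerpt of the collectd label from the record above:

import ast

# Trimmed excerpt of collectd's config_data label from the event above.
label_value = ("{'cap_add': ['IPC_LOCK'], "
               "'healthcheck': {'test': '/openstack/healthcheck'}, "
               "'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', "
               "'memory': '512m', 'net': 'host', 'pid': 'host', 'user': 'root'}")

cfg = ast.literal_eval(label_value)   # safe: parses literals only, no eval()
print(cfg["healthcheck"]["test"])     # /openstack/healthcheck
print(cfg["memory"])                  # 512m -- the container's cgroup limit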
Feb 1 03:23:43 localhost podman[82537]: 2026-02-01 08:23:43.752886362 +0000 UTC m=+0.099656851 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, distribution-scope=public, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:23:43 localhost podman[82536]: 2026-02-01 08:23:43.806674799 +0000 UTC m=+0.159433092 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:23:43 localhost podman[82537]: 2026-02-01 08:23:43.837470787 +0000 UTC m=+0.184241236 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, config_id=tripleo_step3, container_name=iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, tcib_managed=true, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:23:43 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:23:43 localhost podman[82536]: 2026-02-01 08:23:43.862202899 +0000 UTC m=+0.214961182 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:23:43 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:23:49 localhost sshd[82600]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:23:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:23:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:23:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
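nova_compute's healthcheck test above is "/openstack/healthcheck 5672", where 5672 is the AMQP port the agent talks to. The real script shipped in the image does more than a bare TCP probe, but a minimal sketch of the port-argument idea looks like this (host, timeout, and exit-code convention are assumptions):

import socket
import sys

def probe(host, port, timeout=5.0):
    # Exit 0 if a TCP connection to the port succeeds, 1 otherwise,
    # mirroring the healthy/unhealthy convention of a healthcheck command.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return 0
    except OSError:
        return 1

sys.exit(probe("127.0.0.1", 5672))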
Feb 1 03:23:52 localhost podman[82602]: 2026-02-01 08:23:52.738589711 +0000 UTC m=+0.093396587 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:23:52 localhost podman[82602]: 2026-02-01 08:23:52.769334518 +0000 UTC m=+0.124141414 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, 
name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, release=1766032510) Feb 1 03:23:52 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
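For orientation, here is a rough sketch of how a config_data dict like ceilometer_agent_compute's (above) maps onto a podman invocation. This is illustrative only; the actual translation is done by tripleo_ansible's container management, and only a few keys are reproduced here:

cfg = {
    "image": "registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1",
    "net": "host",
    "privileged": False,
    "restart": "always",
    "volumes": [
        "/etc/hosts:/etc/hosts:ro",                        # read-only bind
        "/run/libvirt:/run/libvirt:shared,z",              # shared propagation + SELinux relabel
        "/var/log/containers/ceilometer:/var/log/ceilometer:z",
    ],
}

cmd = ["podman", "run", "--detach", "--name", "ceilometer_agent_compute",
       f"--net={cfg['net']}", f"--restart={cfg['restart']}"]
if cfg["privileged"]:
    cmd.append("--privileged")
cmd += [f"--volume={v}" for v in cfg["volumes"]]
cmd.append(cfg["image"])
print(" ".join(cmd))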
Feb 1 03:23:52 localhost podman[82604]: 2026-02-01 08:23:52.789063506 +0000 UTC m=+0.138312011 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:23:52 localhost podman[82604]: 2026-02-01 08:23:52.843891955 +0000 UTC m=+0.193140450 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_ipmi) Feb 1 03:23:52 localhost podman[82603]: 2026-02-01 08:23:52.844126892 +0000 UTC m=+0.198760083 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public) Feb 1 03:23:52 localhost podman[82603]: 2026-02-01 08:23:52.877700936 +0000 UTC m=+0.232334097 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:23:52 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
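Reading the config_data labels seen so far in this log, the healthcheck 'test' commands are uniform except for two cases: nova_compute passes a port argument, and logrotate_crond uses the cron-specific script from openstack-tripleo-common. Collected for reference:

# Healthcheck 'test' commands read off the config_data labels above.
tests = {
    "metrics_qdr": "/openstack/healthcheck",
    "collectd": "/openstack/healthcheck",
    "iscsid": "/openstack/healthcheck",
    "nova_compute": "/openstack/healthcheck 5672",
    "ceilometer_agent_compute": "/openstack/healthcheck",
    "ceilometer_agent_ipmi": "/openstack/healthcheck",
    "logrotate_crond": "/usr/share/openstack-tripleo-common/healthcheck/cron",
}
for name, test in tests.items():
    print(f"{name:28s} {test}")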
Feb 1 03:23:52 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:23:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:23:56 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:23:56 localhost recover_tripleo_nova_virtqemud[82679]: 61284 Feb 1 03:23:56 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:23:56 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:23:56 localhost podman[82677]: 2026-02-01 08:23:56.724763773 +0000 UTC m=+0.083696409 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13) Feb 1 03:23:57 localhost podman[82677]: 2026-02-01 08:23:57.094417099 +0000 UTC m=+0.453349725 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1) Feb 1 03:23:57 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
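The "Check and recover tripleo_nova_virtqemud" unit above finishes in under a second after printing a bare number (82679's child logs "61284", assumed here to be a PID echoed by the checker). The real script ships with tripleo-ansible; the following is only a hypothetical sketch of the check-and-recover pattern, and the pidfile path is an assumption:

import os
import subprocess

def pid_alive(pid):
    try:
        os.kill(pid, 0)          # signal 0: existence test, sends nothing
        return True
    except ProcessLookupError:
        return False
    except PermissionError:
        return True              # process exists but is owned by another user

pid = int(open("/run/virtqemud.pid").read())   # assumed pidfile location
print(pid)                                     # the bare number seen in the log
if not pid_alive(pid):
    subprocess.run(["systemctl", "restart", "tripleo_nova_virtqemud.service"],
                   check=False)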
Feb 1 03:24:00 localhost podman[82703]: 2026-02-01 08:24:00.701575397 +0000 UTC m=+0.060239217 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com) Feb 1 03:24:00 localhost systemd[1]: tmp-crun.xBTZb6.mount: Deactivated successfully. 
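Every podman event in this log follows the same layout: timestamp, monotonic offset, event type, full container ID, then the parenthesized label dump. A sketch that pulls the useful fields out with a regular expression, using a truncated copy of the metrics_qdr line above as the sample:

import re

EVENT = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) \+0000 UTC m=\+\S+ "
    r"container (?P<event>\w+) (?P<cid>[0-9a-f]{64}) "
    r"\(image=(?P<image>[^,]+), name=(?P<name>[^,]+)"
)

line = ("2026-02-01 08:24:00.701575397 +0000 UTC m=+0.060239217 container "
        "health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 "
        "(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, "
        "name=metrics_qdr, health_status=healthy, ...)")  # label dump truncated

m = EVENT.search(line)
print(m["event"], m["name"])   # health_status metrics_qdr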
Feb 1 03:24:00 localhost podman[82704]: 2026-02-01 08:24:00.762971189 +0000 UTC m=+0.126016153 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:24:00 localhost podman[82705]: 2026-02-01 08:24:00.814772824 +0000 UTC m=+0.174250488 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container) Feb 1 03:24:00 localhost podman[82705]: 2026-02-01 08:24:00.833541692 +0000 UTC m=+0.193019306 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 1 03:24:00 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:24:00 localhost podman[82704]: 2026-02-01 08:24:00.869796789 +0000 UTC m=+0.232841813 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 1 03:24:00 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:24:00 localhost podman[82703]: 2026-02-01 08:24:00.902836047 +0000 UTC m=+0.261499867 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:24:00 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:24:01 localhost systemd[1]: tmp-crun.Eeg9fH.mount: Deactivated successfully. 
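With the OVN records now in view, every container in this log carries a config_id label placing it in a TripleO deploy step. Grouped as read off the labels above:

# Deploy-step membership, taken directly from the config_id labels in this log.
steps = {
    "tripleo_step1": ["metrics_qdr"],
    "tripleo_step3": ["collectd", "iscsid"],
    "tripleo_step4": ["ceilometer_agent_compute", "ceilometer_agent_ipmi",
                      "logrotate_crond", "nova_migration_target",
                      "ovn_metadata_agent", "ovn_controller"],
    "tripleo_step5": ["nova_compute"],
}
for step in sorted(steps):
    print(f"{step}: {', '.join(steps[step])}")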
Feb 1 03:24:04 localhost podman[82876]: 2026-02-01 08:24:04.323390198 +0000 UTC m=+0.088470297 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.buildah.version=1.41.4, name=rhceph, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=1764794109, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Feb 1 03:24:04 localhost podman[82876]: 2026-02-01 08:24:04.420600731 +0000 UTC m=+0.185680860 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.openshift.expose-services=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1764794109, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.41.4, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7) Feb 1 03:24:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:24:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:24:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:24:14 localhost systemd[1]: tmp-crun.jwba9U.mount: Deactivated successfully. 
Feb 1 03:24:14 localhost podman[83098]: 2026-02-01 08:24:14.78777272 +0000 UTC m=+0.134929897 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:24:14 localhost podman[83097]: 2026-02-01 08:24:14.758611202 +0000 UTC m=+0.109140443 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:24:14 localhost podman[83098]: 2026-02-01 08:24:14.840629378 +0000 UTC m=+0.187786565 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:24:14 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:24:14 localhost podman[83097]: 2026-02-01 08:24:14.89165969 +0000 UTC m=+0.242188911 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=collectd) Feb 1 03:24:14 localhost podman[83099]: 2026-02-01 08:24:14.902510334 +0000 UTC m=+0.245122002 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 1 03:24:14 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:24:14 localhost podman[83099]: 2026-02-01 08:24:14.943447546 +0000 UTC m=+0.286059214 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:24:14 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:24:20 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 1 03:24:20 localhost systemd[1]: Created slice User Slice of UID 0. Feb 1 03:24:20 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 1 03:24:20 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 1 03:24:20 localhost systemd[1]: Starting User Manager for UID 0... Feb 1 03:24:20 localhost systemd[83465]: Queued start job for default target Main User Target. Feb 1 03:24:20 localhost systemd[83465]: Created slice User Application Slice. Feb 1 03:24:20 localhost systemd[83465]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). 
Feb 1 03:24:20 localhost systemd[83465]: Started Daily Cleanup of User's Temporary Directories. Feb 1 03:24:20 localhost systemd[83465]: Reached target Paths. Feb 1 03:24:20 localhost systemd[83465]: Reached target Timers. Feb 1 03:24:20 localhost systemd[83465]: Starting D-Bus User Message Bus Socket... Feb 1 03:24:20 localhost systemd[83465]: Starting Create User's Volatile Files and Directories... Feb 1 03:24:20 localhost systemd[83465]: Finished Create User's Volatile Files and Directories. Feb 1 03:24:20 localhost systemd[83465]: Listening on D-Bus User Message Bus Socket. Feb 1 03:24:20 localhost systemd[83465]: Reached target Sockets. Feb 1 03:24:20 localhost systemd[83465]: Reached target Basic System. Feb 1 03:24:20 localhost systemd[83465]: Reached target Main User Target. Feb 1 03:24:20 localhost systemd[83465]: Startup finished in 124ms. Feb 1 03:24:20 localhost systemd[1]: Started User Manager for UID 0. Feb 1 03:24:20 localhost systemd[1]: Started Session c11 of User root. Feb 1 03:24:21 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Feb 1 03:24:21 localhost kernel: device tap09cac1be-46 entered promiscuous mode Feb 1 03:24:21 localhost NetworkManager[5964]: <info>  [1769934261.5702] manager: (tap09cac1be-46): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Feb 1 03:24:21 localhost systemd-udevd[83500]: Network interface NamePolicy= disabled on kernel command line. Feb 1 03:24:21 localhost NetworkManager[5964]: <info>  [1769934261.5940] device (tap09cac1be-46): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 1 03:24:21 localhost NetworkManager[5964]: <info>  [1769934261.5949] device (tap09cac1be-46): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Feb 1 03:24:21 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 1 03:24:21 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Feb 1 03:24:21 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Feb 1 03:24:21 localhost systemd-machined[83507]: New machine qemu-1-instance-00000002. Feb 1 03:24:21 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Feb 1 03:24:21 localhost NetworkManager[5964]: <info>  [1769934261.8575] manager: (tap8bdf8183-80): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Feb 1 03:24:21 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8bdf8183-81: link becomes ready Feb 1 03:24:21 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8bdf8183-80: link becomes ready Feb 1 03:24:21 localhost NetworkManager[5964]: <info>  [1769934261.9167] device (tap8bdf8183-80): carrier: link connected Feb 1 03:24:22 localhost kernel: device tap8bdf8183-80 entered promiscuous mode Feb 1 03:24:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:24:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:24:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:24:23 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 1 03:24:23 localhost systemd[1]: tmp-crun.UvHj2V.mount: Deactivated successfully.
Feb 1 03:24:23 localhost podman[83602]: 2026-02-01 08:24:23.482713494 +0000 UTC m=+0.168353796 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:24:23 localhost podman[83601]: 2026-02-01 08:24:23.438649897 +0000 UTC m=+0.124756614 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:24:23 localhost podman[83601]: 2026-02-01 08:24:23.517479845 +0000 UTC m=+0.203586572 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Feb 1 03:24:23 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:24:23 localhost podman[83603]: 2026-02-01 08:24:23.566462874 +0000 UTC m=+0.251588751 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Feb 1 03:24:23 localhost podman[83602]: 2026-02-01 08:24:23.619539199 +0000 UTC m=+0.305179471 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4) Feb 1 03:24:23 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:24:23 localhost podman[83603]: 2026-02-01 08:24:23.673668666 +0000 UTC m=+0.358794573 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.expose-services=, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4) Feb 1 03:24:23 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:24:23 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. 
Feb 1 03:24:23 localhost podman[83699]: 2026-02-01 08:24:23.769752485 +0000 UTC m=+0.085243106 container create 0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:24:23 localhost podman[83699]: 2026-02-01 08:24:23.719978773 +0000 UTC m=+0.035469384 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 1 03:24:23 localhost systemd[1]: Started libpod-conmon-0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94.scope. Feb 1 03:24:23 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Feb 1 03:24:23 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Feb 1 03:24:23 localhost systemd[1]: Started libcrun container. 
Feb 1 03:24:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f4519e1d60938f46fa5b10f3261da193174045579639569b6894dc5164c3378/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 03:24:23 localhost podman[83699]: 2026-02-01 08:24:23.861685507 +0000 UTC m=+0.177176148 container init 0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:24:23 localhost podman[83699]: 2026-02-01 08:24:23.868231298 +0000 UTC m=+0.183721939 container start 0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:24:24 localhost systemd[1]: 
tmp-crun.taZTTo.mount: Deactivated successfully. Feb 1 03:24:24 localhost setroubleshoot[83604]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l 77c42ba3-11b3-418c-ae14-6a879a7ca831 Feb 1 03:24:24 localhost setroubleshoot[83604]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Feb 1 03:24:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:24:27 localhost systemd[1]: tmp-crun.C8NzpZ.mount: Deactivated successfully. Feb 1 03:24:27 localhost podman[83732]: 2026-02-01 08:24:27.760024834 +0000 UTC m=+0.103514859 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, release=1766032510, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13, container_name=nova_migration_target, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 1 03:24:28 localhost podman[83732]: 2026-02-01 08:24:28.151542633 +0000 UTC m=+0.495032618 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=) Feb 1 03:24:28 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:24:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:24:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:24:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:24:31 localhost systemd[1]: tmp-crun.hV7ijT.mount: Deactivated successfully. Feb 1 03:24:31 localhost podman[83756]: 2026-02-01 08:24:31.719287577 +0000 UTC m=+0.079848341 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, container_name=ovn_metadata_agent) Feb 1 03:24:31 localhost podman[83755]: 2026-02-01 08:24:31.761458016 +0000 UTC 
m=+0.124876588 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.13, maintainer=OpenStack TripleO Team) Feb 1 03:24:31 localhost podman[83756]: 2026-02-01 08:24:31.765245412 +0000 UTC m=+0.125806166 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:24:31 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:24:31 localhost podman[83757]: 2026-02-01 08:24:31.809825506 +0000 UTC m=+0.166126028 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 1 03:24:31 localhost podman[83757]: 2026-02-01 08:24:31.836346473 +0000 UTC m=+0.192647085 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z) Feb 1 03:24:31 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:24:31 localhost podman[83755]: 2026-02-01 08:24:31.946163675 +0000 UTC m=+0.309582287 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public) Feb 1 03:24:31 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:24:32 localhost systemd[1]: tmp-crun.a6dRV8.mount: Deactivated successfully. Feb 1 03:24:34 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Feb 1 03:24:34 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.010s CPU time. Feb 1 03:24:34 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Feb 1 03:24:36 localhost snmpd[66800]: empty variable list in _query Feb 1 03:24:36 localhost snmpd[66800]: empty variable list in _query Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34488 [01/Feb/2026:08:24:41.829] listener listener/metadata 0/0/0/1504/1504 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34492 [01/Feb/2026:08:24:43.442] listener listener/metadata 0/0/0/23/23 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34496 [01/Feb/2026:08:24:43.513] listener listener/metadata 0/0/0/14/14 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34504 [01/Feb/2026:08:24:43.572] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34514 [01/Feb/2026:08:24:43.676] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34528 [01/Feb/2026:08:24:43.733] listener listener/metadata 0/0/0/15/15 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34530 [01/Feb/2026:08:24:43.790] listener listener/metadata 0/0/0/15/15 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34540 [01/Feb/2026:08:24:43.866] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Feb 1 03:24:43 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34550 [01/Feb/2026:08:24:43.933] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34558 [01/Feb/2026:08:24:43.998] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34568 
[01/Feb/2026:08:24:44.057] listener listener/metadata 0/0/0/12/12 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34570 [01/Feb/2026:08:24:44.105] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34572 [01/Feb/2026:08:24:44.150] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34588 [01/Feb/2026:08:24:44.193] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34592 [01/Feb/2026:08:24:44.249] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Feb 1 03:24:44 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[83727]: 192.168.0.12:34606 [01/Feb/2026:08:24:44.306] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Feb 1 03:24:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:24:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:24:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:24:45 localhost systemd[1]: tmp-crun.t3fztb.mount: Deactivated successfully. 
Feb 1 03:24:45 localhost podman[83878]: 2026-02-01 08:24:45.72068518 +0000 UTC m=+0.081415369 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:24:45 localhost systemd[1]: tmp-crun.JBAwRh.mount: Deactivated successfully. 
Feb 1 03:24:45 localhost podman[83879]: 2026-02-01 08:24:45.731692749 +0000 UTC m=+0.086836116 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, release=1766032510, tcib_managed=true, container_name=nova_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Feb 1 03:24:45 localhost podman[83880]: 2026-02-01 08:24:45.779178801 +0000 UTC m=+0.130313385 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, container_name=iscsid, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc.) 
Feb 1 03:24:45 localhost podman[83880]: 2026-02-01 08:24:45.783495944 +0000 UTC m=+0.134630558 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z) Feb 1 03:24:45 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:24:45 localhost podman[83879]: 2026-02-01 08:24:45.83432732 +0000 UTC m=+0.189470647 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com) Feb 1 03:24:45 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:24:45 localhost podman[83878]: 2026-02-01 08:24:45.885644241 +0000 UTC m=+0.246374420 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, container_name=collectd, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13) Feb 1 03:24:45 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:24:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. 
Feb 1 03:24:53 localhost podman[83942]: 2026-02-01 08:24:53.737315509 +0000 UTC m=+0.096864525 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:24:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:24:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:24:53 localhost podman[83942]: 2026-02-01 08:24:53.768751887 +0000 UTC m=+0.128300823 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:24:53 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:24:53 localhost podman[83961]: 2026-02-01 08:24:53.848405991 +0000 UTC m=+0.092439799 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64) Feb 1 03:24:53 localhost podman[83961]: 2026-02-01 08:24:53.887875506 +0000 UTC m=+0.131909274 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:24:53 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:24:53 localhost podman[83963]: 2026-02-01 08:24:53.899131563 +0000 UTC m=+0.137041361 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:24:53 localhost podman[83963]: 2026-02-01 08:24:53.926151706 +0000 UTC m=+0.164061484 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 1 03:24:53 localhost systemd[1]: 
96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:24:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:24:58 localhost podman[84014]: 2026-02-01 08:24:58.699104643 +0000 UTC m=+0.065883721 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4)
Feb 1 03:24:59 localhost podman[84014]: 2026-02-01 08:24:59.059519244 +0000 UTC m=+0.426298392 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z)
Feb 1 03:24:59 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:25:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:25:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:25:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:25:02 localhost systemd[1]: tmp-crun.CnKV1e.mount: Deactivated successfully.
Feb 1 03:25:02 localhost podman[84040]: 2026-02-01 08:25:02.725278407 +0000 UTC m=+0.075666121 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1)
Feb 1 03:25:02 localhost podman[84038]: 2026-02-01 08:25:02.74033216 +0000 UTC m=+0.097559545 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, container_name=metrics_qdr, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z)
Feb 1 03:25:02 localhost podman[84039]: 2026-02-01 08:25:02.797257265 +0000 UTC m=+0.148759143 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.13, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:25:02 localhost podman[84040]: 2026-02-01 08:25:02.825572917 +0000 UTC m=+0.175960621 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, container_name=ovn_controller, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 1 03:25:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:25:02 localhost podman[84039]: 2026-02-01 08:25:02.849816794 +0000 UTC m=+0.201318602 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:25:02 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:25:02 localhost podman[84038]: 2026-02-01 08:25:02.922247024 +0000 UTC m=+0.279474419 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=metrics_qdr, config_id=tripleo_step1)
Feb 1 03:25:02 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:25:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:25:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:25:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:25:16 localhost podman[84192]: 2026-02-01 08:25:16.763452292 +0000 UTC m=+0.113750364 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T23:32:04Z, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 1 03:25:16 localhost podman[84192]: 2026-02-01 08:25:16.788629348 +0000 UTC m=+0.138927390 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5)
Feb 1 03:25:16 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:25:16 localhost podman[84191]: 2026-02-01 08:25:16.882581542 +0000 UTC m=+0.234260017 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 1 03:25:16 localhost podman[84191]: 2026-02-01 08:25:16.920133518 +0000 UTC m=+0.271811983 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.5)
Feb 1 03:25:16 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:25:16 localhost podman[84193]: 2026-02-01 08:25:16.947875413 +0000 UTC m=+0.295184343 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 1 03:25:16 localhost podman[84193]: 2026-02-01 08:25:16.963447803 +0000 UTC m=+0.310756693 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 1 03:25:16 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:25:17 localhost systemd[1]: tmp-crun.sIJewb.mount: Deactivated successfully.
Feb 1 03:25:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:25:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:25:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:25:24 localhost podman[84255]: 2026-02-01 08:25:24.707314581 +0000 UTC m=+0.067197841 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4)
Feb 1 03:25:24 localhost podman[84255]: 2026-02-01 08:25:24.729251977 +0000 UTC m=+0.089135247 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 1 03:25:24 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:25:24 localhost podman[84254]: 2026-02-01 08:25:24.777538324 +0000 UTC m=+0.134627868 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.component=openstack-cron-container, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510)
Feb 1 03:25:24 localhost podman[84253]: 2026-02-01 08:25:24.83779373 +0000 UTC m=+0.200142766 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, distribution-scope=public)
Feb 1 03:25:24 localhost podman[84254]: 2026-02-01 08:25:24.861175731 +0000 UTC m=+0.218265305 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 1 03:25:24 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:25:24 localhost podman[84253]: 2026-02-01 08:25:24.899550942 +0000 UTC m=+0.261899998 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1)
Feb 1 03:25:24 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:25:25 localhost systemd[1]: tmp-crun.BDaiZU.mount: Deactivated successfully.
Feb 1 03:25:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:25:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 03:25:29 localhost recover_tripleo_nova_virtqemud[84331]: 61284
Feb 1 03:25:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 03:25:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 03:25:29 localhost systemd[1]: tmp-crun.Ua2irJ.mount: Deactivated successfully.
Feb 1 03:25:29 localhost podman[84325]: 2026-02-01 08:25:29.738869244 +0000 UTC m=+0.100744174 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z)
Feb 1 03:25:30 localhost podman[84325]: 2026-02-01 08:25:30.145435017 +0000 UTC m=+0.507309917 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:25:30 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:25:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:25:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:25:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:25:33 localhost systemd[1]: tmp-crun.hReghs.mount: Deactivated successfully.
Feb 1 03:25:33 localhost podman[84351]: 2026-02-01 08:25:33.737900514 +0000 UTC m=+0.095356788 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:25:33 localhost systemd[1]: tmp-crun.7JvfGx.mount: Deactivated successfully.
Feb 1 03:25:33 localhost podman[84350]: 2026-02-01 08:25:33.796878791 +0000 UTC m=+0.154741938 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13)
Feb 1 03:25:33 localhost podman[84351]: 2026-02-01 08:25:33.821556541 +0000 UTC m=+0.179012845 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn,
managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent) Feb 1 03:25:33 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:25:33 localhost podman[84352]: 2026-02-01 08:25:33.887054549 +0000 UTC m=+0.238167487 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:25:33 localhost podman[84352]: 2026-02-01 08:25:33.938536984 +0000 UTC m=+0.289649862 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
build-date=2026-01-12T22:36:40Z, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:25:33 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:25:34 localhost podman[84350]: 2026-02-01 08:25:34.025451921 +0000 UTC m=+0.383315088 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step1, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:25:34 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:25:35 localhost sshd[84428]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:25:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:25:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:25:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:25:47 localhost podman[84476]: 2026-02-01 08:25:47.699288468 +0000 UTC m=+0.064001896 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 1 03:25:47 localhost systemd[1]: tmp-crun.ZqA541.mount: Deactivated successfully. Feb 1 03:25:47 localhost podman[84476]: 2026-02-01 08:25:47.756875827 +0000 UTC m=+0.121589305 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 
nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Feb 1 03:25:47 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:25:47 localhost podman[84475]: 2026-02-01 08:25:47.760651662 +0000 UTC m=+0.125036810 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, release=1766032510, container_name=collectd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:25:47 localhost podman[84477]: 2026-02-01 08:25:47.81526404 +0000 UTC m=+0.172683754 
container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.13, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 1 03:25:47 localhost podman[84477]: 2026-02-01 08:25:47.826380341 +0000 UTC m=+0.183799965 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, container_name=iscsid) Feb 1 03:25:47 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:25:47 localhost podman[84475]: 2026-02-01 08:25:47.844525128 +0000 UTC m=+0.208910296 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:25:47 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:25:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:25:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:25:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:25:55 localhost podman[84540]: 2026-02-01 08:25:55.777171109 +0000 UTC m=+0.128193288 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:25:55 localhost podman[84540]: 2026-02-01 08:25:55.82475547 +0000 UTC m=+0.175777709 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:25:55 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:25:55 localhost podman[84538]: 2026-02-01 08:25:55.829363491 +0000 UTC m=+0.185507627 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:25:55 localhost podman[84539]: 2026-02-01 08:25:55.889549699 +0000 UTC m=+0.243687183 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, url=https://www.redhat.com) Feb 1 03:25:55 localhost podman[84539]: 2026-02-01 08:25:55.908358047 +0000 UTC m=+0.262495591 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, architecture=x86_64, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Feb 1 03:25:55 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:25:55 localhost podman[84538]: 2026-02-01 08:25:55.965415398 +0000 UTC m=+0.321559494 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, 
url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container) Feb 1 03:25:55 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:26:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:26:00 localhost podman[84610]: 2026-02-01 08:26:00.727911285 +0000 UTC m=+0.087015753 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com) Feb 1 03:26:01 localhost podman[84610]: 2026-02-01 08:26:01.127581667 +0000 UTC m=+0.486686185 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 1 03:26:01 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:26:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:26:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:26:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:26:04 localhost systemd[1]: tmp-crun.iMJYL5.mount: Deactivated successfully. 
Feb 1 03:26:04 localhost podman[84635]: 2026-02-01 08:26:04.725750825 +0000 UTC m=+0.080442572 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 1 03:26:04 localhost systemd[1]: tmp-crun.OutZl2.mount: Deactivated successfully. 
Feb 1 03:26:04 localhost podman[84636]: 2026-02-01 08:26:04.774400288 +0000 UTC m=+0.123026358 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:26:04 localhost podman[84635]: 2026-02-01 08:26:04.797460177 +0000 UTC m=+0.152151954 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:26:04 localhost podman[84634]: 2026-02-01 08:26:04.754835397 +0000 UTC m=+0.109735940 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:26:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:26:04 localhost podman[84636]: 2026-02-01 08:26:04.855439296 +0000 UTC m=+0.204065466 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:26:04 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:26:04 localhost podman[84634]: 2026-02-01 08:26:04.969443697 +0000 UTC m=+0.324344240 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, tcib_managed=true) Feb 1 03:26:04 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:26:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:26:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:26:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:26:18 localhost podman[84788]: 2026-02-01 08:26:18.748452978 +0000 UTC m=+0.101709805 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 1 03:26:18 localhost podman[84788]: 2026-02-01 08:26:18.782959177 +0000 UTC m=+0.136216014 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:26:18 localhost systemd[1]: tmp-crun.T6Zz0i.mount: Deactivated successfully. Feb 1 03:26:18 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:26:18 localhost podman[84787]: 2026-02-01 08:26:18.799124563 +0000 UTC m=+0.152670799 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, release=1766032510, architecture=x86_64, container_name=collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:26:18 localhost podman[84787]: 2026-02-01 08:26:18.834255422 +0000 UTC m=+0.187801638 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red 
Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, com.redhat.component=openstack-collectd-container) Feb 1 03:26:18 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:26:18 localhost podman[84789]: 2026-02-01 08:26:18.837621355 +0000 UTC m=+0.186019752 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:26:18 localhost podman[84789]: 2026-02-01 08:26:18.918194329 +0000 UTC m=+0.266592736 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, version=17.1.13, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:26:18 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:26:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:26:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:26:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:26:26 localhost podman[84853]: 2026-02-01 08:26:26.713699782 +0000 UTC m=+0.072397283 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible) Feb 1 03:26:26 localhost podman[84852]: 2026-02-01 08:26:26.77417442 +0000 UTC m=+0.135802141 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
vcs-type=git, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:26:26 localhost podman[84852]: 2026-02-01 08:26:26.782287818 +0000 UTC m=+0.143915539 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, distribution-scope=public, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z) Feb 1 03:26:26 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:26:26 localhost podman[84851]: 2026-02-01 08:26:26.829645772 +0000 UTC m=+0.191257873 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public) Feb 1 03:26:26 localhost podman[84853]: 2026-02-01 08:26:26.85075341 +0000 UTC m=+0.209451001 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:26:26 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:26:26 localhost podman[84851]: 2026-02-01 08:26:26.890530131 +0000 UTC m=+0.252142252 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ceilometer_agent_compute) Feb 1 03:26:26 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:26:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:26:31 localhost podman[84920]: 2026-02-01 08:26:31.707541349 +0000 UTC m=+0.068393722 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z) Feb 1 03:26:32 localhost podman[84920]: 2026-02-01 08:26:32.06355943 +0000 UTC m=+0.424411813 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 1 03:26:32 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:26:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:26:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:26:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:26:35 localhost systemd[1]: tmp-crun.ARiL67.mount: Deactivated successfully. 
Feb 1 03:26:35 localhost podman[84944]: 2026-02-01 08:26:35.715156076 +0000 UTC m=+0.079533253 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:26:35 localhost podman[84945]: 2026-02-01 08:26:35.777415857 +0000 UTC m=+0.135973465 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 1 03:26:35 localhost podman[84951]: 2026-02-01 08:26:35.817914541 +0000 UTC m=+0.171820456 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 1 03:26:35 localhost podman[84951]: 2026-02-01 08:26:35.840661549 +0000 UTC m=+0.194567274 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=) Feb 1 03:26:35 localhost podman[84945]: 2026-02-01 08:26:35.847283432 +0000 UTC m=+0.205841050 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 1 03:26:35 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:26:35 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:26:35 localhost podman[84944]: 2026-02-01 08:26:35.936499792 +0000 UTC m=+0.300876909 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 1 03:26:35 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:26:49 localhost sshd[85068]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:26:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:26:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:26:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:26:49 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:26:49 localhost recover_tripleo_nova_virtqemud[85089]: 61284 Feb 1 03:26:49 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 1 03:26:49 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:26:49 localhost systemd[1]: tmp-crun.3WY1EN.mount: Deactivated successfully. Feb 1 03:26:49 localhost podman[85073]: 2026-02-01 08:26:49.745239903 +0000 UTC m=+0.094758160 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container) Feb 1 03:26:49 localhost podman[85073]: 2026-02-01 08:26:49.783938741 +0000 UTC m=+0.133456998 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vcs-type=git, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, vendor=Red Hat, Inc.) Feb 1 03:26:49 localhost systemd[1]: tmp-crun.qr0JrN.mount: Deactivated successfully. 
Feb 1 03:26:49 localhost podman[85070]: 2026-02-01 08:26:49.795194647 +0000 UTC m=+0.153536005 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510) Feb 1 03:26:49 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:26:49 localhost podman[85070]: 2026-02-01 08:26:49.805289367 +0000 UTC m=+0.163630715 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:26:49 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:26:49 localhost podman[85071]: 2026-02-01 08:26:49.90471759 +0000 UTC m=+0.254463984 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1766032510, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 1 03:26:49 localhost podman[85071]: 2026-02-01 08:26:49.940332843 +0000 UTC m=+0.290079177 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 1 03:26:49 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:26:51 localhost sshd[85140]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:26:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. 
Feb 1 03:26:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:26:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:26:57 localhost podman[85141]: 2026-02-01 08:26:57.736857395 +0000 UTC m=+0.091766618 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 1 03:26:57 localhost systemd[1]: tmp-crun.WW6yab.mount: Deactivated successfully. 
Feb 1 03:26:57 localhost podman[85143]: 2026-02-01 08:26:57.790970167 +0000 UTC m=+0.139305808 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true) Feb 1 03:26:57 localhost podman[85141]: 2026-02-01 08:26:57.813736477 +0000 UTC m=+0.168645760 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5) Feb 1 03:26:57 localhost podman[85142]: 2026-02-01 08:26:57.829542791 +0000 UTC m=+0.179969116 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, vcs-type=git, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 1 03:26:57 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:26:57 localhost podman[85143]: 2026-02-01 08:26:57.838759005 +0000 UTC m=+0.187094696 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 1 03:26:57 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:26:57 localhost podman[85142]: 2026-02-01 08:26:57.863407591 +0000 UTC m=+0.213833926 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:26:57 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:26:58 localhost systemd[1]: tmp-crun.lDi4pg.mount: Deactivated successfully. Feb 1 03:27:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:27:02 localhost podman[85212]: 2026-02-01 08:27:02.718438317 +0000 UTC m=+0.083915648 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Feb 1 03:27:03 localhost podman[85212]: 2026-02-01 08:27:03.147546033 +0000 UTC m=+0.513023304 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
container_name=nova_migration_target, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:27:03 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:27:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:27:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:27:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:27:06 localhost podman[85235]: 2026-02-01 08:27:06.728216425 +0000 UTC m=+0.089899762 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc.) Feb 1 03:27:06 localhost systemd[1]: tmp-crun.NOC5fJ.mount: Deactivated successfully. 
Feb 1 03:27:06 localhost podman[85242]: 2026-02-01 08:27:06.780579652 +0000 UTC m=+0.129412675 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public) Feb 1 03:27:06 localhost podman[85236]: 2026-02-01 08:27:06.753753688 +0000 UTC m=+0.107913874 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 1 03:27:06 localhost podman[85242]: 2026-02-01 08:27:06.836878541 +0000 UTC m=+0.185711584 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1) Feb 1 03:27:06 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:27:06 localhost podman[85236]: 2026-02-01 08:27:06.884026358 +0000 UTC m=+0.238186614 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) Feb 1 03:27:06 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:27:06 localhost podman[85235]: 2026-02-01 08:27:06.968416959 +0000 UTC m=+0.330100286 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:27:06 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:27:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:27:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:27:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:27:20 localhost podman[85390]: 2026-02-01 08:27:20.75114912 +0000 UTC m=+0.099384092 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, container_name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13) Feb 1 03:27:20 localhost podman[85389]: 2026-02-01 08:27:20.80324405 +0000 UTC m=+0.154332989 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack 
TripleO Team, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) 
Feb 1 03:27:20 localhost podman[85390]: 2026-02-01 08:27:20.815684642 +0000 UTC m=+0.163919654 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container) Feb 1 03:27:20 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:27:20 localhost podman[85388]: 2026-02-01 08:27:20.899246877 +0000 UTC m=+0.252891395 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:27:20 localhost podman[85389]: 2026-02-01 08:27:20.909773061 +0000 UTC m=+0.260861910 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, 
batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 1 03:27:20 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:27:20 localhost podman[85388]: 2026-02-01 08:27:20.934116809 +0000 UTC m=+0.287761307 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vcs-type=git, container_name=collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:27:20 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:27:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:27:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:27:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:27:28 localhost systemd[1]: tmp-crun.KTQuXE.mount: Deactivated successfully. 
Feb 1 03:27:28 localhost podman[85451]: 2026-02-01 08:27:28.742065933 +0000 UTC m=+0.086012182 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:27:28 localhost podman[85451]: 2026-02-01 08:27:28.750378718 +0000 UTC m=+0.094324967 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5) Feb 1 03:27:28 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:27:28 localhost podman[85450]: 2026-02-01 08:27:28.799588119 +0000 UTC m=+0.143178297 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.expose-services=) Feb 1 03:27:28 localhost podman[85450]: 2026-02-01 08:27:28.832297633 +0000 UTC m=+0.175887821 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step4, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 1 03:27:28 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:27:28 localhost podman[85452]: 2026-02-01 08:27:28.850898575 +0000 UTC m=+0.188240461 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 1 03:27:28 localhost podman[85452]: 2026-02-01 08:27:28.910560416 +0000 UTC m=+0.247902292 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13) Feb 1 03:27:28 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:27:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:27:33 localhost systemd[1]: tmp-crun.qC4jbv.mount: Deactivated successfully. 
Feb 1 03:27:33 localhost podman[85523]: 2026-02-01 08:27:33.733303414 +0000 UTC m=+0.093611216 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-type=git, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:27:34 localhost podman[85523]: 2026-02-01 08:27:34.104092258 +0000 UTC m=+0.464399990 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true) Feb 1 03:27:34 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:27:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:27:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:27:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:27:37 localhost podman[85549]: 2026-02-01 08:27:37.731808443 +0000 UTC m=+0.080065350 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13) Feb 1 03:27:37 localhost podman[85548]: 2026-02-01 08:27:37.707167266 +0000 UTC m=+0.063543243 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, container_name=ovn_metadata_agent, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:27:37 localhost podman[85547]: 2026-02-01 08:27:37.763684111 +0000 UTC m=+0.121952085 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step1, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, 
distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team) Feb 1 03:27:37 localhost podman[85548]: 2026-02-01 08:27:37.793307141 +0000 UTC m=+0.149683068 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:27:37 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:27:37 localhost podman[85549]: 2026-02-01 08:27:37.815832743 +0000 UTC m=+0.164089670 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, version=17.1.13, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:27:37 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
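Each check in this stretch is one cycle of the same three-step pattern: systemd starts a transient <container-id>.service wrapping /usr/bin/podman healthcheck run <id>; podman logs health_status once the probe finishes and exec_died when the exec session exits; systemd then reports the unit deactivated. Both podman events come from the same process, so their m=+ offsets share an origin, and pairing them gives the probe's wall time. For the nova_migration_target run at 03:27:33 that is m=+0.464399990 minus m=+0.093611216, roughly 0.37 s. A sketch reusing parse_event() from above:

def probe_durations(events):
    """Pair each health_status with the matching exec_died (same
    container ID) and yield (container_name, seconds) per probe."""
    pending = {}
    for ev in events:
        if ev is None:
            continue
        if ev["event"] == "health_status":
            pending[ev["cid"]] = ev
        elif ev["event"] == "exec_died" and ev["cid"] in pending:
            start = pending.pop(ev["cid"])
            yield ev["name"], ev["mono"] - start["mono"]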
Feb 1 03:27:37 localhost podman[85547]: 2026-02-01 08:27:37.946631569 +0000 UTC m=+0.304899583 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=metrics_qdr, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, version=17.1.13, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:27:37 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:27:38 localhost systemd[1]: tmp-crun.NRETs8.mount: Deactivated successfully. Feb 1 03:27:50 localhost sshd[85666]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:27:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:27:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:27:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:27:51 localhost podman[85669]: 2026-02-01 08:27:51.4234464 +0000 UTC m=+0.091697677 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, architecture=x86_64, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public) Feb 1 03:27:51 localhost podman[85670]: 2026-02-01 08:27:51.471763216 +0000 UTC m=+0.138679722 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Feb 1 03:27:51 localhost podman[85669]: 2026-02-01 08:27:51.478348659 +0000 UTC m=+0.146599886 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=, container_name=nova_compute, vendor=Red Hat, Inc., release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:27:51 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
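The config_data label embedded in every record is a literal Python dict (single quotes, bare True/False), so it can be recovered whole instead of re-parsed field by field: find config_data={, brace-match to the closing brace, and hand the slice to ast.literal_eval. A sketch under the assumption that none of the embedded strings contain braces (which holds for all the records here), plus a helper separating writable bind mounts from read-only ones; on the nova_compute record above it flags /var/lib/nova, /run/nova, /dev and /run as writable while the CA bundles and generated config stay :ro:

import ast

def extract_config(labels_blob):
    """Slice the config_data dict literal out of a label dump."""
    start = labels_blob.index("config_data=") + len("config_data=")
    depth = 0
    for i, ch in enumerate(labels_blob[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return ast.literal_eval(labels_blob[start:i + 1])
    raise ValueError("unbalanced config_data literal")

def writable_mounts(cfg):
    """Bind mounts are host:container[:opt,opt,...]; a mount is
    read-only only if 'ro' appears among its options."""
    for vol in cfg.get("volumes", []):
        parts = vol.split(":")
        opts = parts[2].split(",") if len(parts) > 2 else []
        if "ro" not in opts:
            yield parts[0]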
Feb 1 03:27:51 localhost podman[85670]: 2026-02-01 08:27:51.504119106 +0000 UTC m=+0.171035672 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, release=1766032510, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64) Feb 1 03:27:51 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:27:51 localhost podman[85668]: 2026-02-01 08:27:51.531011508 +0000 UTC m=+0.203553258 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:27:51 localhost podman[85668]: 2026-02-01 08:27:51.541827643 +0000 UTC m=+0.214369473 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, 
distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Feb 1 03:27:51 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:27:52 localhost systemd[1]: tmp-crun.QRf6aG.mount: Deactivated successfully. Feb 1 03:27:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:27:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:27:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
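So far every probe in the capture reports health_status=healthy. The journal only shows what each probe reported at the time; the current state can also be read back from podman itself. A hedged sketch: the inspect field is State.Health on recent podman releases and State.Healthcheck on older ones, so it tries both:

import json
import subprocess

def current_health(container):
    """Return a container's current health status as podman reports it
    ('healthy', 'unhealthy', ...), or None if it has no healthcheck."""
    out = subprocess.run(
        ["podman", "inspect", container],
        check=True, capture_output=True, text=True,
    ).stdout
    state = json.loads(out)[0].get("State", {})
    health = state.get("Health") or state.get("Healthcheck") or {}
    return health.get("Status")

# Given the records above, current_health("collectd") would be
# expected to return 'healthy' on this host.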
Feb 1 03:27:59 localhost podman[85734]: 2026-02-01 08:27:59.739900372 +0000 UTC m=+0.085173486 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, distribution-scope=public) Feb 1 03:27:59 localhost podman[85734]: 2026-02-01 08:27:59.796628676 +0000 UTC m=+0.141901790 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:30Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 1 03:27:59 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
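Since a healthy steady state produces exactly this repetitive traffic, the signal worth extracting is the exception: a probe reporting anything other than healthy. A throwaway filter over a journal export, reusing parse_event() from the first sketch:

import sys

def main():
    for ev in map(parse_event, sys.stdin):
        if ev and ev["event"] == "health_status" and ev["health"] != "healthy":
            print(ev["ts"], ev["name"], ev["health"])

if __name__ == "__main__":
    main()

On this capture it prints nothing, which is the point.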
Feb 1 03:27:59 localhost podman[85732]: 2026-02-01 08:27:59.846215261 +0000 UTC m=+0.196243432 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:27:59 localhost podman[85733]: 2026-02-01 08:27:59.800133065 +0000 UTC m=+0.148692911 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, url=https://www.redhat.com) Feb 1 03:27:59 localhost podman[85733]: 2026-02-01 08:27:59.879060637 +0000 UTC m=+0.227620523 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git) Feb 1 03:27:59 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:27:59 localhost podman[85732]: 2026-02-01 08:27:59.903208003 +0000 UTC m=+0.253236104 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1766032510, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 1 03:27:59 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:28:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:28:04 localhost systemd[1]: tmp-crun.ENFELf.mount: Deactivated successfully. Feb 1 03:28:04 localhost podman[85803]: 2026-02-01 08:28:04.740321032 +0000 UTC m=+0.098824968 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:28:05 localhost podman[85803]: 2026-02-01 08:28:05.13333974 +0000 UTC m=+0.491843686 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, version=17.1.13, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:28:05 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:28:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:28:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:28:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:28:08 localhost podman[85826]: 2026-02-01 08:28:08.730976973 +0000 UTC m=+0.085911949 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:28:08 localhost systemd[1]: tmp-crun.tlmLsU.mount: Deactivated successfully. 
Feb 1 03:28:08 localhost podman[85828]: 2026-02-01 08:28:08.864548935 +0000 UTC m=+0.210642607 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, distribution-scope=public, container_name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:28:08 localhost podman[85827]: 2026-02-01 08:28:08.830349557 +0000 UTC m=+0.182140566 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:28:08 localhost podman[85828]: 2026-02-01 08:28:08.886816394 +0000 UTC m=+0.232910006 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:28:08 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:28:08 localhost podman[85827]: 2026-02-01 08:28:08.909832515 +0000 UTC m=+0.261623564 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 1 03:28:08 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:28:08 localhost podman[85826]: 2026-02-01 08:28:08.984616949 +0000 UTC m=+0.339551895 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 1 03:28:08 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:28:09 localhost systemd[1]: tmp-crun.aRr7y3.mount: Deactivated successfully. Feb 1 03:28:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:28:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
Feb 1 03:28:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:28:21 localhost systemd[1]: tmp-crun.utKKA6.mount: Deactivated successfully. Feb 1 03:28:21 localhost podman[85978]: 2026-02-01 08:28:21.75069267 +0000 UTC m=+0.103135632 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible) Feb 1 03:28:21 localhost systemd[1]: tmp-crun.RsX8xu.mount: Deactivated successfully. 
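The config_data=... label carried by every event above is a Python-literal dict (single-quoted strings, nested dicts and lists), so it can be recovered mechanically with ast.literal_eval once the balanced {...} span is sliced out of the label list. A sketch, assuming braces occur only as dict delimiters inside the label (which holds for the entries above):

    import ast

    def extract_config_data(line):
        """Slice the balanced {...} after 'config_data=' and evaluate it as a literal."""
        start = line.index("config_data=") + len("config_data=")
        depth = 0
        for i in range(start, len(line)):
            if line[i] == "{":
                depth += 1
            elif line[i] == "}":
                depth -= 1
                if depth == 0:
                    return ast.literal_eval(line[start:i + 1])
        raise ValueError("unbalanced config_data braces")

    # e.g. extract_config_data(line)["healthcheck"]["test"] -> '/openstack/healthcheck'
    # and  extract_config_data(line)["volumes"] lists the bind mounts shown above.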
Feb 1 03:28:21 localhost podman[85978]: 2026-02-01 08:28:21.819394805 +0000 UTC m=+0.171837797 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Feb 1 03:28:21 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
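Each probe emits two events from the same podman process: health_status when the check script returns and exec_died when the exec session is reaped, and both carry an m=+<seconds> offset measured from that process's start. Their difference approximates how long the exec session lingered after the verdict; for podman[85803] above, 0.491843686 - 0.098824968 is roughly 0.39 s. A sketch under those assumptions (pairing by PID works within a short window like this one, though PIDs do recycle over a long log):

    import re
    from collections import defaultdict

    EVENT_RE = re.compile(
        r"podman\[(?P<pid>\d+)\]: \S+ \S+ \+0000 UTC m=\+(?P<mono>[\d.]+) "
        r"container (?P<event>health_status|exec_died) (?P<cid>[0-9a-f]{64})"
    )

    def probe_spans(lines):
        """Map podman PID -> seconds between its first and last event."""
        offsets = defaultdict(list)
        for line in lines:
            m = EVENT_RE.search(line)
            if m:
                offsets[m["pid"]].append(float(m["mono"]))
        return {pid: max(v) - min(v) for pid, v in offsets.items() if len(v) > 1}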
Feb 1 03:28:21 localhost podman[85980]: 2026-02-01 08:28:21.847264738 +0000 UTC m=+0.193243230 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, release=1766032510) Feb 1 03:28:21 localhost podman[85979]: 2026-02-01 08:28:21.809564481 +0000 UTC m=+0.158598917 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, 
build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 1 03:28:21 localhost podman[85980]: 2026-02-01 08:28:21.880808125 +0000 UTC m=+0.226786617 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, version=17.1.13, io.buildah.version=1.41.5) Feb 1 03:28:21 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:28:21 localhost podman[85979]: 2026-02-01 08:28:21.895619903 +0000 UTC m=+0.244654349 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, managed_by=tripleo_ansible, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:28:21 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:28:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:28:29 localhost recover_tripleo_nova_virtqemud[86044]: 61284 Feb 1 03:28:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:28:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:28:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:28:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:28:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:28:30 localhost systemd[1]: tmp-crun.isgdmS.mount: Deactivated successfully. 
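The tripleo_nova_virtqemud_recover entries above show a oneshot watchdog pattern: a unit runs a check script, the script logs a bare PID (61284; presumably the virtqemud process it verified, though the script body is not shown in this log), and the unit deactivates until its next trigger. A hypothetical liveness probe in the same spirit (the unit and service names come from the log; the logic below is an assumption, not the actual recover script):

    import os

    def pid_alive(pid: int) -> bool:
        """Signal 0 probes for existence without affecting the process."""
        try:
            os.kill(pid, 0)
        except ProcessLookupError:
            return False
        except PermissionError:
            return True  # process exists but belongs to another user
        return True

    # Assumed behaviour: the recover script prints the PID it verified (61284 above)
    # and would restart tripleo_nova_virtqemud if the check failed (assumption).
    if not pid_alive(61284):
        print("virtqemud gone; recovery needed")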
Feb 1 03:28:30 localhost podman[86045]: 2026-02-01 08:28:30.72207557 +0000 UTC m=+0.082676989 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5) Feb 1 03:28:30 localhost podman[86047]: 2026-02-01 08:28:30.733224415 +0000 UTC m=+0.091082959 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 1 03:28:30 localhost podman[86046]: 2026-02-01 08:28:30.707029394 +0000 UTC m=+0.069509801 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vcs-type=git) Feb 1 03:28:30 localhost podman[86045]: 2026-02-01 08:28:30.773407718 +0000 UTC m=+0.134009157 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 1 03:28:30 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:28:30 localhost podman[86046]: 2026-02-01 08:28:30.78639206 +0000 UTC m=+0.148872477 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git) Feb 1 03:28:30 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:28:30 localhost podman[86047]: 2026-02-01 08:28:30.811041502 +0000 UTC m=+0.168900136 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi) Feb 1 03:28:30 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:28:31 localhost systemd[1]: tmp-crun.4VUZZW.mount: Deactivated successfully. Feb 1 03:28:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
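Note the two clocks in each entry: the syslog prefix is host-local time (Feb 1 03:28:30) while podman prints UTC (2026-02-01 08:28:30 +0000), so this host sits at UTC-5. Correlating the two streams means normalizing the syslog prefix, supplying the year and the offset, since the prefix carries neither. A sketch with both supplied as explicit assumptions:

    from datetime import datetime, timedelta, timezone

    def syslog_to_utc(prefix, year=2026, utc_offset_hours=-5):
        """Parse 'Feb 1 03:28:30' (no year, host-local) into an aware UTC datetime."""
        local = datetime.strptime(f"{year} {prefix}", "%Y %b %d %H:%M:%S")
        tz = timezone(timedelta(hours=utc_offset_hours))
        return local.replace(tzinfo=tz).astimezone(timezone.utc)

    # syslog_to_utc("Feb 1 03:28:30") -> 2026-02-01 08:28:30+00:00, matching podman.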
Feb 1 03:28:35 localhost podman[86116]: 2026-02-01 08:28:35.7231928 +0000 UTC m=+0.085682072 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:28:36 localhost podman[86116]: 2026-02-01 08:28:36.096727315 +0000 UTC m=+0.459216587 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4) Feb 1 03:28:36 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:28:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:28:39 localhost systemd[1]: tmp-crun.5qikCh.mount: Deactivated successfully. 
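The same container IDs recur on a roughly 30-second beat (metrics_qdr at 03:28:08 and 03:28:39, nova_migration_target at 03:28:04 and 03:28:35), consistent with podman's default 30 s healthcheck interval plus run time and scheduling jitter. A sketch that measures the observed cadence per container from the UTC timestamps in the health_status events:

    import re
    from collections import defaultdict
    from datetime import datetime

    TS_RE = re.compile(
        r"(?P<utc>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\.\d+ \+0000 UTC .*? "
        r"container health_status [0-9a-f]{64} \(image=[^,]+, name=(?P<name>[^,]+)"
    )

    def cadence(lines):
        """Seconds between successive health probes, keyed by container name."""
        seen = defaultdict(list)
        for line in lines:
            m = TS_RE.search(line)
            if m:
                seen[m["name"]].append(
                    datetime.strptime(m["utc"], "%Y-%m-%d %H:%M:%S"))
        return {name: [(b - a).total_seconds() for a, b in zip(ts, ts[1:])]
                for name, ts in seen.items()}

Fed the events above, this reports intervals of about 31 s for metrics_qdr, ovn_controller, and nova_migration_target.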
Feb 1 03:28:39 localhost podman[86137]: 2026-02-01 08:28:39.760124016 +0000 UTC m=+0.112460740 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=)
Feb 1 03:28:39 localhost podman[86138]: 2026-02-01 08:28:39.789261107 +0000 UTC m=+0.138331940 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:28:39 localhost podman[86138]: 2026-02-01 08:28:39.846037264 +0000 UTC m=+0.195108087 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:56:19Z, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 1 03:28:39 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:28:39 localhost podman[86139]: 2026-02-01 08:28:39.850659806 +0000 UTC m=+0.196278713 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 1 03:28:39 localhost podman[86139]: 2026-02-01 08:28:39.93028294 +0000 UTC m=+0.275901877 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 1 03:28:39 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:28:39 localhost podman[86137]: 2026-02-01 08:28:39.969744901 +0000 UTC m=+0.322081635 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 1 03:28:39 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:28:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:28:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:28:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:28:52 localhost podman[86259]: 2026-02-01 08:28:52.740542227 +0000 UTC m=+0.094923616 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5)
Feb 1 03:28:52 localhost podman[86259]: 2026-02-01 08:28:52.776379686 +0000 UTC m=+0.130761065 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, tcib_managed=true, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:28:52 localhost podman[86258]: 2026-02-01 08:28:52.788569494 +0000 UTC m=+0.148560677 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, container_name=collectd)
Feb 1 03:28:52 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:28:52 localhost podman[86258]: 2026-02-01 08:28:52.832571984 +0000 UTC m=+0.192563177 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:28:52 localhost systemd[1]: tmp-crun.CL7nqT.mount: Deactivated successfully.
Feb 1 03:28:52 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:28:52 localhost podman[86260]: 2026-02-01 08:28:52.848786916 +0000 UTC m=+0.202490455 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=)
Feb 1 03:28:52 localhost podman[86260]: 2026-02-01 08:28:52.859739335 +0000 UTC m=+0.213442914 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64)
Feb 1 03:28:52 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:29:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:29:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:29:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:29:01 localhost systemd[1]: tmp-crun.Z42HnR.mount: Deactivated successfully.
Feb 1 03:29:01 localhost systemd[1]: tmp-crun.TSihOs.mount: Deactivated successfully.
Feb 1 03:29:01 localhost podman[86325]: 2026-02-01 08:29:01.777947771 +0000 UTC m=+0.132845470 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4)
Feb 1 03:29:01 localhost podman[86327]: 2026-02-01 08:29:01.843134278 +0000 UTC m=+0.192054963 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 1 03:29:01 localhost podman[86326]: 2026-02-01 08:29:01.809097375 +0000 UTC m=+0.161449875 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, tcib_managed=true, com.redhat.component=openstack-cron-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13)
Feb 1 03:29:01 localhost podman[86326]: 2026-02-01 08:29:01.890686259 +0000 UTC m=+0.243038709 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:29:01 localhost podman[86327]: 2026-02-01 08:29:01.901287216 +0000 UTC m=+0.250207881 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 1 03:29:01 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:29:01 localhost podman[86325]: 2026-02-01 08:29:01.913173625 +0000 UTC m=+0.268071294 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc.)
Feb 1 03:29:01 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:29:01 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:29:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:29:06 localhost systemd[1]: tmp-crun.F6wBZt.mount: Deactivated successfully.
Feb 1 03:29:06 localhost podman[86397]: 2026-02-01 08:29:06.705357262 +0000 UTC m=+0.068989574 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-type=git)
Feb 1 03:29:08 localhost podman[86397]: 2026-02-01 08:29:08.227357797 +0000 UTC m=+1.590990119 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:29:08 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:29:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:29:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:29:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:29:10 localhost podman[86420]: 2026-02-01 08:29:10.755684551 +0000 UTC m=+0.090344755 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z) Feb 1 03:29:10 localhost podman[86422]: 2026-02-01 08:29:10.809179886 +0000 UTC m=+0.135535693 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:29:10 localhost podman[86422]: 2026-02-01 08:29:10.868363187 +0000 UTC m=+0.194718994 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:29:10 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:29:10 localhost podman[86421]: 2026-02-01 08:29:10.87233559 +0000 UTC m=+0.202004059 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:29:10 localhost podman[86421]: 2026-02-01 08:29:10.95636698 +0000 UTC m=+0.286035379 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=) Feb 1 03:29:10 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
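Every health_status event in this excerpt carries its verdict inline right after the container name, so the stream can be summarized with a small parser. A sketch under the assumption that the label order seen here (image=..., name=<container>, health_status=<verdict>) holds for all events, which it does throughout this excerpt; the input path is an assumption, not taken from the log:

    import re
    from collections import Counter

    # Matches the opening of each health_status event as logged above.
    EVENT = re.compile(
        r"container health_status [0-9a-f]{64} "
        r"\(image=[^,]+, name=([^,]+), health_status=(\w+)"
    )

    def tally(log_text: str) -> Counter:
        # Count (container, verdict) pairs across the whole stream.
        return Counter(EVENT.findall(log_text))

    if __name__ == "__main__":
        with open("/var/log/messages") as fh:  # path assumed
            for (name, status), n in sorted(tally(fh.read()).items()):
                print(f"{name}: {status} x{n}")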
Feb 1 03:29:10 localhost podman[86420]: 2026-02-01 08:29:10.978454193 +0000 UTC m=+0.313114367 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, url=https://www.redhat.com) Feb 1 03:29:11 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:29:11 localhost systemd[1]: tmp-crun.39hkGt.mount: Deactivated successfully. Feb 1 03:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:29:23 localhost systemd[83465]: Created slice User Background Tasks Slice. Feb 1 03:29:23 localhost systemd[83465]: Starting Cleanup of User's Temporary Files and Directories... 
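The m=+<seconds> values in these events are monotonic offsets within one podman process, so pairing each health_status with the exec_died logged by the same PID for the same container ID approximates how long a check took: for podman[86397] above, 1.590990119 - 0.068989574 gives roughly 1.52 s for nova_migration_target, versus about 0.22 s for metrics_qdr from podman[86420]. A sketch of that pairing (function and variable names are mine):

    import re
    from collections import defaultdict

    STAMP = re.compile(
        r"podman\[(\d+)\]: \S+ \S+ \S+ UTC m=\+([\d.]+) "
        r"container (health_status|exec_died) ([0-9a-f]{64})"
    )

    def check_durations(log_text: str) -> dict:
        started = {}
        durations = defaultdict(list)
        for pid, offset, event, cid in STAMP.findall(log_text):
            key = (pid, cid)
            if event == "health_status":
                started[key] = float(offset)
            elif key in started:
                # exec_died from the same process closes the check.
                durations[cid[:12]].append(float(offset) - started.pop(key))
        return durations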
Feb 1 03:29:23 localhost podman[86572]: 2026-02-01 08:29:23.740995346 +0000 UTC m=+0.095450955 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z) Feb 1 03:29:23 localhost systemd[83465]: Finished Cleanup of User's Temporary Files and Directories. 
Feb 1 03:29:23 localhost podman[86572]: 2026-02-01 08:29:23.750903232 +0000 UTC m=+0.105358841 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, container_name=collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z) Feb 1 03:29:23 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
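The config_data label in these events is the tripleo_ansible container definition serialized as a Python literal (single-quoted strings, bare True/False), so Python itself can parse it back once the balanced {...} span is sliced out of the event text. A sketch, with the helper name my own:

    import ast

    def extract_config_data(event_text: str) -> dict:
        start = event_text.index("config_data=") + len("config_data=")
        depth = 0
        for i, ch in enumerate(event_text[start:], start):
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:
                    # Literal contains only dicts, lists, strings,
                    # ints, and booleans, so literal_eval is safe here.
                    return ast.literal_eval(event_text[start : i + 1])
        raise ValueError("unbalanced config_data literal")

For the collectd event above, extract_config_data(...)["volumes"] recovers the bind-mount list and ["healthcheck"]["test"] the command behind the health_status verdict.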
Feb 1 03:29:23 localhost podman[86573]: 2026-02-01 08:29:23.843824246 +0000 UTC m=+0.197743838 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible) Feb 1 03:29:23 localhost podman[86574]: 2026-02-01 08:29:23.896903048 +0000 UTC m=+0.249067186 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3) Feb 1 03:29:23 localhost podman[86574]: 2026-02-01 08:29:23.904989128 +0000 UTC m=+0.257153286 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, 
architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true) Feb 1 03:29:23 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:29:23 localhost podman[86573]: 2026-02-01 08:29:23.950848607 +0000 UTC m=+0.304768169 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public) Feb 1 03:29:23 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:29:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:29:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:29:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:29:32 localhost systemd[1]: tmp-crun.8DG25w.mount: Deactivated successfully. 
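Most containers here run the bare /openstack/healthcheck script, but nova_compute passes 5672 and ovn_controller 6642. Those read as port arguments for a connection check (5672 is the usual AMQP port, 6642 the OVN southbound DB port), though that interpretation rests on the command strings alone. A sketch that maps each container to its check command, assuming the field order seen in this excerpt, where the container name and verdict precede the config_data of the same event:

    import re

    TEST = re.compile(
        r"name=([^,]+), health_status=\w+"
        r".*?'healthcheck': \{'test': '([^']+)'\}",
        re.S,  # events span lines in this file
    )

    def health_commands(log_text: str) -> dict:
        return {name: cmd for name, cmd in TEST.findall(log_text)}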
Feb 1 03:29:32 localhost podman[86640]: 2026-02-01 08:29:32.790207197 +0000 UTC m=+0.139947990 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, release=1766032510) Feb 1 03:29:32 localhost podman[86641]: 2026-02-01 08:29:32.845797037 +0000 UTC m=+0.193059273 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Feb 1 03:29:32 localhost podman[86639]: 2026-02-01 08:29:32.759943301 +0000 UTC m=+0.116681481 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.buildah.version=1.41.5) Feb 1 03:29:32 localhost podman[86640]: 2026-02-01 08:29:32.872924766 +0000 UTC m=+0.222665559 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:29:32 localhost systemd[1]: 
93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:29:32 localhost podman[86639]: 2026-02-01 08:29:32.890663865 +0000 UTC m=+0.247402025 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, version=17.1.13, distribution-scope=public, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:29:32 localhost podman[86641]: 2026-02-01 08:29:32.903030497 +0000 UTC m=+0.250292663 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4) Feb 1 03:29:32 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:29:32 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:29:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
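Each "Started /usr/bin/podman healthcheck run <ID>" record is a transient <ID>.service; on systemd hosts podman pairs it with a matching timer unit, which is what produces the roughly 30-second cadence visible here (nova_migration_target at 03:29:08 and again at 03:29:38, metrics_qdr at 03:29:10 and 03:29:41). A sketch that lists those timers on such a host, with the 64-hex naming convention inferred from the unit names in these logs:

    import re
    import subprocess

    HEXID = re.compile(r"[0-9a-f]{64}\.timer")

    def healthcheck_timers() -> list:
        out = subprocess.run(
            ["systemctl", "list-timers", "--all", "--no-pager"],
            capture_output=True,
            text=True,
            check=True,
        )
        return [line for line in out.stdout.splitlines() if HEXID.search(line)]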
Feb 1 03:29:38 localhost podman[86711]: 2026-02-01 08:29:38.731848274 +0000 UTC m=+0.084654791 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=nova_migration_target, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:29:39 localhost podman[86711]: 2026-02-01 08:29:39.114444299 +0000 UTC m=+0.467250876 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 1 03:29:39 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:29:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:29:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:29:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:29:41 localhost podman[86733]: 2026-02-01 08:29:41.738214738 +0000 UTC m=+0.095523726 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, config_id=tripleo_step1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:29:41 localhost systemd[1]: tmp-crun.p1fG6y.mount: Deactivated successfully. 
Feb 1 03:29:41 localhost podman[86734]: 2026-02-01 08:29:41.846791397 +0000 UTC m=+0.202574978 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4) Feb 1 03:29:41 localhost podman[86735]: 2026-02-01 08:29:41.804834029 +0000 UTC m=+0.154771449 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., container_name=ovn_controller, batch=17.1_20260112.1) Feb 1 03:29:41 localhost podman[86735]: 2026-02-01 08:29:41.884419621 +0000 UTC m=+0.234357031 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1) Feb 1 03:29:41 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:29:41 localhost podman[86734]: 2026-02-01 08:29:41.918838816 +0000 UTC m=+0.274622387 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Feb 1 03:29:41 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:29:41 localhost podman[86733]: 2026-02-01 08:29:41.977674286 +0000 UTC m=+0.334983254 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public) Feb 1 03:29:41 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:29:42 localhost systemd[1]: tmp-crun.kaR6ho.mount: Deactivated successfully. Feb 1 03:29:48 localhost sshd[86853]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
Feb 1 03:29:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:29:54 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:29:54 localhost recover_tripleo_nova_virtqemud[86862]: 61284 Feb 1 03:29:54 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:29:54 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:29:54 localhost systemd[1]: tmp-crun.qoYe8O.mount: Deactivated successfully. Feb 1 03:29:54 localhost podman[86856]: 2026-02-01 08:29:54.74229519 +0000 UTC m=+0.091244744 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute) Feb 1 03:29:54 localhost podman[86856]: 2026-02-01 08:29:54.766388195 +0000 UTC m=+0.115337749 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:29:54 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:29:54 localhost systemd[1]: tmp-crun.zR4dkM.mount: Deactivated successfully. Feb 1 03:29:54 localhost podman[86855]: 2026-02-01 08:29:54.852171049 +0000 UTC m=+0.202614099 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3) Feb 1 03:29:54 localhost podman[86855]: 2026-02-01 08:29:54.869276628 +0000 UTC m=+0.219719698 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:29:54 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:29:54 localhost podman[86857]: 2026-02-01 08:29:54.952878795 +0000 UTC m=+0.301769887 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:29:54 localhost podman[86857]: 2026-02-01 08:29:54.991291383 +0000 UTC m=+0.340182455 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-iscsid) Feb 1 03:29:55 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:30:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:30:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:30:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:30:03 localhost systemd[1]: tmp-crun.OZbCQ6.mount: Deactivated successfully. 
Feb 1 03:30:03 localhost podman[86919]: 2026-02-01 08:30:03.740133419 +0000 UTC m=+0.099823892 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, release=1766032510) Feb 1 03:30:03 localhost podman[86919]: 2026-02-01 08:30:03.775339018 +0000 UTC m=+0.135029571 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:30:03 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:30:03 localhost podman[86921]: 2026-02-01 08:30:03.798038834 +0000 UTC m=+0.149883677 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:30:03 localhost podman[86921]: 2026-02-01 08:30:03.83211516 +0000 UTC m=+0.183960013 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc.) Feb 1 03:30:03 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:30:03 localhost podman[86920]: 2026-02-01 08:30:03.853227217 +0000 UTC m=+0.206940817 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container) Feb 1 03:30:03 localhost podman[86920]: 2026-02-01 08:30:03.862895044 +0000 UTC m=+0.216608674 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, release=1766032510, io.buildah.version=1.41.5, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Feb 1 03:30:03 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:30:04 localhost systemd[1]: tmp-crun.lko9RI.mount: Deactivated successfully. Feb 1 03:30:06 localhost sshd[86994]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:30:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:30:09 localhost systemd[1]: tmp-crun.wkF1vh.mount: Deactivated successfully. 
Feb 1 03:30:09 localhost podman[86996]: 2026-02-01 08:30:09.732264797 +0000 UTC m=+0.100433511 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, tcib_managed=true) Feb 1 03:30:10 localhost podman[86996]: 2026-02-01 08:30:10.110475155 +0000 UTC m=+0.478643899 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:30:10 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:30:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:30:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:30:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:30:12 localhost systemd[1]: tmp-crun.0RnTce.mount: Deactivated successfully. 
Feb 1 03:30:12 localhost podman[87019]: 2026-02-01 08:30:12.746269052 +0000 UTC m=+0.095419057 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 1 03:30:12 localhost podman[87021]: 2026-02-01 08:30:12.802076553 +0000 UTC m=+0.148518295 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 1 03:30:12 localhost podman[87018]: 2026-02-01 08:30:12.714616051 +0000 UTC m=+0.072057070 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, build-date=2026-01-12T22:10:14Z)
Feb 1 03:30:12 localhost podman[87021]: 2026-02-01 08:30:12.827557785 +0000 UTC m=+0.173999567 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, container_name=ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public)
Feb 1 03:30:12 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:30:12 localhost podman[87019]: 2026-02-01 08:30:12.883882262 +0000 UTC m=+0.233032257 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 1 03:30:12 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:30:12 localhost podman[87018]: 2026-02-01 08:30:12.951596698 +0000 UTC m=+0.309037717 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, batch=17.1_20260112.1)
Feb 1 03:30:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:30:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:30:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:30:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:30:25 localhost podman[87169]: 2026-02-01 08:30:25.73857644 +0000 UTC m=+0.085856933 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, distribution-scope=public, vcs-type=git, container_name=collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13)
Feb 1 03:30:25 localhost podman[87171]: 2026-02-01 08:30:25.787739718 +0000 UTC m=+0.129706738 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 1 03:30:25 localhost podman[87169]: 2026-02-01 08:30:25.802786759 +0000 UTC m=+0.150067242 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step3)
Feb 1 03:30:25 localhost podman[87171]: 2026-02-01 08:30:25.823914518 +0000 UTC m=+0.165881488 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step3, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:34:43Z, release=1766032510, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 1 03:30:25 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:30:25 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:30:25 localhost podman[87170]: 2026-02-01 08:30:25.870113194 +0000 UTC m=+0.213971932 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, version=17.1.13, release=1766032510)
Feb 1 03:30:25 localhost podman[87170]: 2026-02-01 08:30:25.924975927 +0000 UTC m=+0.268834665 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step5, version=17.1.13, distribution-scope=public, release=1766032510, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true)
Feb 1 03:30:25 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:30:26 localhost systemd[1]: tmp-crun.tmifFF.mount: Deactivated successfully.
Feb 1 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:30:34 localhost systemd[1]: tmp-crun.rqXJEM.mount: Deactivated successfully.
Feb 1 03:30:34 localhost podman[87235]: 2026-02-01 08:30:34.732894211 +0000 UTC m=+0.088278237 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 1 03:30:34 localhost podman[87235]: 2026-02-01 08:30:34.768579586 +0000 UTC m=+0.123963602 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:30:34 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:30:34 localhost podman[87234]: 2026-02-01 08:30:34.796921295 +0000 UTC m=+0.153396395 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:30:34 localhost podman[87234]: 2026-02-01 08:30:34.864554599 +0000 UTC m=+0.221029749 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 1 03:30:34 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:30:34 localhost podman[87236]: 2026-02-01 08:30:34.866936202 +0000 UTC m=+0.216297334 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=)
Feb 1 03:30:34 localhost podman[87236]: 2026-02-01 08:30:34.949640497 +0000 UTC m=+0.299001599 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:30:34 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:30:35 localhost systemd[1]: tmp-crun.VqpqwD.mount: Deactivated successfully.
Feb 1 03:30:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:30:40 localhost podman[87302]: 2026-02-01 08:30:40.732665325 +0000 UTC m=+0.083515252 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:30:41 localhost podman[87302]: 2026-02-01 08:30:41.081422339 +0000 UTC m=+0.432272266 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 1 03:30:41 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:30:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:30:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:30:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:30:43 localhost podman[87347]: 2026-02-01 08:30:43.706526677 +0000 UTC m=+0.065560011 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:30:43 localhost systemd[1]: tmp-crun.Lkilzq.mount: Deactivated successfully. 
Feb 1 03:30:43 localhost podman[87345]: 2026-02-01 08:30:43.810535008 +0000 UTC m=+0.172102619 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, container_name=metrics_qdr, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 1 03:30:43 localhost podman[87346]: 2026-02-01 08:30:43.781024562 +0000 UTC m=+0.141815840 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:30:43 localhost podman[87347]: 2026-02-01 08:30:43.841342932 +0000 UTC m=+0.200376236 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, 
name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4) Feb 1 03:30:43 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:30:43 localhost podman[87346]: 2026-02-01 08:30:43.863513741 +0000 UTC m=+0.224305039 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510) Feb 1 03:30:43 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:30:44 localhost podman[87345]: 2026-02-01 08:30:44.032900796 +0000 UTC m=+0.394468497 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:30:44 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 03:30:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:30:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3000.1 total, 600.0 interval
Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 517 writes, 2082 keys, 517 commit groups, 1.0 writes per commit group, ingest: 2.43 MB, 0.00 MB/s
Interval WAL: 517 writes, 181 syncs, 2.86 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:30:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:30:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3000.1 total, 600.0 interval
Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 456 writes, 1686 keys, 456 commit groups, 1.0 writes per commit group, ingest: 2.15 MB, 0.00 MB/s
Interval WAL: 456 writes, 186 syncs, 2.45 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:30:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:30:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:30:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:30:56 localhost systemd[1]: tmp-crun.p1zfgm.mount: Deactivated successfully.
Feb 1 03:30:56 localhost podman[87445]: 2026-02-01 08:30:56.727085473 +0000 UTC m=+0.080514359 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:30:56 localhost systemd[1]: tmp-crun.8Tt2re.mount: Deactivated successfully. 
Feb 1 03:30:56 localhost podman[87444]: 2026-02-01 08:30:56.792340445 +0000 UTC m=+0.144023808 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, batch=17.1_20260112.1) Feb 1 03:30:56 localhost podman[87446]: 2026-02-01 08:30:56.759410165 +0000 UTC m=+0.106787996 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 1 03:30:56 localhost podman[87445]: 2026-02-01 08:30:56.811334147 +0000 UTC m=+0.164763033 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 1 03:30:56 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:30:56 localhost podman[87444]: 2026-02-01 08:30:56.824320415 +0000 UTC m=+0.176003778 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 1 03:30:56 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:30:56 localhost podman[87446]: 2026-02-01 08:30:56.888836664 +0000 UTC m=+0.236214425 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:30:56 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:31:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:31:05 localhost systemd[1]: tmp-crun.ttOwJY.mount: Deactivated successfully. 
Feb 1 03:31:05 localhost podman[87509]: 2026-02-01 08:31:05.738273171 +0000 UTC m=+0.095790359 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron) Feb 1 03:31:05 localhost podman[87509]: 2026-02-01 08:31:05.749547906 +0000 UTC m=+0.107065054 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 1 03:31:05 localhost podman[87508]: 2026-02-01 08:31:05.795144584 +0000 UTC m=+0.152721623 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:31:05 localhost podman[87508]: 2026-02-01 08:31:05.827975041 +0000 UTC m=+0.185552110 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 1 03:31:05 localhost podman[87510]: 2026-02-01 08:31:05.839326959 +0000 UTC m=+0.190935425 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:31:05 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:31:05 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:31:05 localhost podman[87510]: 2026-02-01 08:31:05.893374577 +0000 UTC m=+0.244983033 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:31:05 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:31:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:31:11 localhost systemd[1]: tmp-crun.qXtH2o.mount: Deactivated successfully. 
Feb 1 03:31:11 localhost podman[87580]: 2026-02-01 08:31:11.740603543 +0000 UTC m=+0.096423118 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:31:12 localhost podman[87580]: 2026-02-01 08:31:12.124485485 +0000 UTC m=+0.480305080 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team) Feb 1 03:31:12 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:31:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:31:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:31:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:31:14 localhost podman[87603]: 2026-02-01 08:31:14.740091382 +0000 UTC m=+0.096613653 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Feb 1 03:31:14 localhost systemd[1]: tmp-crun.ojyABv.mount: Deactivated successfully. 
Feb 1 03:31:14 localhost podman[87604]: 2026-02-01 08:31:14.810598664 +0000 UTC m=+0.162968198 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:31:14 localhost podman[87605]: 2026-02-01 08:31:14.857550794 +0000 UTC m=+0.208166274 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:31:14 localhost podman[87604]: 2026-02-01 08:31:14.884524351 +0000 UTC m=+0.236893945 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:31:14 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:31:14 localhost podman[87605]: 2026-02-01 08:31:14.904550895 +0000 UTC m=+0.255166435 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 
ovn-controller, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 1 03:31:14 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:31:14 localhost podman[87603]: 2026-02-01 08:31:14.969382743 +0000 UTC m=+0.325904974 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5) Feb 1 03:31:14 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:31:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:31:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:31:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:31:27 localhost systemd[1]: tmp-crun.VHbMLI.mount: Deactivated successfully. 
Feb 1 03:31:27 localhost podman[87757]: 2026-02-01 08:31:27.747748631 +0000 UTC m=+0.103256248 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, container_name=collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:31:27 localhost podman[87758]: 2026-02-01 08:31:27.784147567 +0000 UTC m=+0.138263711 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:31:27 localhost podman[87758]: 2026-02-01 08:31:27.820830982 +0000 UTC m=+0.174947096 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute) Feb 1 03:31:27 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:31:27 localhost podman[87757]: 2026-02-01 08:31:27.880082669 +0000 UTC m=+0.235590236 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:31:27 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:31:27 localhost podman[87759]: 2026-02-01 08:31:27.895406868 +0000 UTC m=+0.244995733 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:34:43Z, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 1 03:31:27 localhost podman[87759]: 2026-02-01 08:31:27.937400706 +0000 UTC m=+0.286989551 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:31:27 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:31:36 localhost systemd[1]: tmp-crun.JC0HB1.mount: Deactivated successfully. 
Feb 1 03:31:36 localhost podman[87821]: 2026-02-01 08:31:36.73450779 +0000 UTC m=+0.093244921 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.) Feb 1 03:31:36 localhost systemd[1]: tmp-crun.cMzWof.mount: Deactivated successfully. 
Feb 1 03:31:36 localhost podman[87822]: 2026-02-01 08:31:36.76059846 +0000 UTC m=+0.114704038 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:31:36 localhost podman[87821]: 2026-02-01 08:31:36.801627988 +0000 UTC m=+0.160365109 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:31:36 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:31:36 localhost podman[87820]: 2026-02-01 08:31:36.85159812 +0000 UTC m=+0.208560256 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_compute) Feb 1 03:31:36 localhost podman[87822]: 2026-02-01 08:31:36.869981714 +0000 UTC m=+0.224087212 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13) Feb 1 03:31:36 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:31:36 localhost podman[87820]: 2026-02-01 08:31:36.881510318 +0000 UTC m=+0.238472394 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:31:36 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:31:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:31:42 localhost systemd[1]: tmp-crun.VkOYm1.mount: Deactivated successfully. 
Feb 1 03:31:42 localhost podman[87889]: 2026-02-01 08:31:42.740245264 +0000 UTC m=+0.094958073 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc.) 
Feb 1 03:31:43 localhost podman[87889]: 2026-02-01 08:31:43.141638454 +0000 UTC m=+0.496351183 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public) Feb 1 03:31:43 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:31:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:31:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:31:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:31:45 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:31:45 localhost recover_tripleo_nova_virtqemud[87975]: 61284 Feb 1 03:31:45 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 1 03:31:45 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:31:45 localhost systemd[1]: tmp-crun.FgtXG2.mount: Deactivated successfully. Feb 1 03:31:45 localhost podman[87956]: 2026-02-01 08:31:45.71486102 +0000 UTC m=+0.080640704 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:31:45 localhost podman[87956]: 2026-02-01 08:31:45.765481442 +0000 UTC m=+0.131261086 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:31:45 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:31:45 localhost podman[87955]: 2026-02-01 08:31:45.811896406 +0000 UTC m=+0.175592836 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com) Feb 1 03:31:45 localhost podman[87957]: 2026-02-01 08:31:45.768376321 +0000 UTC m=+0.130542053 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, 
io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, container_name=ovn_controller, config_id=tripleo_step4) Feb 1 03:31:45 localhost podman[87957]: 2026-02-01 08:31:45.847439626 +0000 UTC m=+0.209605378 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, 
release=1766032510) Feb 1 03:31:45 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:31:46 localhost podman[87955]: 2026-02-01 08:31:46.027466797 +0000 UTC m=+0.391163267 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:31:46 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:31:46 localhost systemd[1]: tmp-crun.gskdu2.mount: Deactivated successfully. Feb 1 03:31:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:31:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:31:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:31:58 localhost systemd[1]: tmp-crun.hTD3Sp.mount: Deactivated successfully. 
Feb 1 03:31:58 localhost podman[88031]: 2026-02-01 08:31:58.746248396 +0000 UTC m=+0.100410180 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:31:58 localhost systemd[1]: tmp-crun.lc57Xd.mount: Deactivated successfully. 
Feb 1 03:31:58 localhost podman[88030]: 2026-02-01 08:31:58.791573276 +0000 UTC m=+0.146193764 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1) Feb 1 03:31:58 localhost podman[88031]: 2026-02-01 08:31:58.803595044 +0000 UTC m=+0.157756758 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:31:58 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:31:58 localhost podman[88030]: 2026-02-01 08:31:58.854579269 +0000 UTC m=+0.209199777 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible) Feb 1 03:31:58 localhost podman[88032]: 2026-02-01 08:31:58.897746242 +0000 UTC m=+0.249595075 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:31:58 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:31:58 localhost podman[88032]: 2026-02-01 08:31:58.932386074 +0000 UTC m=+0.284234937 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:31:58 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:32:07 localhost podman[88093]: 2026-02-01 08:32:07.718483415 +0000 UTC m=+0.079401846 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:32:07 localhost podman[88093]: 2026-02-01 08:32:07.751440664 +0000 UTC m=+0.112359085 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 
17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:32:07 localhost systemd[1]: tmp-crun.3Cafhz.mount: Deactivated successfully. Feb 1 03:32:07 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:32:07 localhost podman[88094]: 2026-02-01 08:32:07.77116304 +0000 UTC m=+0.128663346 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond) Feb 1 03:32:07 localhost podman[88094]: 2026-02-01 08:32:07.80674111 +0000 UTC m=+0.164241406 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1) Feb 1 03:32:07 localhost podman[88095]: 2026-02-01 08:32:07.813633852 +0000 UTC m=+0.168371443 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc.) Feb 1 03:32:07 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:32:07 localhost podman[88095]: 2026-02-01 08:32:07.863579983 +0000 UTC m=+0.218317574 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Feb 1 03:32:07 
localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:32:13 localhost podman[88163]: 2026-02-01 08:32:13.730360257 +0000 UTC m=+0.084161441 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container) Feb 1 03:32:14 localhost podman[88163]: 2026-02-01 08:32:14.110715239 +0000 UTC m=+0.464516453 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, vcs-type=git, distribution-scope=public, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Feb 1 03:32:14 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:32:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:32:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:32:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:32:16 localhost systemd[1]: tmp-crun.cg5qXe.mount: Deactivated successfully. 
Feb 1 03:32:16 localhost podman[88187]: 2026-02-01 08:32:16.714063651 +0000 UTC m=+0.071818743 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:32:16 localhost podman[88187]: 2026-02-01 08:32:16.752970724 +0000 UTC m=+0.110725886 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 1 03:32:16 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:32:16 localhost podman[88188]: 2026-02-01 08:32:16.763150026 +0000 UTC m=+0.114058937 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:32:16 localhost podman[88186]: 2026-02-01 08:32:16.838919149 +0000 UTC m=+0.196357901 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1766032510, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 1 03:32:16 localhost podman[88188]: 2026-02-01 08:32:16.9035502 +0000 UTC m=+0.254459171 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1766032510, 
build-date=2026-01-12T22:36:40Z) Feb 1 03:32:16 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:32:17 localhost podman[88186]: 2026-02-01 08:32:17.056257222 +0000 UTC m=+0.413695994 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:32:17 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:32:22 localhost sshd[88372]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:32:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:32:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:32:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:32:29 localhost podman[88391]: 2026-02-01 08:32:29.740166377 +0000 UTC m=+0.090960791 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, container_name=iscsid, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5) Feb 1 03:32:29 localhost podman[88389]: 2026-02-01 08:32:29.789634063 +0000 UTC m=+0.142857441 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 
'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, vcs-type=git, release=1766032510, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com) Feb 1 03:32:29 localhost podman[88389]: 2026-02-01 08:32:29.799499136 +0000 UTC m=+0.152722464 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 1 03:32:29 localhost podman[88391]: 2026-02-01 08:32:29.807317756 +0000 UTC m=+0.158112170 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z) Feb 1 03:32:29 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:32:29 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:32:29 localhost podman[88390]: 2026-02-01 08:32:29.888087631 +0000 UTC m=+0.238859514 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1766032510, batch=17.1_20260112.1) Feb 1 03:32:29 localhost podman[88390]: 2026-02-01 08:32:29.939426575 +0000 UTC m=+0.290198448 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 1 03:32:29 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:32:38 localhost podman[88454]: 2026-02-01 08:32:38.744263207 +0000 UTC m=+0.097186061 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:32:38 localhost podman[88456]: 2026-02-01 08:32:38.790735841 +0000 UTC 
m=+0.135894117 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:32:38 localhost podman[88455]: 2026-02-01 08:32:38.843624233 +0000 UTC m=+0.193586456 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public) Feb 1 03:32:38 localhost podman[88456]: 2026-02-01 08:32:38.871759175 +0000 UTC m=+0.216917441 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:32:38 localhost podman[88455]: 2026-02-01 08:32:38.877239903 +0000 UTC m=+0.227202106 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:32:38 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:32:38 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:32:38 localhost podman[88454]: 2026-02-01 08:32:38.923477161 +0000 UTC m=+0.276400005 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:32:38 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:32:44 localhost podman[88523]: 2026-02-01 08:32:44.721019484 +0000 UTC m=+0.082693876 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 1 03:32:45 localhost podman[88523]: 2026-02-01 08:32:45.089327916 +0000 UTC m=+0.451002278 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5)
Feb 1 03:32:45 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:32:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:32:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:32:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:32:47 localhost systemd[1]: tmp-crun.P5LTIG.mount: Deactivated successfully.
Feb 1 03:32:47 localhost systemd[1]: tmp-crun.N0FN3f.mount: Deactivated successfully.
Feb 1 03:32:47 localhost podman[88570]: 2026-02-01 08:32:47.804106886 +0000 UTC m=+0.160753399 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container)
Feb 1 03:32:47 localhost podman[88572]: 2026-02-01 08:32:47.810508002 +0000 UTC m=+0.160340777 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 1 03:32:47 localhost podman[88571]: 2026-02-01 08:32:47.773138086 +0000 UTC m=+0.124772546 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.buildah.version=1.41.5)
Feb 1 03:32:47 localhost podman[88572]: 2026-02-01 08:32:47.833886049 +0000 UTC m=+0.183718804 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510)
Feb 1 03:32:47 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:32:47 localhost podman[88571]: 2026-02-01 08:32:47.858518924 +0000 UTC m=+0.210153354 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=ovn_metadata_agent)
Feb 1 03:32:47 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:32:48 localhost podman[88570]: 2026-02-01 08:32:48.020785838 +0000 UTC m=+0.377432331 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, version=17.1.13, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com)
Feb 1 03:32:48 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:32:48 localhost sshd[88647]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:32:59 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 03:32:59 localhost recover_tripleo_nova_virtqemud[88650]: 61284
Feb 1 03:32:59 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 03:32:59 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 03:33:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:33:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:33:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:33:00 localhost systemd[1]: tmp-crun.EMQdBB.mount: Deactivated successfully.
Feb 1 03:33:00 localhost podman[88651]: 2026-02-01 08:33:00.728479681 +0000 UTC m=+0.093464716 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 1 03:33:00 localhost systemd[1]: tmp-crun.sixLLl.mount: Deactivated successfully.
Feb 1 03:33:00 localhost podman[88652]: 2026-02-01 08:33:00.765769575 +0000 UTC m=+0.127705546 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:33:00 localhost podman[88653]: 2026-02-01 08:33:00.744434571 +0000 UTC m=+0.103112433 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5)
Feb 1 03:33:00 localhost podman[88651]: 2026-02-01 08:33:00.822515655 +0000 UTC m=+0.187500740 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, build-date=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5)
Feb 1 03:33:00 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:33:00 localhost podman[88652]: 2026-02-01 08:33:00.874463988 +0000 UTC m=+0.236399979 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, release=1766032510, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Feb 1 03:33:00 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:33:00 localhost podman[88653]: 2026-02-01 08:33:00.92869776 +0000 UTC m=+0.287375712 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:33:00 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:33:09 localhost systemd[1]: tmp-crun.FytgQi.mount: Deactivated successfully.
Feb 1 03:33:09 localhost podman[88715]: 2026-02-01 08:33:09.728656701 +0000 UTC m=+0.075019031 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:10:15Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 1 03:33:09 localhost podman[88716]: 2026-02-01 08:33:09.740842255 +0000 UTC m=+0.085471552 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 1 03:33:09 localhost podman[88715]: 2026-02-01 08:33:09.771079242 +0000 UTC m=+0.117441662 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 1 03:33:09 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:33:09 localhost podman[88714]: 2026-02-01 08:33:09.792394376 +0000 UTC m=+0.144288606 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:33:09 localhost podman[88716]: 2026-02-01 08:33:09.793428147 +0000 UTC m=+0.138057474 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:33:09 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:33:09 localhost podman[88714]: 2026-02-01 08:33:09.872163751 +0000 UTC m=+0.224058021 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 1 03:33:09 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:33:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:33:15 localhost systemd[1]: tmp-crun.qEuYA9.mount: Deactivated successfully.
Feb 1 03:33:15 localhost podman[88786]: 2026-02-01 08:33:15.730257301 +0000 UTC m=+0.090139245 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, container_name=nova_migration_target, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 1 03:33:16 localhost podman[88786]: 2026-02-01 08:33:16.105440913 +0000 UTC m=+0.465322877 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:33:16 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:33:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:33:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:33:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:33:18 localhost podman[88811]: 2026-02-01 08:33:18.739745487 +0000 UTC m=+0.091573089 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:33:18 localhost podman[88812]: 2026-02-01 08:33:18.790103211 +0000 UTC m=+0.136549308 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, version=17.1.13, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc.) 
Feb 1 03:33:18 localhost podman[88810]: 2026-02-01 08:33:18.846920292 +0000 UTC m=+0.198387383 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 1 03:33:18 localhost podman[88811]: 2026-02-01 08:33:18.860408596 +0000 UTC m=+0.212236208 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4) Feb 1 03:33:18 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:33:18 localhost podman[88812]: 2026-02-01 08:33:18.911874503 +0000 UTC m=+0.258320550 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible) Feb 1 03:33:18 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:33:19 localhost podman[88810]: 2026-02-01 08:33:19.044423758 +0000 UTC m=+0.395890789 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z) Feb 1 03:33:19 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:33:31 localhost systemd[1]: tmp-crun.rjnGxi.mount: Deactivated successfully. 
Feb 1 03:33:31 localhost podman[88963]: 2026-02-01 08:33:31.846230825 +0000 UTC m=+0.193908396 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, config_id=tripleo_step3) Feb 1 03:33:31 localhost podman[88963]: 2026-02-01 08:33:31.859319077 +0000 UTC m=+0.206996618 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
batch=17.1_20260112.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public) Feb 1 03:33:31 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:33:31 localhost podman[88962]: 2026-02-01 08:33:31.759287519 +0000 UTC m=+0.110414515 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, container_name=nova_compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team) Feb 1 03:33:31 localhost podman[88962]: 2026-02-01 08:33:31.944645493 +0000 UTC m=+0.295772459 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=) Feb 1 03:33:31 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:33:31 localhost podman[88961]: 2026-02-01 08:33:31.811714997 +0000 UTC m=+0.165172125 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, config_id=tripleo_step3, container_name=collectd) Feb 1 03:33:31 localhost podman[88961]: 2026-02-01 08:33:31.998635318 +0000 UTC m=+0.352092486 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, release=1766032510) Feb 1 03:33:32 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:33:32 localhost systemd[1]: tmp-crun.yNwrUz.mount: Deactivated successfully. Feb 1 03:33:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:33:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:33:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:33:40 localhost systemd[1]: tmp-crun.WgF5fh.mount: Deactivated successfully. 
Feb 1 03:33:40 localhost podman[89026]: 2026-02-01 08:33:40.718261544 +0000 UTC m=+0.074684000 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:33:40 localhost podman[89027]: 2026-02-01 08:33:40.776288794 +0000 UTC m=+0.128500241 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, distribution-scope=public) Feb 1 03:33:40 localhost podman[89026]: 2026-02-01 08:33:40.804511459 +0000 UTC m=+0.160933955 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, distribution-scope=public) Feb 1 03:33:40 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:33:40 localhost podman[89028]: 2026-02-01 08:33:40.755453175 +0000 UTC m=+0.101768661 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 1 03:33:40 localhost podman[89027]: 2026-02-01 08:33:40.858758193 +0000 UTC m=+0.210969640 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, distribution-scope=public) Feb 1 03:33:40 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:33:40 localhost podman[89028]: 2026-02-01 08:33:40.888526785 +0000 UTC m=+0.234842291 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5) Feb 1 03:33:40 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:33:46 localhost podman[89114]: 2026-02-01 08:33:46.704405667 +0000 UTC m=+0.067495520 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:33:47 localhost podman[89114]: 2026-02-01 08:33:47.091372901 +0000 UTC m=+0.454462784 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4) Feb 1 03:33:47 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:33:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:33:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:33:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:33:49 localhost podman[89142]: 2026-02-01 08:33:49.744943024 +0000 UTC m=+0.086543554 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5) Feb 1 03:33:49 localhost podman[89140]: 2026-02-01 08:33:49.807222183 +0000 UTC m=+0.155275282 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, container_name=metrics_qdr, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com) Feb 1 03:33:49 localhost systemd[1]: tmp-crun.19C8gc.mount: Deactivated successfully. Feb 1 03:33:49 localhost podman[89141]: 2026-02-01 08:33:49.876631912 +0000 UTC m=+0.222657978 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:33:49 localhost podman[89141]: 2026-02-01 08:33:49.920528097 +0000 UTC m=+0.266554153 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red 
Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:33:49 localhost podman[89142]: 2026-02-01 08:33:49.930327478 +0000 UTC m=+0.271928068 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git) Feb 1 03:33:49 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:33:49 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:33:50 localhost podman[89140]: 2026-02-01 08:33:50.036099541 +0000 UTC m=+0.384152630 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc.) Feb 1 03:33:50 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:34:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:34:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:34:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:34:02 localhost systemd[1]: tmp-crun.VcMg8M.mount: Deactivated successfully. 
Feb 1 03:34:02 localhost podman[89214]: 2026-02-01 08:34:02.752529362 +0000 UTC m=+0.105272679 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, container_name=collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:34:02 localhost podman[89215]: 2026-02-01 08:34:02.77888512 +0000 UTC m=+0.129515532 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, container_name=nova_compute, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 1 03:34:02 localhost podman[89214]: 2026-02-01 08:34:02.839487428 +0000 UTC m=+0.192230735 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 1 03:34:02 localhost podman[89216]: 2026-02-01 08:34:02.846403459 +0000 UTC m=+0.191887103 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true) Feb 1 03:34:02 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:34:02 localhost podman[89215]: 2026-02-01 08:34:02.862933856 +0000 UTC m=+0.213564288 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1) Feb 1 03:34:02 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:34:02 localhost podman[89216]: 2026-02-01 08:34:02.885422586 +0000 UTC m=+0.230906230 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510) Feb 1 03:34:02 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:34:03 localhost systemd[1]: tmp-crun.06rMEu.mount: Deactivated successfully. Feb 1 03:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:34:11 localhost systemd[1]: tmp-crun.TynnAE.mount: Deactivated successfully. Feb 1 03:34:11 localhost podman[89283]: 2026-02-01 08:34:11.734947166 +0000 UTC m=+0.090827306 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5) Feb 1 03:34:11 localhost 
podman[89283]: 2026-02-01 08:34:11.772476637 +0000 UTC m=+0.128356747 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:34:11 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:34:11 localhost podman[89282]: 2026-02-01 08:34:11.793522762 +0000 UTC m=+0.152286030 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, container_name=ceilometer_agent_compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 1 03:34:11 localhost podman[89284]: 2026-02-01 08:34:11.834706644 +0000 UTC m=+0.187852350 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13) Feb 1 03:34:11 localhost podman[89282]: 2026-02-01 08:34:11.854533952 +0000 UTC m=+0.213297210 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, container_name=ceilometer_agent_compute, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:34:11 localhost podman[89284]: 2026-02-01 08:34:11.868417428 +0000 UTC m=+0.221563144 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 1 03:34:11 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:34:11 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:34:12 localhost systemd[1]: tmp-crun.RAGq0I.mount: Deactivated successfully.
Feb 1 03:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:34:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 03:34:17 localhost recover_tripleo_nova_virtqemud[89353]: 61284
Feb 1 03:34:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 03:34:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 03:34:17 localhost systemd[1]: tmp-crun.wS2bq3.mount: Deactivated successfully.
Feb 1 03:34:17 localhost podman[89351]: 2026-02-01 08:34:17.730034407 +0000 UTC m=+0.088763163 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4)
Feb 1 03:34:18 localhost podman[89351]: 2026-02-01 08:34:18.141690551 +0000 UTC m=+0.500419287 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 1 03:34:18 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:34:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:34:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:34:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:34:20 localhost systemd[1]: tmp-crun.mNNsvC.mount: Deactivated successfully.
Feb 1 03:34:20 localhost podman[89376]: 2026-02-01 08:34:20.742185813 +0000 UTC m=+0.091423455 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, version=17.1.13, tcib_managed=true, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=)
Feb 1 03:34:20 localhost podman[89376]: 2026-02-01 08:34:20.768436798 +0000 UTC m=+0.117674410 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 1 03:34:20 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:34:20 localhost podman[89374]: 2026-02-01 08:34:20.791403122 +0000 UTC m=+0.145866815 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr)
Feb 1 03:34:20 localhost podman[89375]: 2026-02-01 08:34:20.840529399 +0000 UTC m=+0.193290329 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=)
Feb 1 03:34:20 localhost podman[89375]: 2026-02-01 08:34:20.887737356 +0000 UTC m=+0.240498216 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, distribution-scope=public, io.openshift.expose-services=)
Feb 1 03:34:20 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:34:21 localhost podman[89374]: 2026-02-01 08:34:21.009463969 +0000 UTC m=+0.363927642 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 1 03:34:21 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:34:25 localhost podman[89548]: 2026-02-01 08:34:25.283369019 +0000 UTC m=+0.094247472 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1764794109, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, name=rhceph, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Feb 1 03:34:25 localhost podman[89548]: 2026-02-01 08:34:25.378797905 +0000 UTC m=+0.189676378 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 03:34:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:34:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:34:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:34:33 localhost podman[89692]: 2026-02-01 08:34:33.746216041 +0000 UTC m=+0.095567542 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5)
Feb 1 03:34:33 localhost podman[89692]: 2026-02-01 08:34:33.778676616 +0000 UTC m=+0.128028087 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container)
Feb 1 03:34:33 localhost systemd[1]: tmp-crun.wRxNkn.mount: Deactivated successfully.
Feb 1 03:34:33 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:34:33 localhost podman[89691]: 2026-02-01 08:34:33.798759302 +0000 UTC m=+0.148520416 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 1 03:34:33 localhost podman[89691]: 2026-02-01 08:34:33.810512632 +0000 UTC m=+0.160273816 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 1 03:34:33 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:34:33 localhost podman[89693]: 2026-02-01 08:34:33.886474082 +0000 UTC m=+0.232384637 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 1 03:34:33 localhost podman[89693]: 2026-02-01 08:34:33.928400068 +0000 UTC m=+0.274310593 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step3, version=17.1.13, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Feb 1 03:34:33 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:34:40 localhost sshd[89756]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:34:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:34:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:34:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:34:42 localhost podman[89760]: 2026-02-01 08:34:42.735978914 +0000 UTC m=+0.084394360 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible)
Feb 1 03:34:42 localhost podman[89760]: 2026-02-01 08:34:42.766761278 +0000 UTC m=+0.115176704 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.)
Feb 1 03:34:42 localhost systemd[1]: tmp-crun.PC1pej.mount: Deactivated successfully.
Feb 1 03:34:42 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:34:42 localhost podman[89759]: 2026-02-01 08:34:42.791236478 +0000 UTC m=+0.140112188 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:34:42 localhost podman[89759]: 2026-02-01 08:34:42.82488282 +0000 UTC m=+0.173758530 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:34:42 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:34:42 localhost podman[89758]: 2026-02-01 08:34:42.845117841 +0000 UTC m=+0.197274421 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:34:42 localhost podman[89758]: 2026-02-01 08:34:42.886339714 +0000 UTC m=+0.238496334 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 1 03:34:42 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:34:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:34:48 localhost systemd[1]: tmp-crun.WQYCTc.mount: Deactivated successfully.
Feb 1 03:34:48 localhost podman[89850]: 2026-02-01 08:34:48.728411046 +0000 UTC m=+0.084809233 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com)
Feb 1 03:34:49 localhost podman[89850]: 2026-02-01 08:34:49.178499738 +0000 UTC m=+0.534897885 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat
OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z) Feb 1 03:34:49 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:34:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:34:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:34:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:34:51 localhost systemd[1]: tmp-crun.RU3d7K.mount: Deactivated successfully. 
Feb 1 03:34:51 localhost podman[89876]: 2026-02-01 08:34:51.73796638 +0000 UTC m=+0.090650991 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Feb 1 03:34:51 localhost systemd[1]: tmp-crun.VbmwdU.mount: Deactivated successfully. 
Feb 1 03:34:51 localhost podman[89877]: 2026-02-01 08:34:51.765045231 +0000 UTC m=+0.114102391 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 1 03:34:51 localhost podman[89878]: 2026-02-01 08:34:51.77874797 +0000 UTC m=+0.126215172 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller) Feb 1 03:34:51 localhost podman[89878]: 2026-02-01 08:34:51.831617341 +0000 UTC m=+0.179084503 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:34:51 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:34:51 localhost podman[89877]: 2026-02-01 08:34:51.887956739 +0000 UTC m=+0.237013899 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:34:51 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:34:51 localhost podman[89876]: 2026-02-01 08:34:51.957494802 +0000 UTC m=+0.310179423 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public) Feb 1 03:34:51 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:35:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:35:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:35:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:35:04 localhost podman[89952]: 2026-02-01 08:35:04.756683789 +0000 UTC m=+0.105330451 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, maintainer=OpenStack TripleO Team, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:35:04 localhost podman[89952]: 2026-02-01 08:35:04.772314208 +0000 UTC m=+0.120960910 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510) Feb 1 03:35:04 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:35:04 localhost systemd[1]: tmp-crun.OkTqY1.mount: Deactivated successfully. 
Feb 1 03:35:04 localhost podman[89954]: 2026-02-01 08:35:04.86336444 +0000 UTC m=+0.205502923 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:35:04 localhost podman[89953]: 2026-02-01 08:35:04.906749551 +0000 UTC m=+0.253329370 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step5, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 1 03:35:04 localhost podman[89954]: 2026-02-01 08:35:04.929589661 +0000 UTC m=+0.271728074 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:35:04 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:35:04 localhost podman[89953]: 2026-02-01 08:35:04.98530886 +0000 UTC m=+0.331888669 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, version=17.1.13, build-date=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:35:05 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:35:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:35:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:35:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:35:13 localhost systemd[1]: tmp-crun.qI4wRw.mount: Deactivated successfully. 
Feb 1 03:35:13 localhost podman[90015]: 2026-02-01 08:35:13.777587317 +0000 UTC m=+0.136338472 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:35:13 localhost podman[90014]: 2026-02-01 08:35:13.740041416 +0000 UTC m=+0.097713958 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-cron-container) Feb 1 03:35:13 localhost podman[90013]: 2026-02-01 08:35:13.697077048 +0000 UTC m=+0.062171608 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 1 03:35:13 localhost podman[90015]: 2026-02-01 08:35:13.808141094 +0000 UTC m=+0.166892229 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1766032510) 
Feb 1 03:35:13 localhost podman[90014]: 2026-02-01 08:35:13.819575045 +0000 UTC m=+0.177247567 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:35:13 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:35:13 localhost podman[90013]: 2026-02-01 08:35:13.827634492 +0000 UTC m=+0.192729102 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, release=1766032510, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 1 03:35:13 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:35:13 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:35:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:35:19 localhost systemd[1]: tmp-crun.d6lcEv.mount: Deactivated successfully. 
Feb 1 03:35:19 localhost podman[90088]: 2026-02-01 08:35:19.738827762 +0000 UTC m=+0.094728096 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public)
Feb 1 03:35:20 localhost podman[90088]: 2026-02-01 08:35:20.120429675 +0000 UTC m=+0.476330009 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true)
Feb 1 03:35:20 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:35:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:35:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:35:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:35:22 localhost podman[90118]: 2026-02-01 08:35:22.73382997 +0000 UTC m=+0.081498640 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=)
Feb 1 03:35:22 localhost podman[90118]: 2026-02-01 08:35:22.78761605 +0000 UTC m=+0.135284760 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, version=17.1.13, container_name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 1 03:35:22 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully.
Feb 1 03:35:22 localhost podman[90112]: 2026-02-01 08:35:22.837341215 +0000 UTC m=+0.188572744 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 1 03:35:22 localhost podman[90112]: 2026-02-01 08:35:22.882409087 +0000 UTC m=+0.233640656 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1)
Feb 1 03:35:22 localhost podman[90111]: 2026-02-01 08:35:22.789107886 +0000 UTC m=+0.145147853 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1)
Feb 1 03:35:22 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully.
Feb 1 03:35:23 localhost podman[90111]: 2026-02-01 08:35:23.013910149 +0000 UTC m=+0.369950086 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, tcib_managed=true)
Feb 1 03:35:23 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:35:23 localhost systemd[1]: tmp-crun.NQyfpJ.mount: Deactivated successfully.
Feb 1 03:35:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:35:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:35:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:35:35 localhost systemd[1]: tmp-crun.jqNBgH.mount: Deactivated successfully.
Feb 1 03:35:35 localhost systemd[1]: tmp-crun.QPJdKy.mount: Deactivated successfully.
Feb 1 03:35:35 localhost podman[90269]: 2026-02-01 08:35:35.8030688 +0000 UTC m=+0.146317799 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc.) 
Feb 1 03:35:35 localhost podman[90269]: 2026-02-01 08:35:35.842394546 +0000 UTC m=+0.185643545 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 1 03:35:35 localhost podman[90267]: 2026-02-01 08:35:35.848859624 +0000 UTC m=+0.197442995 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, container_name=collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 1 03:35:35 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:35:35 localhost podman[90267]: 2026-02-01 08:35:35.858968805 +0000 UTC m=+0.207552176 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 1 03:35:35 localhost podman[90268]: 2026-02-01 08:35:35.768430937 +0000 UTC m=+0.115500562 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, distribution-scope=public)
Feb 1 03:35:35 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:35:35 localhost podman[90268]: 2026-02-01 08:35:35.905313385 +0000 UTC m=+0.252382980 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64)
Feb 1 03:35:35 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:35:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:35:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:35:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:35:44 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 03:35:44 localhost recover_tripleo_nova_virtqemud[90350]: 61284
Feb 1 03:35:44 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 03:35:44 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 03:35:44 localhost podman[90333]: 2026-02-01 08:35:44.720423161 +0000 UTC m=+0.079493759 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 1 03:35:44 localhost podman[90332]: 2026-02-01 08:35:44.758089056 +0000 UTC m=+0.118474194 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 1 03:35:44 localhost podman[90331]: 2026-02-01 08:35:44.776030917 +0000 UTC m=+0.132688621 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Feb 1 03:35:44 localhost podman[90332]: 2026-02-01 08:35:44.775331965 +0000 UTC m=+0.135717103 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5)
Feb 1 03:35:44 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:35:44 localhost podman[90333]: 2026-02-01 08:35:44.825897035 +0000 UTC m=+0.184967653 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 1 03:35:44 localhost podman[90331]: 2026-02-01 08:35:44.826449863 +0000 UTC m=+0.183107567 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team)
Feb 1 03:35:44 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:35:44 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:35:45 localhost sshd[90408]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:35:45 localhost systemd[1]: tmp-crun.if68yK.mount: Deactivated successfully.
Feb 1 03:35:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:35:50 localhost systemd[1]: tmp-crun.RhzVjV.mount: Deactivated successfully.
Feb 1 03:35:50 localhost podman[90410]: 2026-02-01 08:35:50.724924096 +0000 UTC m=+0.083450231 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:35:51 localhost podman[90410]: 2026-02-01 08:35:51.034428708 +0000 UTC m=+0.392954893 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, 
maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 1 03:35:51 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:35:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:35:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:35:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:35:53 localhost systemd[1]: tmp-crun.ghyi1x.mount: Deactivated successfully. 
Feb 1 03:35:53 localhost podman[90435]: 2026-02-01 08:35:53.713496418 +0000 UTC m=+0.068689797 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc.) 
Feb 1 03:35:53 localhost podman[90433]: 2026-02-01 08:35:53.745747398 +0000 UTC m=+0.109894442 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:35:53 localhost podman[90434]: 2026-02-01 08:35:53.751243306 +0000 UTC m=+0.111708347 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1) Feb 1 03:35:53 localhost podman[90435]: 2026-02-01 08:35:53.759652794 +0000 UTC m=+0.114846163 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, version=17.1.13) Feb 1 03:35:53 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:35:53 localhost podman[90434]: 2026-02-01 08:35:53.799395123 +0000 UTC m=+0.159860114 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1) Feb 1 03:35:53 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:35:53 localhost podman[90433]: 2026-02-01 08:35:53.917519345 +0000 UTC m=+0.281666459 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510) Feb 1 03:35:53 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
Feb 1 03:35:54 localhost systemd[1]: tmp-crun.wmHvNn.mount: Deactivated successfully. Feb 1 03:36:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:36:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:36:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:36:06 localhost podman[90509]: 2026-02-01 08:36:06.735198028 +0000 UTC m=+0.089121133 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, io.openshift.expose-services=, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:36:06 localhost podman[90509]: 2026-02-01 08:36:06.7678633 +0000 UTC m=+0.121786405 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, batch=17.1_20260112.1) Feb 1 03:36:06 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:36:06 localhost podman[90508]: 2026-02-01 08:36:06.813939553 +0000 UTC m=+0.167714674 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:36:06 localhost podman[90507]: 2026-02-01 08:36:06.7682036 +0000 UTC m=+0.121842937 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) 
Feb 1 03:36:06 localhost podman[90507]: 2026-02-01 08:36:06.850358049 +0000 UTC m=+0.203997396 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd) Feb 1 03:36:06 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:36:06 localhost podman[90508]: 2026-02-01 08:36:06.866325949 +0000 UTC m=+0.220101110 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, container_name=nova_compute, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:36:06 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:36:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:36:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:36:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:36:15 localhost systemd[1]: tmp-crun.Ta6WkG.mount: Deactivated successfully. Feb 1 03:36:15 localhost podman[90571]: 2026-02-01 08:36:15.708165574 +0000 UTC m=+0.070990338 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64) Feb 1 03:36:15 localhost podman[90572]: 2026-02-01 08:36:15.733093589 +0000 UTC m=+0.088629689 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, 
name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:36:15 localhost podman[90573]: 2026-02-01 08:36:15.787696294 +0000 UTC m=+0.141495001 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 1 03:36:15 localhost podman[90571]: 2026-02-01 08:36:15.811961957 +0000 UTC m=+0.174786761 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container) Feb 1 03:36:15 localhost podman[90572]: 2026-02-01 08:36:15.820066956 +0000 UTC m=+0.175603076 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) Feb 1 03:36:15 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:36:15 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:36:15 localhost podman[90573]: 2026-02-01 08:36:15.836351915 +0000 UTC m=+0.190150622 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z) Feb 1 03:36:15 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:36:16 localhost systemd[1]: tmp-crun.QCd9IR.mount: Deactivated successfully. Feb 1 03:36:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:36:21 localhost systemd[1]: tmp-crun.EIiJ1Q.mount: Deactivated successfully. 
Feb 1 03:36:21 localhost podman[90642]: 2026-02-01 08:36:21.723181038 +0000 UTC m=+0.083079519 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com) Feb 1 03:36:22 localhost podman[90642]: 2026-02-01 08:36:22.072484969 +0000 UTC m=+0.432383480 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:36:22 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:36:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:36:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:36:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:36:24 localhost podman[90665]: 2026-02-01 08:36:24.729335719 +0000 UTC m=+0.082400848 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z) Feb 1 03:36:24 localhost podman[90667]: 2026-02-01 08:36:24.778208128 +0000 UTC m=+0.128507183 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 1 03:36:24 localhost systemd[1]: tmp-crun.ARiWqv.mount: Deactivated successfully. Feb 1 03:36:24 localhost podman[90666]: 2026-02-01 08:36:24.837292079 +0000 UTC m=+0.187106439 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:36:24 localhost podman[90667]: 2026-02-01 08:36:24.853756794 +0000 UTC m=+0.204055909 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:36:24 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:36:24 localhost podman[90666]: 2026-02-01 08:36:24.914432086 +0000 UTC m=+0.264246436 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1) Feb 1 03:36:24 localhost podman[90665]: 2026-02-01 08:36:24.924394931 +0000 UTC m=+0.277460040 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 1 03:36:24 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:36:24 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:36:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:36:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:36:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:36:37 localhost podman[90817]: 2026-02-01 08:36:37.711659512 +0000 UTC m=+0.072042870 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible) Feb 1 03:36:37 localhost systemd[1]: tmp-crun.Hl57xx.mount: Deactivated successfully. 
Feb 1 03:36:37 localhost podman[90817]: 2026-02-01 08:36:37.727433575 +0000 UTC m=+0.087816933 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, distribution-scope=public) Feb 1 03:36:37 localhost podman[90818]: 2026-02-01 08:36:37.735036639 +0000 UTC m=+0.088742812 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:36:37 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:36:37 localhost podman[90818]: 2026-02-01 08:36:37.784480465 +0000 UTC m=+0.138186628 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, vendor=Red Hat, Inc.) Feb 1 03:36:37 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
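[annotation] Each of these records carries the container's full desired state in its config_data label: image, network/PID/IPC namespace mode, restart policy, ulimits, user, and bind mounts, plus orchestration-only keys (start_order, depends_on) that tripleo_ansible consumes itself. As a rough, hypothetical sketch of how such a dict lines up with standard podman CLI flags (illustration only; the actual tripleo_ansible translation layer is more involved and is not shown in this log), in Python:

# Hypothetical mapping from a tripleo_ansible config_data dict (as logged
# above) to an equivalent `podman run` argument vector. The flag names are
# standard podman options; the mapping itself is illustrative, not the
# actual TripleO implementation.
import shlex

def podman_args(name: str, cfg: dict) -> list:
    args = ["podman", "run", "--detach", "--name", name]
    for key, val in cfg.get("environment", {}).items():
        args += ["--env", f"{key}={val}"]
    if "healthcheck" in cfg:
        args += ["--health-cmd", cfg["healthcheck"]["test"]]
    if cfg.get("net"):
        args += ["--network", cfg["net"]]
    if cfg.get("pid"):
        args += ["--pid", cfg["pid"]]
    if cfg.get("ipc"):
        args += ["--ipc", cfg["ipc"]]
    if cfg.get("user"):
        args += ["--user", cfg["user"]]
    if cfg.get("privileged"):
        args.append("--privileged")
    if cfg.get("restart"):
        args += ["--restart", cfg["restart"]]
    if cfg.get("memory"):
        args += ["--memory", cfg["memory"]]
    for ul in cfg.get("ulimit", []):
        args += ["--ulimit", ul]
    for cap in cfg.get("cap_add", []):
        args += ["--cap-add", cap]
    for vol in cfg.get("volumes", []):
        args += ["--volume", vol]
    # start_order / depends_on have no podman equivalent: tripleo_ansible
    # sequences container startup itself.
    args.append(cfg["image"])
    return args

# Trimmed version of the nova_compute record above:
cfg = {
    "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"},
    "healthcheck": {"test": "/openstack/healthcheck 5672"},
    "image": "registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1",
    "ipc": "host", "net": "host", "privileged": True, "restart": "always",
    "ulimit": ["nofile=131072", "memlock=67108864"], "user": "nova",
    "volumes": ["/var/lib/nova:/var/lib/nova:shared"],
}
print(shlex.join(podman_args("nova_compute", cfg)))

Running the example prints a podman run command equivalent in spirit to the nova_compute record above, with the volume list trimmed for brevity.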
Feb 1 03:36:37 localhost podman[90819]: 2026-02-01 08:36:37.86975495 +0000 UTC m=+0.222165874 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=iscsid, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.component=openstack-iscsid-container) Feb 1 03:36:37 localhost podman[90819]: 2026-02-01 08:36:37.903278338 +0000 UTC m=+0.255689302 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid) Feb 1 03:36:37 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:36:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:36:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:36:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:36:46 localhost systemd[1]: tmp-crun.rVl4MF.mount: Deactivated successfully. 
Feb 1 03:36:46 localhost podman[90882]: 2026-02-01 08:36:46.789220706 +0000 UTC m=+0.138110286 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:36:46 localhost podman[90882]: 2026-02-01 08:36:46.799390639 +0000 UTC m=+0.148280259 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, container_name=logrotate_crond, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible) Feb 1 03:36:46 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:36:46 localhost podman[90883]: 2026-02-01 08:36:46.844529123 +0000 UTC m=+0.188805361 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, 
batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:36:46 localhost podman[90881]: 2026-02-01 08:36:46.754145851 +0000 UTC m=+0.105842307 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5) Feb 1 03:36:46 localhost podman[90883]: 2026-02-01 08:36:46.872379957 +0000 UTC m=+0.216656195 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public) Feb 1 03:36:46 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:36:46 localhost podman[90881]: 2026-02-01 08:36:46.882742335 +0000 UTC m=+0.234438771 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, version=17.1.13) Feb 1 03:36:46 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:36:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:36:52 localhost podman[90956]: 2026-02-01 08:36:52.700288793 +0000 UTC m=+0.065158149 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:36:53 localhost podman[90956]: 2026-02-01 08:36:53.082284518 +0000 UTC m=+0.447153834 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:36:53 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:36:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:36:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:36:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:36:55 localhost systemd[1]: Starting dnf makecache... Feb 1 03:36:55 localhost systemd[1]: tmp-crun.JY1cB1.mount: Deactivated successfully. 
Feb 1 03:36:55 localhost podman[90980]: 2026-02-01 08:36:55.735099063 +0000 UTC m=+0.086160203 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc.) 
Feb 1 03:36:55 localhost podman[90981]: 2026-02-01 08:36:55.801418086 +0000 UTC m=+0.148164124 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:36:55 localhost podman[90982]: 2026-02-01 08:36:55.847329805 +0000 UTC m=+0.188907385 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 1 03:36:55 localhost podman[90981]: 2026-02-01 08:36:55.85143531 +0000 UTC m=+0.198181298 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git) Feb 1 03:36:55 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:36:55 localhost podman[90982]: 2026-02-01 08:36:55.874353193 +0000 UTC m=+0.215930773 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 1 03:36:55 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:36:55 localhost dnf[90986]: Updating Subscription Management repositories. Feb 1 03:36:55 localhost podman[90980]: 2026-02-01 08:36:55.991278609 +0000 UTC m=+0.342339719 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:36:56 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:36:56 localhost systemd[1]: tmp-crun.j3DpCS.mount: Deactivated successfully. Feb 1 03:36:57 localhost dnf[90986]: Metadata cache refreshed recently. Feb 1 03:36:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:36:57 localhost recover_tripleo_nova_virtqemud[91055]: 61284 Feb 1 03:36:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 1 03:36:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:36:58 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Feb 1 03:36:58 localhost systemd[1]: Finished dnf makecache. Feb 1 03:36:58 localhost systemd[1]: dnf-makecache.service: Consumed 2.492s CPU time. Feb 1 03:37:00 localhost sshd[91056]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:37:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:37:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:37:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:37:08 localhost systemd[1]: tmp-crun.vigSDe.mount: Deactivated successfully. Feb 1 03:37:08 localhost podman[91060]: 2026-02-01 08:37:08.764623894 +0000 UTC m=+0.104271439 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid) Feb 1 03:37:08 localhost podman[91059]: 2026-02-01 08:37:08.738783531 +0000 UTC m=+0.082885682 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, url=https://www.redhat.com, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 1 03:37:08 localhost podman[91058]: 2026-02-01 08:37:08.744753625 +0000 UTC m=+0.087286729 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:37:08 localhost podman[91059]: 2026-02-01 08:37:08.821317363 +0000 UTC m=+0.165419504 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, version=17.1.13, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Feb 1 03:37:08 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:37:08 localhost podman[91058]: 2026-02-01 08:37:08.82971755 +0000 UTC m=+0.172250654 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, container_name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 1 03:37:08 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:37:08 localhost podman[91060]: 2026-02-01 08:37:08.851111527 +0000 UTC m=+0.190759072 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1766032510, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:37:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:37:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:37:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:37:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:37:17 localhost systemd[1]: tmp-crun.gTO30G.mount: Deactivated successfully. Feb 1 03:37:17 localhost systemd[1]: tmp-crun.xxggN8.mount: Deactivated successfully. 
Feb 1 03:37:17 localhost podman[91122]: 2026-02-01 08:37:17.788017121 +0000 UTC m=+0.141556353 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:37:17 localhost podman[91121]: 2026-02-01 08:37:17.745248398 +0000 UTC m=+0.101717990 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510) Feb 1 03:37:17 localhost podman[91122]: 2026-02-01 08:37:17.82615549 +0000 UTC m=+0.179694702 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, 
batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, release=1766032510) Feb 1 03:37:17 localhost podman[91121]: 2026-02-01 08:37:17.826565942 +0000 UTC m=+0.183035524 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, batch=17.1_20260112.1) Feb 1 03:37:17 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:37:17 localhost podman[91123]: 2026-02-01 08:37:17.842433479 +0000 UTC m=+0.193120544 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:37:17 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:37:17 localhost podman[91123]: 2026-02-01 08:37:17.901193911 +0000 UTC m=+0.251881006 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:37:17 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:37:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:37:23 localhost systemd[1]: tmp-crun.ss2ASl.mount: Deactivated successfully. 
Feb 1 03:37:23 localhost podman[91194]: 2026-02-01 08:37:23.760627863 +0000 UTC m=+0.123144868 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.41.5) Feb 1 03:37:24 localhost podman[91194]: 2026-02-01 08:37:24.157549335 +0000 UTC m=+0.520066290 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:37:24 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:37:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:37:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:37:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:37:26 localhost systemd[1]: tmp-crun.ywktH4.mount: Deactivated successfully. 
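The pattern repeating through this window is podman's healthcheck cycle as driven by systemd: a transient "/usr/bin/podman healthcheck run <container-id>" unit starts, podman logs a health_status event (healthy throughout this excerpt) followed by an exec_died event when the healthcheck exec session exits, and the transient unit deactivates. A minimal Python sketch for pulling these events out of a syslog file like this one follows; the file path is an assumption, and the regex relies on health_status= immediately following name= in the attribute list, which holds in every health_status event shown above (in the real log file each entry is a single line, unlike the wrapped rendering here):

    import re

    # Matches the podman events in this log, e.g.
    #   podman[91218]: 2026-02-01 08:37:26.746329226 +0000 UTC m=+0.100311708
    #   container health_status 5cdaa48... (image=..., name=metrics_qdr, health_status=healthy, ...)
    EVENT_RE = re.compile(
        r"podman\[\d+\]: (?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
        r"\+\d{4} UTC m=\+\S+ container (?P<event>health_status|exec_died) "
        r"(?P<cid>[0-9a-f]{64}) \(image=[^,]+, name=(?P<name>[^,)]+)"
        r"(?:, health_status=(?P<status>[^,)]+))?"
    )

    def parse_events(path):
        """Yield (UTC timestamp, container name, event, health status or None)."""
        with open(path) as fh:
            for line in fh:
                m = EVENT_RE.search(line)
                if m:
                    yield m["ts"], m["name"], m["event"], m["status"]

    for ts, name, event, status in parse_events("/var/log/messages"):  # path is an assumption
        print(f"{ts}  {name:24}  {event:13}  {status or '-'}")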
Feb 1 03:37:26 localhost podman[91218]: 2026-02-01 08:37:26.746329226 +0000 UTC m=+0.100311708 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:10:14Z, architecture=x86_64, batch=17.1_20260112.1) Feb 1 03:37:26 localhost podman[91219]: 2026-02-01 08:37:26.842563337 +0000 UTC m=+0.194502067 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, distribution-scope=public, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, container_name=ovn_metadata_agent, version=17.1.13) Feb 1 03:37:26 localhost podman[91220]: 2026-02-01 08:37:26.904856067 +0000 UTC m=+0.254175646 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5) Feb 1 03:37:26 localhost podman[91219]: 2026-02-01 08:37:26.924432467 +0000 UTC m=+0.276371197 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:37:26 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:37:26 localhost podman[91218]: 2026-02-01 08:37:26.944336368 +0000 UTC m=+0.298318840 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:37:26 localhost podman[91220]: 2026-02-01 08:37:26.960399291 +0000 UTC m=+0.309718880 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, container_name=ovn_controller, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:37:26 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:37:26 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:37:27 localhost systemd[1]: tmp-crun.fOeAB8.mount: Deactivated successfully. Feb 1 03:37:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:37:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:37:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
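Every event above carries the container's full TripleO label set, including config_data: a Python-literal dict (note the single quotes, so it is not valid JSON) holding the Kolla runtime config — image, net/pid namespaces, restart policy, the healthcheck test command (plain /openstack/healthcheck for most services; /openstack/healthcheck 6642 for ovn_controller, 6642 presumably being the OVN southbound DB port), and the volume list. Rather than recovering this from the journal, the same data can be read off the running container. A minimal sketch, assuming podman is available on the host and a container named as in the log exists:

    import ast, json, subprocess

    # Read the TripleO labels from the live container instead of the journal.
    out = subprocess.run(
        ["podman", "inspect", "nova_migration_target"],  # container_name from the log
        capture_output=True, text=True, check=True,
    ).stdout
    labels = json.loads(out)[0]["Config"]["Labels"]
    # config_data is a Python-literal dict (single quotes), not JSON,
    # so ast.literal_eval rather than json.loads.
    config = ast.literal_eval(labels["config_data"])
    print(config["image"])
    print(config["healthcheck"]["test"])
    for vol in config["volumes"]:
        print("  ", vol)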
Feb 1 03:37:39 localhost podman[91370]: 2026-02-01 08:37:39.744109662 +0000 UTC m=+0.098884253 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:10:15Z, container_name=collectd, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Feb 1 03:37:39 localhost podman[91371]: 2026-02-01 08:37:39.79132171 +0000 UTC m=+0.144370648 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:37:39 localhost systemd[1]: tmp-crun.5zj3r2.mount: Deactivated successfully. 
Feb 1 03:37:39 localhost podman[91372]: 2026-02-01 08:37:39.850855266 +0000 UTC m=+0.199375055 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:37:39 localhost podman[91371]: 2026-02-01 08:37:39.856467558 +0000 UTC m=+0.209516456 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true) Feb 1 03:37:39 localhost podman[91370]: 2026-02-01 08:37:39.863185424 +0000 UTC m=+0.217960005 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
collectd, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) Feb 1 03:37:39 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:37:39 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:37:39 localhost podman[91372]: 2026-02-01 08:37:39.89043865 +0000 UTC m=+0.238958459 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, version=17.1.13) Feb 1 03:37:39 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:37:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:37:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:37:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:37:48 localhost systemd[1]: tmp-crun.kgMQyd.mount: Deactivated successfully. 
Feb 1 03:37:48 localhost podman[91435]: 2026-02-01 08:37:48.797798193 +0000 UTC m=+0.141287794 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1) Feb 1 03:37:48 localhost podman[91434]: 2026-02-01 08:37:48.846552299 +0000 UTC m=+0.195409544 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible) Feb 1 03:37:48 localhost podman[91433]: 2026-02-01 08:37:48.768314329 +0000 UTC m=+0.119898118 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 1 03:37:48 localhost podman[91434]: 2026-02-01 08:37:48.877541349 +0000 UTC m=+0.226398624 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, container_name=logrotate_crond, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 1 
03:37:48 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:37:48 localhost podman[91433]: 2026-02-01 08:37:48.898568223 +0000 UTC m=+0.250152022 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team) Feb 1 03:37:48 localhost podman[91435]: 2026-02-01 08:37:48.923496568 +0000 UTC m=+0.266986129 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, release=1766032510, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13) Feb 1 03:37:48 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:37:48 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:37:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:37:54 localhost systemd[1]: tmp-crun.Qdxp7g.mount: Deactivated successfully. 
Feb 1 03:37:54 localhost podman[91508]: 2026-02-01 08:37:54.733691859 +0000 UTC m=+0.093404715 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 1 03:37:55 localhost podman[91508]: 2026-02-01 08:37:55.098568259 +0000 UTC m=+0.458281095 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:37:55 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:37:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:37:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:37:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
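[Each of these events carries the container's full TripleO configuration in its `config_data=` label. The payload is a Python-style literal (single quotes, True/False), not JSON, so `ast.literal_eval` recovers it cleanly. A small sketch, assuming you have isolated one event line from this log; the embedded sample is abbreviated from the nova_migration_target entry above.

    import ast

    def extract_config_data(event: str) -> dict:
        # Take the balanced {...} after "config_data=" by brace counting;
        # none of the quoted values in these labels contain braces.
        start = event.index("config_data=") + len("config_data=")
        depth = 0
        for i, ch in enumerate(event[start:], start):
            depth += (ch == "{") - (ch == "}")
            if depth == 0:
                return ast.literal_eval(event[start:i + 1])
        raise ValueError("unbalanced config_data payload")

    # Abbreviated from the nova_migration_target event above.
    sample = ("container exec_died d444... (name=nova_migration_target, "
              "config_data={'net': 'host', 'privileged': True, 'healthcheck': "
              "{'test': '/openstack/healthcheck'}, 'user': 'root'}, version=17.1.13)")
    cfg = extract_config_data(sample)
    print(cfg["healthcheck"]["test"], cfg["privileged"])
]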
Feb 1 03:37:57 localhost podman[91531]: 2026-02-01 08:37:57.720570678 +0000 UTC m=+0.085269385 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, release=1766032510, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc.) Feb 1 03:37:57 localhost systemd[1]: tmp-crun.3srqqS.mount: Deactivated successfully. 
Feb 1 03:37:57 localhost podman[91532]: 2026-02-01 08:37:57.795180357 +0000 UTC m=+0.153401916 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:37:57 localhost podman[91533]: 2026-02-01 08:37:57.850500254 +0000 UTC m=+0.205550756 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=ovn_controller) Feb 1 03:37:57 localhost podman[91532]: 2026-02-01 08:37:57.877286125 +0000 UTC m=+0.235507644 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:37:57 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:37:57 localhost podman[91533]: 2026-02-01 08:37:57.908352137 +0000 UTC m=+0.263402599 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, release=1766032510, vcs-type=git, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true) Feb 1 03:37:57 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. Feb 1 03:37:57 localhost podman[91531]: 2026-02-01 08:37:57.957414752 +0000 UTC m=+0.322113439 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510) Feb 1 03:37:57 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:37:58 localhost systemd[1]: tmp-crun.HyorEa.mount: Deactivated successfully. Feb 1 03:38:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:38:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:38:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
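[Two of the healthcheck tests in these entries take a port argument: ovn_controller runs '/openstack/healthcheck 6642' (the OVN southbound database port) and nova_compute, further down, runs '/openstack/healthcheck 5672' (AMQP). The TripleO healthcheck script itself is not reproduced here; as a rough illustration of what a port-argument check amounts to, this is a bare TCP liveness probe.

    import socket

    def tcp_alive(port: int, host: str = "127.0.0.1", timeout: float = 5.0) -> bool:
        # A port-style healthcheck in miniature: succeed if a TCP handshake
        # against the service's listener completes within the timeout.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(tcp_alive(6642))  # port taken from the ovn_controller test above
]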
Feb 1 03:38:10 localhost systemd[1]: tmp-crun.bwH7zu.mount: Deactivated successfully. Feb 1 03:38:10 localhost podman[91607]: 2026-02-01 08:38:10.75858537 +0000 UTC m=+0.078334993 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:38:10 localhost podman[91606]: 2026-02-01 08:38:10.823102248 +0000 UTC m=+0.141674115 container health_status 
02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13) Feb 1 03:38:10 localhost podman[91606]: 2026-02-01 08:38:10.832694703 +0000 UTC m=+0.151266590 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3) Feb 1 03:38:10 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
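[At this cadence every container emits a `health_status` event (carrying `health_status=healthy` on success) followed by an `exec_died` event per run, so the log itself is a usable health history. A sketch that tallies healthy checks per container; the input filename is an assumption, and the regex is keyed to the exact event shape seen in these entries, where the healthy flag directly follows the container name.

    import re
    from collections import Counter

    # "container <kind> <64-hex id> (image=..., name=<name>[, health_status=healthy...]"
    EVENT = re.compile(r"container (health_status|exec_died) [0-9a-f]{64} "
                       r"\(image=[^,]+, name=([^,)]+)(, health_status=healthy)?")

    runs, healthy = Counter(), Counter()
    with open("messages") as fh:    # filename assumed; point at a copy of this log
        for line in fh:
            for kind, name, ok in EVENT.findall(line):
                if kind == "exec_died":
                    runs[name] += 1      # one exec_died per completed check
                elif ok:
                    healthy[name] += 1   # health_status event flagged healthy

    for name, total in sorted(runs.items()):
        print(f"{name}: {healthy[name]}/{total} checks healthy")
]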
Feb 1 03:38:10 localhost podman[91608]: 2026-02-01 08:38:10.796796422 +0000 UTC m=+0.112048017 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 1 03:38:10 localhost podman[91608]: 2026-02-01 08:38:10.880340024 +0000 UTC m=+0.195591619 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:38:10 localhost podman[91607]: 2026-02-01 08:38:10.889623469 +0000 UTC m=+0.209373052 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=) Feb 1 03:38:10 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:38:10 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:38:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
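[The labels also encode TripleO's startup ordering, as I read these entries: `config_id=tripleo_stepN` is the deployment step, and `start_order` sequences containers within a step (metrics_qdr at step 1, iscsid at step 3 with start_order 2, ovn_controller at step 4 with start_order 1, nova_compute at step 5 with start_order 3). A toy two-level sort over values copied from the entries above.

    # config_id / start_order values copied from the labels in this log.
    containers = [
        {"name": "nova_compute",   "config_id": "tripleo_step5", "start_order": 3},
        {"name": "iscsid",         "config_id": "tripleo_step3", "start_order": 2},
        {"name": "ovn_controller", "config_id": "tripleo_step4", "start_order": 1},
        {"name": "metrics_qdr",    "config_id": "tripleo_step1", "start_order": 1},
    ]

    def step(c: dict) -> int:
        return int(c["config_id"].rsplit("step", 1)[1])   # "tripleo_step4" -> 4

    # Deployment step first, then start_order within the step.
    for c in sorted(containers, key=lambda c: (step(c), c["start_order"])):
        print(c["config_id"], c["start_order"], c["name"])
]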
Feb 1 03:38:19 localhost podman[91670]: 2026-02-01 08:38:19.726091548 +0000 UTC m=+0.083674706 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute) Feb 1 03:38:19 localhost systemd[1]: tmp-crun.2kLnTJ.mount: Deactivated successfully. 
Feb 1 03:38:19 localhost podman[91674]: 2026-02-01 08:38:19.751421495 +0000 UTC m=+0.098050868 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 1 03:38:19 localhost podman[91674]: 2026-02-01 08:38:19.781462526 +0000 UTC m=+0.128091919 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:38:19 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
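[Per container, the checks recur on roughly a 30-second timer: nova_migration_target reports at 08:37:54.7 and again at 08:38:25.7, and the ceilometer agents at 08:37:48 and 08:38:19. A sketch that computes the gap between consecutive health_status events for each container from the podman timestamps; the filename is again an assumption.

    import re
    from datetime import datetime

    # "<date time>.<frac> +0000 UTC m=+<monotonic> container health_status <id> (image=..., name=<name>"
    CHECK = re.compile(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\.\d+ \+0000 UTC "
                       r"m=\+[\d.]+ container health_status [0-9a-f]{64} "
                       r"\(image=[^,]+, name=([^,)]+)")

    last = {}
    with open("messages") as fh:    # filename assumed
        for line in fh:
            for ts, name in CHECK.findall(line):
                t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
                if name in last:
                    gap = (t - last[name]).total_seconds()
                    print(f"{name}: {gap:.0f}s since previous check")
                last[name] = t
]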
Feb 1 03:38:19 localhost podman[91670]: 2026-02-01 08:38:19.83703095 +0000 UTC m=+0.194614138 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:38:19 localhost podman[91671]: 2026-02-01 08:38:19.786944854 +0000 UTC m=+0.137687223 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public) Feb 1 03:38:19 localhost podman[91671]: 2026-02-01 08:38:19.866627289 +0000 UTC m=+0.217369658 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., 
io.openshift.expose-services=, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Feb 1 03:38:19 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:38:19 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:38:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:38:25 localhost podman[91743]: 2026-02-01 08:38:25.71626242 +0000 UTC m=+0.077554689 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z) Feb 1 03:38:26 localhost podman[91743]: 2026-02-01 08:38:26.083365868 +0000 UTC m=+0.444658137 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64) Feb 1 03:38:26 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:38:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:38:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. 
Feb 1 03:38:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:38:28 localhost systemd[1]: tmp-crun.WHttPA.mount: Deactivated successfully. Feb 1 03:38:28 localhost podman[91766]: 2026-02-01 08:38:28.725202735 +0000 UTC m=+0.087543535 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:38:28 localhost podman[91767]: 2026-02-01 08:38:28.772825716 +0000 UTC m=+0.130860345 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:38:28 localhost podman[91768]: 2026-02-01 08:38:28.81765398 +0000 UTC m=+0.173446030 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.openshift.expose-services=) Feb 1 03:38:28 localhost podman[91767]: 2026-02-01 08:38:28.854722117 +0000 UTC m=+0.212756796 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:38:28 localhost podman[91768]: 2026-02-01 08:38:28.863241628 +0000 UTC m=+0.219033668 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:38:28 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. Feb 1 03:38:28 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Deactivated successfully. 
Feb 1 03:38:28 localhost podman[91766]: 2026-02-01 08:38:28.906335239 +0000 UTC m=+0.268676029 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 1 03:38:28 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:38:29 localhost systemd[1]: tmp-crun.HaKoMV.mount: Deactivated successfully. Feb 1 03:38:33 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:38:33 localhost recover_tripleo_nova_virtqemud[91920]: 61284 Feb 1 03:38:33 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:38:33 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:38:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:38:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
Feb 1 03:38:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:38:41 localhost podman[91922]: 2026-02-01 08:38:41.748635211 +0000 UTC m=+0.096487410 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:38:41 localhost podman[91922]: 2026-02-01 08:38:41.784377416 +0000 UTC m=+0.132229655 
container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Feb 1 03:38:41 localhost systemd[1]: tmp-crun.F6bdzL.mount: Deactivated successfully. Feb 1 03:38:41 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:38:41 localhost podman[91923]: 2026-02-01 08:38:41.848957087 +0000 UTC m=+0.196027281 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:38:41 localhost podman[91923]: 2026-02-01 08:38:41.862327236 +0000 UTC m=+0.209397430 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=iscsid, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc.) Feb 1 03:38:41 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:38:41 localhost podman[91921]: 2026-02-01 08:38:41.813279292 +0000 UTC m=+0.161827622 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, container_name=collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13) Feb 1 03:38:41 localhost podman[91921]: 2026-02-01 08:38:41.9495553 +0000 UTC m=+0.298103590 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:38:41 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:38:42 localhost systemd[1]: tmp-crun.y0h13L.mount: Deactivated successfully. Feb 1 03:38:43 localhost sshd[91987]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:38:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:38:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:38:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:38:50 localhost systemd[1]: tmp-crun.4FjT54.mount: Deactivated successfully. 
Feb 1 03:38:50 localhost podman[91990]: 2026-02-01 08:38:50.739910525 +0000 UTC m=+0.096879090 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z) Feb 1 03:38:50 localhost podman[91990]: 2026-02-01 08:38:50.774449975 +0000 UTC m=+0.131418540 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible) Feb 1 03:38:50 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:38:50 localhost podman[91992]: 2026-02-01 08:38:50.795454028 +0000 UTC m=+0.146935805 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public) Feb 1 03:38:50 localhost podman[91992]: 2026-02-01 08:38:50.855347164 +0000 UTC m=+0.206828971 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true) Feb 1 03:38:50 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:38:50 localhost podman[91991]: 2026-02-01 08:38:50.872377007 +0000 UTC m=+0.226729212 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:38:50 localhost podman[91991]: 2026-02-01 08:38:50.881299401 +0000 UTC m=+0.235651596 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=logrotate_crond) Feb 1 03:38:50 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:38:51 localhost systemd[1]: tmp-crun.pb8lyd.mount: Deactivated successfully. Feb 1 03:38:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:38:56 localhost podman[92062]: 2026-02-01 08:38:56.729087692 +0000 UTC m=+0.090553167 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:38:57 localhost podman[92062]: 2026-02-01 08:38:57.098461156 +0000 UTC m=+0.459926611 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 1 03:38:57 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:38:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:38:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:38:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:38:59 localhost podman[92085]: 2026-02-01 08:38:59.732449799 +0000 UTC m=+0.086923536 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 1 03:38:59 localhost systemd[1]: tmp-crun.4b0w0M.mount: Deactivated successfully. 
Feb 1 03:38:59 localhost podman[92085]: 2026-02-01 08:38:59.799460063 +0000 UTC m=+0.153933870 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, version=17.1.13, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 1 03:38:59 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:38:59 localhost podman[92084]: 2026-02-01 08:38:59.800528256 +0000 UTC m=+0.158163690 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr) Feb 1 03:38:59 localhost podman[92086]: 2026-02-01 08:38:59.89980946 +0000 UTC m=+0.253057039 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller) Feb 1 03:38:59 localhost podman[92086]: 2026-02-01 08:38:59.94843867 +0000 UTC m=+0.301686269 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, version=17.1.13, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:38:59 localhost podman[92086]: unhealthy Feb 1 03:38:59 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:38:59 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:39:00 localhost podman[92084]: 2026-02-01 08:39:00.037564803 +0000 UTC m=+0.395200207 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:39:00 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:39:00 localhost systemd[1]: tmp-crun.yHZleG.mount: Deactivated successfully. Feb 1 03:39:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. 
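[Note, not part of the captured log] The ovn_controller check above is the one failure in this capture: podman printed a bare "unhealthy" and systemd marked the transient f8afd85c... unit failed, while every other container event carries health_status=healthy. A minimal sketch for tallying these outcomes from journal text of this shape; the field layout (name= followed directly by health_status= inside a container event, plus the bare "podman[PID]: unhealthy" line) is taken from this capture and may differ on other podman versions:

import re
import sys
from collections import Counter

# Health events in this capture look like:
#   "... container health_status <64-hex-id> (image=..., name=<name>, health_status=<status>, ..."
# A failed check additionally emits a bare "podman[<pid>]: unhealthy" line.
HEALTH = re.compile(
    r"container health_status [0-9a-f]{64} "
    r"\(image=[^,]+, name=([^,]+), health_status=(\w+)"
)
UNHEALTHY = re.compile(r"podman\[\d+\]: unhealthy\b")

def tally(lines):
    """Count (container, status) pairs and bare 'unhealthy' lines."""
    counts, failures = Counter(), 0
    for line in lines:
        # finditer tolerates captures where several events share one line.
        for m in HEALTH.finditer(line):
            counts[m.groups()] += 1
        failures += len(UNHEALTHY.findall(line))
    return counts, failures

if __name__ == "__main__":
    counts, failures = tally(sys.stdin)
    for (name, status), n in sorted(counts.items()):
        print(f"{name}: {status} x{n}")
    print(f"bare 'unhealthy' lines: {failures}")

Fed this section, it would count the healthy events per container and report the single bare "unhealthy" line; note that even the failing check's own event label still reads health_status=healthy, so the bare line is the reliable failure signal here.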
Feb 1 03:39:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:39:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:39:12 localhost podman[92163]: 2026-02-01 08:39:12.741967573 +0000 UTC m=+0.095514309 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:39:12 localhost podman[92163]: 2026-02-01 08:39:12.750463513 +0000 UTC m=+0.104010269 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, container_name=collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true, vcs-type=git) Feb 1 03:39:12 localhost systemd[1]: tmp-crun.fKwStp.mount: Deactivated successfully. 
Feb 1 03:39:12 localhost podman[92165]: 2026-02-01 08:39:12.800492667 +0000 UTC m=+0.147231474 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, architecture=x86_64) Feb 1 03:39:12 localhost podman[92165]: 2026-02-01 08:39:12.835279694 +0000 UTC m=+0.182018491 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:39:12 localhost podman[92164]: 2026-02-01 08:39:12.838297717 +0000 UTC m=+0.189016507 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, container_name=nova_compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:39:12 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:39:12 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:39:12 localhost podman[92164]: 2026-02-01 08:39:12.924448647 +0000 UTC m=+0.275167407 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:39:12 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
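[Note, not part of the captured log] The config_data blobs in these events (the nova_compute one directly above, for instance) are printed with Python literal syntax: single-quoted strings, True/False booleans, nested dicts and lists. A minimal sketch for cutting one out of a line and parsing it with ast.literal_eval; the brace matching assumes no braces occur inside string values, which holds for the mount paths shown in this capture, and the sample fragment below is a shortened, hypothetical stand-in for a real entry:

import ast

def extract_config_data(line: str) -> dict:
    """Slice the balanced {...} after 'config_data=' and parse it."""
    start = line.index("config_data=") + len("config_data=")
    depth = 0
    for i, ch in enumerate(line[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return ast.literal_eval(line[start:i + 1])
    raise ValueError("unterminated config_data dict")

# Abbreviated fragment shaped like the nova_compute entry above:
sample = ("config_data={'net': 'host', 'privileged': True, "
          "'volumes': ['/etc/hosts:/etc/hosts:ro', '/run:/run']}, name=...")
cfg = extract_config_data(sample)
print(cfg["volumes"])   # -> ['/etc/hosts:/etc/hosts:ro', '/run:/run']

This gives structured access to the healthcheck test, net/pid namespaces, and the volume list without hand-parsing the comma-separated label dump.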
Feb 1 03:39:17 localhost sshd[92228]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:39:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:39:21 localhost systemd[1]: tmp-crun.CbvEaG.mount: Deactivated successfully. Feb 1 03:39:21 localhost podman[92230]: 2026-02-01 08:39:21.733508426 +0000 UTC m=+0.093298881 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, release=1766032510, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute) Feb 1 03:39:21 localhost systemd[1]: tmp-crun.oTVVF0.mount: Deactivated successfully. 
Feb 1 03:39:21 localhost podman[92237]: 2026-02-01 08:39:21.75778121 +0000 UTC m=+0.104397432 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, batch=17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:39:21 localhost podman[92237]: 2026-02-01 08:39:21.78942414 +0000 UTC m=+0.136040322 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:39:21 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:39:21 localhost podman[92231]: 2026-02-01 08:39:21.843747955 +0000 UTC m=+0.194372969 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, container_name=logrotate_crond, architecture=x86_64, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:39:21 localhost podman[92231]: 2026-02-01 08:39:21.855348111 +0000 UTC m=+0.205973045 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:39:21 localhost podman[92230]: 2026-02-01 08:39:21.867544075 +0000 UTC m=+0.227334570 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:39:21 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:39:21 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:39:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:39:27 localhost systemd[1]: tmp-crun.Ap8IBH.mount: Deactivated successfully. Feb 1 03:39:27 localhost podman[92301]: 2026-02-01 08:39:27.732557634 +0000 UTC m=+0.093445785 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack 
Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 1 03:39:28 localhost podman[92301]: 2026-02-01 08:39:28.135545939 +0000 UTC m=+0.496434060 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:39:28 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:39:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:39:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:39:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:39:30 localhost systemd[1]: tmp-crun.SkdFYI.mount: Deactivated successfully. Feb 1 03:39:30 localhost podman[92325]: 2026-02-01 08:39:30.793017003 +0000 UTC m=+0.141663414 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, release=1766032510, vcs-type=git) Feb 1 03:39:30 localhost podman[92326]: 2026-02-01 08:39:30.849487014 +0000 UTC 
m=+0.194411611 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, version=17.1.13) Feb 1 03:39:30 localhost podman[92324]: 2026-02-01 08:39:30.762191918 +0000 UTC m=+0.114713168 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:39:30 localhost podman[92326]: 2026-02-01 08:39:30.870325883 +0000 UTC m=+0.215250460 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:39:30 localhost podman[92326]: unhealthy 
Feb 1 03:39:30 localhost podman[92325]: 2026-02-01 08:39:30.877945556 +0000 UTC m=+0.226591957 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:39:30 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:39:30 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:39:30 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Deactivated successfully. 
Feb 1 03:39:30 localhost podman[92324]: 2026-02-01 08:39:30.961415426 +0000 UTC m=+0.313936616 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:39:30 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:39:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:39:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:39:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:39:43 localhost systemd[1]: tmp-crun.nCYUYm.mount: Deactivated successfully. 
Feb 1 03:39:43 localhost podman[92482]: 2026-02-01 08:39:43.734131052 +0000 UTC m=+0.086035168 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, version=17.1.13, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_id=tripleo_step5) Feb 1 03:39:43 localhost podman[92483]: 2026-02-01 08:39:43.747848143 +0000 UTC m=+0.091988792 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:39:43 localhost podman[92482]: 2026-02-01 08:39:43.791857252 +0000 UTC m=+0.143761098 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 1 03:39:43 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:39:43 localhost podman[92481]: 2026-02-01 08:39:43.794442012 +0000 UTC m=+0.145627526 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Feb 1 03:39:43 localhost podman[92481]: 2026-02-01 08:39:43.87528982 +0000 UTC m=+0.226475324 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, vcs-type=git, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:39:43 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:39:43 localhost podman[92483]: 2026-02-01 08:39:43.930530933 +0000 UTC m=+0.274671582 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.13) Feb 1 03:39:43 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:39:44 localhost sshd[92544]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:39:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:39:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:39:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:39:52 localhost systemd[1]: tmp-crun.KsHnxH.mount: Deactivated successfully. 
Feb 1 03:39:52 localhost podman[92547]: 2026-02-01 08:39:52.788442759 +0000 UTC m=+0.142158819 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:39:52 localhost podman[92545]: 2026-02-01 08:39:52.746004128 +0000 UTC m=+0.107600019 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 
17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, version=17.1.13, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) 
Feb 1 03:39:52 localhost podman[92545]: 2026-02-01 08:39:52.828467857 +0000 UTC m=+0.190063768 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:39:52 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:39:52 localhost podman[92547]: 2026-02-01 08:39:52.880089729 +0000 UTC m=+0.233805749 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 1 03:39:52 localhost podman[92546]: 2026-02-01 08:39:52.891062425 +0000 UTC m=+0.244378443 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5) Feb 1 03:39:52 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:39:52 localhost podman[92546]: 2026-02-01 08:39:52.928528624 +0000 UTC m=+0.281844662 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:15Z) Feb 1 03:39:52 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:39:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:39:58 localhost systemd[1]: tmp-crun.koEcfp.mount: Deactivated successfully. 
Feb 1 03:39:58 localhost podman[92617]: 2026-02-01 08:39:58.755839468 +0000 UTC m=+0.115125021 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:39:59 localhost podman[92617]: 2026-02-01 08:39:59.099735631 +0000 UTC m=+0.459021144 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:39:59 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:40:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:40:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:40:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:40:01 localhost systemd[1]: tmp-crun.hrnej0.mount: Deactivated successfully. 
Feb 1 03:40:01 localhost podman[92641]: 2026-02-01 08:40:01.725743149 +0000 UTC m=+0.077518458 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team) Feb 1 03:40:01 localhost podman[92640]: 2026-02-01 08:40:01.792502746 +0000 UTC m=+0.148662760 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:40:01 localhost podman[92641]: 2026-02-01 08:40:01.809509447 +0000 UTC m=+0.161284786 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 1 03:40:01 localhost podman[92641]: unhealthy Feb 1 03:40:01 localhost podman[92642]: 2026-02-01 08:40:01.764547399 +0000 UTC m=+0.109957293 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 1 03:40:01 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:40:01 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:40:01 localhost podman[92642]: 2026-02-01 08:40:01.8503868 +0000 UTC m=+0.195796694 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, version=17.1.13, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true) Feb 1 03:40:01 localhost podman[92642]: unhealthy Feb 1 03:40:01 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:40:01 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
Feb 1 03:40:02 localhost podman[92640]: 2026-02-01 08:40:02.017842404 +0000 UTC m=+0.374002418 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1766032510, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:40:02 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:40:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:40:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:40:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:40:14 localhost podman[92709]: 2026-02-01 08:40:14.723342797 +0000 UTC m=+0.082247052 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:40:14 localhost systemd[1]: tmp-crun.3xjMDn.mount: Deactivated successfully. 
Feb 1 03:40:14 localhost podman[92708]: 2026-02-01 08:40:14.783685268 +0000 UTC m=+0.144210663 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, container_name=collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:40:14 localhost podman[92708]: 2026-02-01 08:40:14.789923539 +0000 UTC m=+0.150448924 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack 
Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:40:14 localhost podman[92709]: 2026-02-01 08:40:14.802052741 +0000 UTC m=+0.160956936 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 1 03:40:14 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:40:14 localhost systemd[1]: tmp-crun.Dbe1QF.mount: Deactivated successfully. 
Feb 1 03:40:14 localhost podman[92710]: 2026-02-01 08:40:14.835920979 +0000 UTC m=+0.188617634 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:40:14 localhost podman[92710]: 2026-02-01 08:40:14.845484172 +0000 UTC m=+0.198180827 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:40:14 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:40:14 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:40:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:40:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:40:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:40:23 localhost systemd[1]: tmp-crun.MVTRbR.mount: Deactivated successfully. 
Feb 1 03:40:23 localhost podman[92779]: 2026-02-01 08:40:23.741682292 +0000 UTC m=+0.085309937 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible) Feb 1 03:40:23 localhost podman[92779]: 2026-02-01 08:40:23.769868426 +0000 UTC m=+0.113496051 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 1 03:40:23 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:40:23 localhost systemd[1]: tmp-crun.BehbR3.mount: Deactivated successfully. 
Feb 1 03:40:23 localhost podman[92778]: 2026-02-01 08:40:23.854307344 +0000 UTC m=+0.203465619 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true) Feb 1 03:40:23 localhost podman[92778]: 2026-02-01 08:40:23.865408325 +0000 UTC m=+0.214566660 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, distribution-scope=public, container_name=logrotate_crond, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:40:23 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:40:23 localhost podman[92777]: 2026-02-01 08:40:23.946563783 +0000 UTC m=+0.298043069 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 1 03:40:23 localhost podman[92777]: 2026-02-01 08:40:23.972392504 +0000 UTC m=+0.323871770 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:40:23 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:40:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:40:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:40:29 localhost recover_tripleo_nova_virtqemud[92852]: 61284 Feb 1 03:40:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:40:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:40:29 localhost systemd[1]: tmp-crun.iXul8R.mount: Deactivated successfully. Feb 1 03:40:29 localhost podman[92850]: 2026-02-01 08:40:29.718288881 +0000 UTC m=+0.070831843 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team) Feb 1 03:40:30 localhost podman[92850]: 2026-02-01 08:40:30.076366759 +0000 UTC m=+0.428909731 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=nova_migration_target) Feb 1 03:40:30 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:40:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:40:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:40:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:40:32 localhost systemd[1]: tmp-crun.UauKh2.mount: Deactivated successfully. 
Feb 1 03:40:32 localhost podman[92876]: 2026-02-01 08:40:32.726171916 +0000 UTC m=+0.088272898 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4) Feb 1 03:40:32 localhost podman[92875]: 2026-02-01 08:40:32.698418115 +0000 UTC m=+0.065180689 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, container_name=metrics_qdr) Feb 1 03:40:32 localhost podman[92876]: 2026-02-01 08:40:32.763467479 +0000 UTC m=+0.125568451 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:40:32 localhost podman[92876]: unhealthy Feb 1 03:40:32 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:40:32 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:40:32 localhost podman[92877]: 2026-02-01 08:40:32.841369178 +0000 UTC m=+0.200541320 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.13, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:40:32 localhost podman[92877]: 2026-02-01 08:40:32.857892094 +0000 UTC m=+0.217064266 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-type=git, container_name=ovn_controller, config_id=tripleo_step4, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 1 03:40:32 localhost podman[92877]: unhealthy Feb 1 03:40:32 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:40:32 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:40:32 localhost podman[92875]: 2026-02-01 08:40:32.915203461 +0000 UTC m=+0.281966065 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:40:32 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:40:33 localhost systemd[1]: tmp-crun.0PCQUq.mount: Deactivated successfully. Feb 1 03:40:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:40:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:40:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:40:45 localhost podman[93072]: 2026-02-01 08:40:45.393861742 +0000 UTC m=+0.085696759 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, summary=Red 
Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:40:45 localhost podman[93072]: 2026-02-01 08:40:45.40293812 +0000 UTC m=+0.094773127 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5) Feb 1 03:40:45 localhost systemd[1]: tmp-crun.3bVg0F.mount: Deactivated successfully. Feb 1 03:40:45 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:40:45 localhost podman[93070]: 2026-02-01 08:40:45.431141914 +0000 UTC m=+0.120535166 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, container_name=collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, distribution-scope=public) Feb 1 03:40:45 localhost podman[93071]: 2026-02-01 08:40:45.48514003 +0000 UTC m=+0.175419929 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_compute, 
tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:40:45 localhost podman[93071]: 2026-02-01 08:40:45.513390676 +0000 UTC m=+0.203670595 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:40:45 localhost podman[93070]: 2026-02-01 08:40:45.519409861 +0000 UTC m=+0.208803113 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:40:45 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:40:45 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 03:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3600.1 total, 600.0 interval
Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 03:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3600.1 total, 600.0 interval
Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.01 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:40:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:40:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:40:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:40:54 localhost systemd[1]: tmp-crun.Tq795v.mount: Deactivated successfully.
Feb 1 03:40:54 localhost podman[93134]: 2026-02-01 08:40:54.727273564 +0000 UTC m=+0.088213235 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z) Feb 1 03:40:54 localhost podman[93136]: 2026-02-01 08:40:54.790202734 +0000 UTC m=+0.142652985 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:40:54 localhost podman[93135]: 2026-02-01 08:40:54.756190741 +0000 UTC m=+0.111079107 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, 
name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z) Feb 1 03:40:54 localhost podman[93136]: 2026-02-01 08:40:54.822426161 +0000 UTC m=+0.174876452 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 1 03:40:54 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:40:54 localhost podman[93135]: 2026-02-01 08:40:54.837557186 +0000 UTC m=+0.192445582 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:40:54 localhost podman[93134]: 2026-02-01 08:40:54.844719155 +0000 UTC m=+0.205658826 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:40:54 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:40:54 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:41:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:41:00 localhost systemd[1]: tmp-crun.GHLBSh.mount: Deactivated successfully. 
Feb 1 03:41:00 localhost podman[93207]: 2026-02-01 08:41:00.7089265 +0000 UTC m=+0.073507916 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 1 03:41:01 localhost podman[93207]: 2026-02-01 08:41:01.073417724 +0000 UTC m=+0.437999070 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 1 03:41:01 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:41:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:41:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:41:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:41:03 localhost systemd[1]: tmp-crun.r6m3BS.mount: Deactivated successfully. 
Feb 1 03:41:03 localhost podman[93231]: 2026-02-01 08:41:03.780878638 +0000 UTC m=+0.136723883 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1766032510, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:41:03 localhost podman[93232]: 2026-02-01 08:41:03.828712105 +0000 UTC m=+0.182589259 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:41:03 localhost podman[93232]: 2026-02-01 08:41:03.843327694 +0000 UTC m=+0.197204838 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack 
TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:41:03 localhost podman[93232]: unhealthy Feb 1 03:41:03 localhost podman[93230]: 2026-02-01 08:41:03.748192707 +0000 UTC m=+0.107020793 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:41:03 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:41:03 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
Feb 1 03:41:03 localhost podman[93231]: 2026-02-01 08:41:03.902360533 +0000 UTC m=+0.258205788 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:41:03 localhost podman[93231]: unhealthy Feb 1 03:41:03 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:41:03 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:41:03 localhost podman[93230]: 2026-02-01 08:41:03.986532464 +0000 UTC m=+0.345360520 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc.) Feb 1 03:41:03 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:41:15 localhost systemd[1]: tmp-crun.kuzEqd.mount: Deactivated successfully. 
Feb 1 03:41:15 localhost podman[93295]: 2026-02-01 08:41:15.727375667 +0000 UTC m=+0.088327319 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Feb 1 03:41:15 localhost podman[93296]: 2026-02-01 08:41:15.783835658 +0000 UTC m=+0.138584080 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, config_id=tripleo_step3, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git) Feb 1 03:41:15 localhost podman[93295]: 2026-02-01 08:41:15.785257911 +0000 UTC m=+0.146209543 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, architecture=x86_64, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=) Feb 1 03:41:15 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:41:15 localhost podman[93294]: 2026-02-01 08:41:15.832281403 +0000 UTC m=+0.193145983 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public) Feb 1 03:41:15 localhost podman[93294]: 2026-02-01 08:41:15.843652681 +0000 UTC m=+0.204517231 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible) Feb 1 03:41:15 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:41:15 localhost podman[93296]: 2026-02-01 08:41:15.868364089 +0000 UTC m=+0.223112511 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z) Feb 1 03:41:15 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:41:16 localhost systemd[1]: tmp-crun.9GizBW.mount: Deactivated successfully. Feb 1 03:41:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:41:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:41:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:41:25 localhost podman[93357]: 2026-02-01 08:41:25.688896929 +0000 UTC m=+0.057925307 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:41:25 localhost systemd[1]: tmp-crun.8Wipxz.mount: Deactivated successfully. 
Feb 1 03:41:25 localhost podman[93359]: 2026-02-01 08:41:25.739978385 +0000 UTC m=+0.099439300 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc.) 
Feb 1 03:41:25 localhost podman[93357]: 2026-02-01 08:41:25.763899928 +0000 UTC m=+0.132928346 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_compute, release=1766032510, version=17.1.13, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:41:25 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:41:25 localhost podman[93359]: 2026-02-01 08:41:25.790463463 +0000 UTC m=+0.149924398 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public) Feb 1 03:41:25 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:41:25 localhost podman[93358]: 2026-02-01 08:41:25.87259811 +0000 UTC m=+0.234894922 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:41:25 localhost podman[93358]: 2026-02-01 08:41:25.909407879 +0000 UTC m=+0.271704671 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible) Feb 1 03:41:25 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:41:30 localhost sshd[93429]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:41:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:41:31 localhost systemd[1]: tmp-crun.a6KQxe.mount: Deactivated successfully. 
Feb 1 03:41:31 localhost podman[93431]: 2026-02-01 08:41:31.52625987 +0000 UTC m=+0.095220300 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:41:31 localhost podman[93431]: 2026-02-01 08:41:31.924552031 +0000 UTC m=+0.493512391 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, 
config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 1 03:41:31 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:41:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:41:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:41:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:41:34 localhost systemd[1]: tmp-crun.LTDIld.mount: Deactivated successfully. Feb 1 03:41:34 localhost systemd[1]: tmp-crun.wRAuhp.mount: Deactivated successfully. 
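Each "Started /usr/bin/podman healthcheck run <id>" line is systemd launching the wrapper unit for that container's healthcheck; podman executes the command configured in config_data ('test': '/openstack/healthcheck' above) and stores the result on the container, which is what the following health_status=healthy event reports. A sketch of reading that stored state back with podman inspect, assuming podman is on PATH (illustrative only; this is not how TripleO itself consumes the status):

    import subprocess

    def health_status(container: str) -> str:
        # podman inspect exposes the last healthcheck result; newer
        # releases also accept the Docker-compatible template path
        # {{.State.Health.Status}}.
        out = subprocess.run(
            ["podman", "inspect", "--format",
             "{{.State.Healthcheck.Status}}", container],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()  # "healthy", "unhealthy", or "starting"

    print(health_status("nova_migration_target"))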
Feb 1 03:41:34 localhost podman[93456]: 2026-02-01 08:41:34.79295887 +0000 UTC m=+0.149826124 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=) Feb 1 03:41:34 localhost podman[93455]: 2026-02-01 08:41:34.748114065 +0000 UTC m=+0.107175537 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, 
batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:41:34 localhost podman[93456]: 2026-02-01 08:41:34.804975379 +0000 UTC m=+0.161842643 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:41:34 localhost podman[93456]: unhealthy Feb 1 03:41:34 localhost podman[93457]: 2026-02-01 08:41:34.762130585 +0000 UTC m=+0.114794061 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, container_name=ovn_controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team) Feb 1 03:41:34 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:41:34 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:41:34 localhost podman[93457]: 2026-02-01 08:41:34.841184449 +0000 UTC m=+0.193847995 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64) Feb 1 03:41:34 localhost podman[93457]: unhealthy Feb 1 03:41:34 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:41:34 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
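A failing healthcheck shows up three ways in quick succession here: health_status=unhealthy in the container event, a bare "unhealthy" line from podman, and the wrapper unit exiting status=1/FAILURE, since podman healthcheck run propagates the script's non-zero exit code. A sketch that correlates the first and last of these per container ID from a saved copy of this log; the file path and regexes are assumptions for illustration, grounded only in the line layout above:

    import re
    from collections import defaultdict

    UNHEALTHY = re.compile(
        r"container health_status ([0-9a-f]{64}) "
        r"\(image=[^,]+, name=([^,]+),.*?health_status=unhealthy")
    FAILED = re.compile(
        r"([0-9a-f]{64})\.service: Failed with result 'exit-code'")

    def failing_containers(path="/var/log/messages"):  # path is an assumption
        seen = defaultdict(dict)
        for line in open(path, errors="replace"):
            if m := UNHEALTHY.search(line):
                seen[m.group(1)]["name"] = m.group(2)
            if m := FAILED.search(line):
                seen[m.group(1)]["unit_failed"] = True
        return {cid: d for cid, d in seen.items() if d.get("unit_failed")}

On the events above this yields ovn_metadata_agent (f3e307f3…) and ovn_controller (f8afd85c…).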
Feb 1 03:41:34 localhost podman[93455]: 2026-02-01 08:41:34.963447637 +0000 UTC m=+0.322509079 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, distribution-scope=public, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true) Feb 1 03:41:34 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:41:43 localhost sshd[93523]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:41:46 localhost systemd[1]: tmp-crun.Jy6Yqo.mount: Deactivated successfully. 
Feb 1 03:41:46 localhost podman[93605]: 2026-02-01 08:41:46.710503469 +0000 UTC m=+0.077447205 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 1 03:41:46 localhost podman[93605]: 2026-02-01 08:41:46.724424766 +0000 UTC m=+0.091368502 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true) Feb 1 03:41:46 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:41:46 localhost podman[93604]: 2026-02-01 08:41:46.81883505 +0000 UTC m=+0.185966152 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z) Feb 1 03:41:46 localhost systemd[1]: tmp-crun.G9buhI.mount: Deactivated successfully. 
Feb 1 03:41:46 localhost podman[93604]: 2026-02-01 08:41:46.88343225 +0000 UTC m=+0.250563362 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 1 03:41:46 localhost podman[93603]: 2026-02-01 08:41:46.886285088 +0000 UTC m=+0.253853663 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vendor=Red Hat, Inc.) Feb 1 03:41:46 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:41:46 localhost podman[93603]: 2026-02-01 08:41:46.976600777 +0000 UTC m=+0.344169262 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13) Feb 1 03:41:46 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
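The config_id labels in these events record which TripleO deployment step created each container: metrics_qdr at tripleo_step1, iscsid and collectd at tripleo_step3, the cron/OVN/migration containers at tripleo_step4, and nova_compute at tripleo_step5. A sketch that recovers that grouping from the event stream; the label-scraping regex is an assumption based only on the layout of the lines above:

    import re
    from collections import defaultdict

    def label(event: str, key: str) -> str | None:
        m = re.search(rf"{key}=([^,)]+)", event)
        return m.group(1) if m else None

    def containers_by_step(lines) -> dict[str, set[str]]:
        steps = defaultdict(set)
        for line in lines:
            name = label(line, "container_name")
            step = label(line, "config_id")
            if name and step:
                steps[step].add(name)
        return steps

Fed this section of the log, containers_by_step returns the step1/3/4/5 grouping described above.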
Feb 1 03:41:47 localhost podman[93723]: Feb 1 03:41:47 localhost podman[93723]: 2026-02-01 08:41:47.326182634 +0000 UTC m=+0.075844236 container create 6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_mahavira, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, version=7, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=) Feb 1 03:41:47 localhost systemd[1]: Started libpod-conmon-6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b.scope. Feb 1 03:41:47 localhost systemd[1]: Started libcrun container. Feb 1 03:41:47 localhost podman[93723]: 2026-02-01 08:41:47.296038011 +0000 UTC m=+0.045699673 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 03:41:47 localhost podman[93723]: 2026-02-01 08:41:47.408033464 +0000 UTC m=+0.157695066 container init 6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_mahavira, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=1764794109, name=rhceph, architecture=x86_64, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 03:41:47 localhost podman[93723]: 2026-02-01 08:41:47.41702963 +0000 UTC m=+0.166691232 container start 6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_mahavira, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1764794109, distribution-scope=public, 
architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.buildah.version=1.41.4, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z) Feb 1 03:41:47 localhost podman[93723]: 2026-02-01 08:41:47.417431462 +0000 UTC m=+0.167093064 container attach 6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_mahavira, distribution-scope=public, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 03:41:47 localhost elastic_mahavira[93740]: 167 167 Feb 1 03:41:47 localhost systemd[1]: libpod-6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b.scope: Deactivated successfully. 
Feb 1 03:41:47 localhost podman[93745]: 2026-02-01 08:41:47.497402424 +0000 UTC m=+0.060166446 container died 6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_mahavira, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, ceph=True, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1764794109) Feb 1 03:41:47 localhost podman[93745]: 2026-02-01 08:41:47.533848172 +0000 UTC m=+0.096612184 container remove 6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_mahavira, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, distribution-scope=public, build-date=2025-12-08T17:28:53Z, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 03:41:47 localhost systemd[1]: libpod-conmon-6486cfce01f05fd738a48a0e369b2b78190d7a417325d1c467069acf3e7ba81b.scope: Deactivated successfully. 
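elastic_mahavira is a podman-assigned random name for a one-shot rhceph utility container, the pattern cephadm uses for host probes: pulled, created, started, attached, and removed within the same second, emitting only "167 167" (167 is the ceph user's UID/GID on RHEL) before exiting. A sketch that reconstructs such lifecycles from the event stream; the timestamp and field layout are taken from the lines above and the parser is illustrative:

    import re
    from collections import defaultdict

    EVENT = re.compile(
        r"podman\[\d+\]: (?P<ts>\S+ [\d:.]+) \+0000 UTC m=\+[\d.]+ "
        r"container (?P<action>\w+) (?P<cid>[0-9a-f]{64})")

    def lifecycles(lines):
        # Maps container ID -> ordered (timestamp, action) pairs,
        # e.g. create, init, start, attach, died, remove.
        timeline = defaultdict(list)
        for line in lines:
            if m := EVENT.search(line):
                timeline[m["cid"]].append((m["ts"], m["action"]))
        return timeline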
Feb 1 03:41:47 localhost podman[93765]: Feb 1 03:41:47 localhost podman[93765]: 2026-02-01 08:41:47.770686432 +0000 UTC m=+0.065050545 container create f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bardeen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 1 03:41:47 localhost systemd[1]: Started libpod-conmon-f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5.scope. Feb 1 03:41:47 localhost systemd[1]: Started libcrun container. Feb 1 03:41:47 localhost podman[93765]: 2026-02-01 08:41:47.739453595 +0000 UTC m=+0.033817728 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 03:41:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10df459718939cb67ee0f5a9fe449a94e571e368254e51cb87edb575de6e2fdc/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 03:41:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10df459718939cb67ee0f5a9fe449a94e571e368254e51cb87edb575de6e2fdc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 03:41:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10df459718939cb67ee0f5a9fe449a94e571e368254e51cb87edb575de6e2fdc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 03:41:47 localhost podman[93765]: 2026-02-01 08:41:47.849270431 +0000 UTC m=+0.143634534 container init f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bardeen, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, version=7, ceph=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:41:47 localhost podman[93765]: 2026-02-01 08:41:47.87172971 +0000 UTC m=+0.166093823 container start f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bardeen, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public) Feb 1 03:41:47 localhost podman[93765]: 2026-02-01 08:41:47.872193124 +0000 UTC m=+0.166557267 container attach f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bardeen, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 03:41:48 localhost systemd[1]: tmp-crun.ycylgw.mount: Deactivated successfully. 
Feb 1 03:41:48 localhost elastic_bardeen[93780]: [
Feb 1 03:41:48 localhost elastic_bardeen[93780]: {
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "available": false,
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "ceph_device": false,
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "lsm_data": {},
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "lvs": [],
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "path": "/dev/sr0",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "rejected_reasons": [
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "Has a FileSystem",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "Insufficient space (<5GB)"
Feb 1 03:41:48 localhost elastic_bardeen[93780]: ],
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "sys_api": {
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "actuators": null,
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "device_nodes": "sr0",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "human_readable_size": "482.00 KB",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "id_bus": "ata",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "model": "QEMU DVD-ROM",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "nr_requests": "2",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "partitions": {},
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "path": "/dev/sr0",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "removable": "1",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "rev": "2.5+",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "ro": "0",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "rotational": "1",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "sas_address": "",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "sas_device_handle": "",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "scheduler_mode": "mq-deadline",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "sectors": 0,
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "sectorsize": "2048",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "size": 493568.0,
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "support_discard": "0",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "type": "disk",
Feb 1 03:41:48 localhost elastic_bardeen[93780]: "vendor": "QEMU"
Feb 1 03:41:48 localhost elastic_bardeen[93780]: }
Feb 1 03:41:48 localhost elastic_bardeen[93780]: }
Feb 1 03:41:48 localhost elastic_bardeen[93780]: ]
Feb 1 03:41:48 localhost systemd[1]: libpod-f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5.scope: Deactivated successfully.
Feb 1 03:41:48 localhost systemd[1]: libpod-f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5.scope: Consumed 1.141s CPU time.
Feb 1 03:41:49 localhost podman[95637]: 2026-02-01 08:41:49.036922692 +0000 UTC m=+0.043401611 container died f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bardeen, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., version=7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 03:41:49 localhost systemd[1]: var-lib-containers-storage-overlay-10df459718939cb67ee0f5a9fe449a94e571e368254e51cb87edb575de6e2fdc-merged.mount: Deactivated successfully. Feb 1 03:41:49 localhost podman[95637]: 2026-02-01 08:41:49.079473137 +0000 UTC m=+0.085952026 container remove f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_bardeen, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, release=1764794109, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, architecture=x86_64, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True) Feb 1 03:41:49 localhost systemd[1]: libpod-conmon-f2d7c4ea783266dc6e9c54ae84cb2e09ec7c64a892c331b5d747a165783659c5.scope: Deactivated successfully. Feb 1 03:41:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:41:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:41:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:41:56 localhost systemd[1]: tmp-crun.xaUCUC.mount: Deactivated successfully. Feb 1 03:41:56 localhost podman[95667]: 2026-02-01 08:41:56.761979519 +0000 UTC m=+0.114218032 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, container_name=logrotate_crond, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:41:56 localhost podman[95667]: 2026-02-01 08:41:56.779344792 +0000 UTC m=+0.131583325 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron) Feb 1 03:41:56 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:41:56 localhost systemd[1]: tmp-crun.ncoNZc.mount: Deactivated successfully. 
Feb 1 03:41:56 localhost podman[95666]: 2026-02-01 08:41:56.856806796 +0000 UTC m=+0.208921486 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.openshift.expose-services=) Feb 1 03:41:56 localhost podman[95668]: 2026-02-01 08:41:56.899390432 +0000 UTC m=+0.245951501 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) 
Feb 1 03:41:56 localhost podman[95666]: 2026-02-01 08:41:56.913833175 +0000 UTC m=+0.265947865 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, container_name=ceilometer_agent_compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:41:56 localhost podman[95668]: 2026-02-01 08:41:56.926517713 +0000 UTC m=+0.273078792 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, config_id=tripleo_step4) Feb 1 03:41:56 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:41:56 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:42:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:42:02 localhost podman[95741]: 2026-02-01 08:42:02.727866372 +0000 UTC m=+0.088125112 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, batch=17.1_20260112.1) Feb 1 03:42:03 localhost podman[95741]: 2026-02-01 08:42:03.115540508 +0000 UTC m=+0.475799208 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-type=git, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack 
TripleO Team, container_name=nova_migration_target, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:42:03 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:42:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:42:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:42:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:42:05 localhost systemd[1]: tmp-crun.72uiB9.mount: Deactivated successfully. 
Feb 1 03:42:05 localhost podman[95764]: 2026-02-01 08:42:05.73388055 +0000 UTC m=+0.089537306 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team) Feb 1 03:42:05 localhost podman[95765]: 2026-02-01 08:42:05.781348236 +0000 UTC m=+0.129990306 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:42:05 localhost podman[95765]: 2026-02-01 08:42:05.798490991 +0000 UTC m=+0.147133101 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:42:05 localhost podman[95765]: unhealthy Feb 1 03:42:05 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:42:05 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:42:05 localhost podman[95766]: 2026-02-01 08:42:05.884905521 +0000 UTC m=+0.233193140 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:42:05 localhost podman[95766]: 2026-02-01 08:42:05.900259942 +0000 UTC m=+0.248547631 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Feb 1 03:42:05 localhost podman[95766]: unhealthy Feb 1 03:42:05 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:42:05 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:42:05 localhost podman[95764]: 2026-02-01 08:42:05.962614003 +0000 UTC m=+0.318270829 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step1, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Feb 1 03:42:05 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:42:06 localhost systemd[1]: tmp-crun.TiHehS.mount: Deactivated successfully. Feb 1 03:42:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:42:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:42:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:42:17 localhost systemd[1]: tmp-crun.p9n38k.mount: Deactivated successfully. Feb 1 03:42:17 localhost podman[95834]: 2026-02-01 08:42:17.74310699 +0000 UTC m=+0.096241282 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, architecture=x86_64, url=https://www.redhat.com) Feb 1 03:42:17 localhost podman[95833]: 2026-02-01 08:42:17.772348557 +0000 UTC m=+0.128218733 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 
1 03:42:17 localhost podman[95833]: 2026-02-01 08:42:17.793371391 +0000 UTC m=+0.149241557 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z) Feb 1 03:42:17 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:42:17 localhost podman[95834]: 2026-02-01 08:42:17.832567872 +0000 UTC m=+0.185702104 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, vcs-type=git, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:42:17 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:42:17 localhost podman[95832]: 2026-02-01 08:42:17.838215405 +0000 UTC m=+0.197445183 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, architecture=x86_64, release=1766032510) Feb 1 03:42:17 localhost podman[95832]: 2026-02-01 08:42:17.918199678 +0000 UTC m=+0.277429486 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, container_name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 1 03:42:17 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:42:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:42:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:42:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:42:27 localhost systemd[1]: tmp-crun.oenRf9.mount: Deactivated successfully. 
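The collectd records above show podman's healthy-path healthcheck cycle end to end: systemd starts a transient unit named after the 64-character container ID, podman runs the configured test and emits a container health_status event (health_status=healthy) followed by an exec_died event for the finished check process, and systemd then deactivates the transient unit. A minimal Python sketch for pulling these events out of an exported log like this one follows; the regexes are read off the record layout above rather than any documented podman schema, "messages" is a placeholder path, and a physical line that carries two fused records is treated as one event, which is adequate for a sketch:

    import re

    # Podman container events in this log look like:
    #   "... container health_status <64-hex-id> (image=...,
    #    container_name=collectd, ..., health_status=healthy, ...)"
    EVENT = re.compile(r"container (health_status|exec_died) ([0-9a-f]{64})")
    NAME = re.compile(r"container_name=([^,)]+)")
    STATUS = re.compile(r"health_status=(healthy|unhealthy)")

    def health_events(lines):
        """Yield (event, container_name, status) for each podman event record."""
        for line in lines:
            event = EVENT.search(line)
            if not event:
                continue
            name = NAME.search(line)
            status = STATUS.search(line)
            yield (event.group(1),
                   name.group(1) if name else event.group(2)[:12],
                   status.group(1) if status else None)

    with open("messages") as log:   # placeholder path for the exported log
        for event, name, status in health_events(log):
            if event == "health_status":
                print(f"{name}: {status}")

Run against this section, that prints one healthy/unhealthy verdict per health_status record, keyed by the container_name= label.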
Feb 1 03:42:27 localhost podman[95899]: 2026-02-01 08:42:27.721769415 +0000 UTC m=+0.079966672 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=logrotate_crond, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 1 03:42:27 localhost podman[95898]: 2026-02-01 08:42:27.790168213 +0000 UTC m=+0.148082771 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:42:27 localhost podman[95900]: 2026-02-01 08:42:27.836028059 +0000 UTC m=+0.187145389 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, release=1766032510, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z) Feb 1 03:42:27 localhost podman[95898]: 2026-02-01 08:42:27.850488082 +0000 UTC m=+0.208402690 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, release=1766032510, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 1 03:42:27 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:42:27 localhost podman[95900]: 2026-02-01 08:42:27.865929436 +0000 UTC m=+0.217046766 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 1 03:42:27 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:42:27 localhost podman[95899]: 2026-02-01 08:42:27.906650643 +0000 UTC m=+0.264847910 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, release=1766032510, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 1 03:42:27 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:42:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:42:29 localhost recover_tripleo_nova_virtqemud[95974]: 61284 Feb 1 03:42:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:42:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:42:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:42:33 localhost podman[95975]: 2026-02-01 08:42:33.72423699 +0000 UTC m=+0.081500690 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_migration_target, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 1 03:42:34 localhost podman[95975]: 2026-02-01 08:42:34.116471205 +0000 UTC m=+0.473734935 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 1 03:42:34 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:42:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
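Each of these transient units invokes a check that can also be reproduced by hand: podman healthcheck run <name-or-id> executes the container's configured test (the 'healthcheck' entry in the config_data above, e.g. /openstack/healthcheck), exits 0 when it passes, and prints the bare word "unhealthy" and exits non-zero when it fails, which is exactly what the failing records below show. A small wrapper, assuming it runs on the compute node itself with permission to talk to podman; the container names are taken from the container_name= labels in this log:

    import subprocess

    def run_healthcheck(container: str) -> bool:
        """Re-run a container's configured healthcheck once, the same check
        the transient systemd units in this log trigger on a timer."""
        result = subprocess.run(
            ["podman", "healthcheck", "run", container],
            capture_output=True, text=True,
        )
        # On failure podman prints "unhealthy" and exits non-zero, which
        # systemd records as status=1/FAILURE for the transient unit.
        return result.returncode == 0

    for name in ("ovn_controller", "ovn_metadata_agent", "nova_compute"):
        print(name, "healthy" if run_healthcheck(name) else "unhealthy")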
Feb 1 03:42:36 localhost podman[95997]: 2026-02-01 08:42:36.715564648 +0000 UTC m=+0.076473166 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public) Feb 1 03:42:36 localhost systemd[1]: tmp-crun.YxyLyy.mount: Deactivated successfully. 
Feb 1 03:42:36 localhost podman[95999]: 2026-02-01 08:42:36.778382703 +0000 UTC m=+0.132012918 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc.) 
Feb 1 03:42:36 localhost podman[95998]: 2026-02-01 08:42:36.832310246 +0000 UTC m=+0.186534180 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:56:19Z, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:42:36 localhost podman[95998]: 2026-02-01 08:42:36.843458968 +0000 UTC m=+0.197682841 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64) Feb 1 03:42:36 localhost podman[95998]: unhealthy Feb 1 03:42:36 localhost podman[95999]: 2026-02-01 08:42:36.84967317 +0000 UTC m=+0.203303375 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:36:40Z, release=1766032510, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git) Feb 1 03:42:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:42:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:42:36 localhost podman[95999]: unhealthy Feb 1 03:42:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:42:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
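These records show the unhealthy path: the health_status event carries health_status=unhealthy, podman writes the bare word unhealthy, and systemd records the process exiting with status=1/FAILURE. Judging from the unit names, which match the healthcheck-run IDs started above, it is the transient check unit that fails here, not the container itself; the containers' own 'restart': 'always' policy in config_data is a separate mechanism. A sketch that separates a one-off blip from a persistent failure by counting consecutive unhealthy results per container (same exported-log assumption and regexes as the earlier sketch):

    import re
    from collections import defaultdict

    NAME = re.compile(r"container_name=([^,)]+)")
    STATUS = re.compile(r"health_status=(healthy|unhealthy)")

    def unhealthy_streaks(lines):
        """Track consecutive unhealthy results per container; a streak of one
        is usually a blip, a growing streak a persistently failing check."""
        streak = defaultdict(int)
        for line in lines:
            if "container health_status" not in line:
                continue
            name, status = NAME.search(line), STATUS.search(line)
            if not (name and status):
                continue
            if status.group(1) == "unhealthy":
                streak[name.group(1)] += 1
                print(f"{name.group(1)}: unhealthy x{streak[name.group(1)]}")
            else:
                streak[name.group(1)] = 0

    with open("messages") as log:   # placeholder path, as above
        unhealthy_streaks(log)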
Feb 1 03:42:36 localhost podman[95997]: 2026-02-01 08:42:36.949338265 +0000 UTC m=+0.310246763 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:42:36 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:42:42 localhost sshd[96064]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:42:43 localhost sshd[96065]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:42:48 localhost systemd[1]: tmp-crun.Zof8Jb.mount: Deactivated successfully. 
Feb 1 03:42:48 localhost podman[96066]: 2026-02-01 08:42:48.747582084 +0000 UTC m=+0.107941031 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, container_name=collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible) Feb 1 03:42:48 localhost podman[96067]: 2026-02-01 08:42:48.767429782 +0000 UTC m=+0.120609838 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=nova_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5) Feb 1 03:42:48 localhost podman[96067]: 2026-02-01 08:42:48.818455876 +0000 UTC m=+0.171635922 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
container_name=nova_compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:42:48 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:42:48 localhost podman[96066]: 2026-02-01 08:42:48.831053202 +0000 UTC m=+0.191412069 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 1 03:42:48 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:42:48 localhost podman[96068]: 2026-02-01 08:42:48.874245267 +0000 UTC m=+0.226108473 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:42:48 localhost podman[96068]: 2026-02-01 08:42:48.926403266 +0000 UTC m=+0.278266452 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, tcib_managed=true, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team)
Feb 1 03:42:48 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:42:58 localhost systemd[1]: tmp-crun.7usYzH.mount: Deactivated successfully.
Feb 1 03:42:58 localhost podman[96253]: 2026-02-01 08:42:58.770630283 +0000 UTC m=+0.122151435 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 1 03:42:58 localhost podman[96252]: 2026-02-01 08:42:58.79249245 +0000 UTC m=+0.148054197 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510)
Feb 1 03:42:58 localhost podman[96253]: 2026-02-01 08:42:58.803533472 +0000 UTC m=+0.155054554 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5)
Feb 1 03:42:58 localhost podman[96252]: 2026-02-01 08:42:58.813864412 +0000 UTC m=+0.169426129 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1766032510)
Feb 1 03:42:58 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:42:58 localhost podman[96254]: 2026-02-01 08:42:58.848907447 +0000 UTC m=+0.194244428 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4)
Feb 1 03:42:58 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:42:58 localhost podman[96254]: 2026-02-01 08:42:58.879875538 +0000 UTC m=+0.225212559 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 1 03:42:58 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:42:59 localhost systemd[1]: tmp-crun.euLGlK.mount: Deactivated successfully.
Feb 1 03:43:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:43:04 localhost systemd[1]: tmp-crun.delOab.mount: Deactivated successfully.
Feb 1 03:43:04 localhost podman[96325]: 2026-02-01 08:43:04.739249161 +0000 UTC m=+0.097013916 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=)
Feb 1 03:43:05 localhost podman[96325]: 2026-02-01 08:43:05.116657473 +0000 UTC m=+0.474422218 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, architecture=x86_64, config_id=tripleo_step4, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 1 03:43:05 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:43:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:43:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:43:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:43:07 localhost systemd[1]: tmp-crun.Vi3Sbf.mount: Deactivated successfully.
Feb 1 03:43:07 localhost podman[96351]: 2026-02-01 08:43:07.713901061 +0000 UTC m=+0.072728704 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, vcs-type=git)
Feb 1 03:43:07 localhost podman[96351]: 2026-02-01 08:43:07.748897234 +0000 UTC m=+0.107724847 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public)
Feb 1 03:43:07 localhost podman[96351]: unhealthy
Feb 1 03:43:07 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:43:07 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:43:07 localhost podman[96349]: 2026-02-01 08:43:07.76813078 +0000 UTC m=+0.126835710 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:43:07 localhost podman[96350]: 2026-02-01 08:43:07.818339446 +0000 UTC m=+0.174532528 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git)
Feb 1 03:43:07 localhost podman[96350]: 2026-02-01 08:43:07.832391261 +0000 UTC m=+0.188584283 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 1 03:43:07 localhost podman[96350]: unhealthy
Feb 1 03:43:07 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:43:07 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:43:07 localhost podman[96349]: 2026-02-01 08:43:07.970258462 +0000 UTC m=+0.328963382 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 1 03:43:07 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:43:08 localhost systemd[1]: tmp-crun.zeQ5CA.mount: Deactivated successfully.
Feb 1 03:43:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:43:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:43:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:43:19 localhost systemd[1]: tmp-crun.E3Ddcz.mount: Deactivated successfully.
Feb 1 03:43:19 localhost systemd[1]: tmp-crun.UK6StB.mount: Deactivated successfully.
Feb 1 03:43:19 localhost podman[96421]: 2026-02-01 08:43:19.787290224 +0000 UTC m=+0.139593145 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 1 03:43:19 localhost podman[96421]: 2026-02-01 08:43:19.795176699 +0000 UTC m=+0.147479620 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:43:19 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:43:19 localhost podman[96420]: 2026-02-01 08:43:19.822241287 +0000 UTC m=+0.175328742 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13)
Feb 1 03:43:19 localhost podman[96419]: 2026-02-01 08:43:19.750881167 +0000 UTC m=+0.106238013 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:43:19 localhost podman[96420]: 2026-02-01 08:43:19.870434199 +0000 UTC m=+0.223521694 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 1 03:43:19 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:43:19 localhost podman[96419]: 2026-02-01 08:43:19.881569404 +0000 UTC m=+0.236926150 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:43:19 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:43:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:43:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:43:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:43:29 localhost systemd[1]: tmp-crun.VzmhIp.mount: Deactivated successfully. 
Feb 1 03:43:29 localhost podman[96486]: 2026-02-01 08:43:29.701259723 +0000 UTC m=+0.057844773 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:43:29 localhost podman[96485]: 2026-02-01 08:43:29.751465079 +0000 UTC m=+0.108818423 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-cron, vcs-type=git) Feb 1 03:43:29 localhost podman[96485]: 2026-02-01 08:43:29.762334235 +0000 UTC m=+0.119687609 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z) Feb 1 03:43:29 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:43:29 localhost podman[96486]: 2026-02-01 08:43:29.775326277 +0000 UTC m=+0.131911307 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1) Feb 1 03:43:29 localhost systemd[1]: 
96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:43:29 localhost podman[96484]: 2026-02-01 08:43:29.859393211 +0000 UTC m=+0.219097648 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:43:29 localhost podman[96484]: 2026-02-01 08:43:29.888303347 +0000 UTC m=+0.248007724 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:43:29 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:43:30 localhost systemd[1]: tmp-crun.07tDaj.mount: Deactivated successfully. Feb 1 03:43:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:43:35 localhost systemd[1]: tmp-crun.rkaBsW.mount: Deactivated successfully. 
Feb 1 03:43:35 localhost podman[96554]: 2026-02-01 08:43:35.73896607 +0000 UTC m=+0.098243025 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510) Feb 1 03:43:36 localhost podman[96554]: 2026-02-01 08:43:36.112298635 +0000 UTC m=+0.471575590 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Feb 1 03:43:36 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:43:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:43:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:43:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:43:38 localhost podman[96578]: 2026-02-01 08:43:38.737409906 +0000 UTC m=+0.087629405 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:43:38 localhost podman[96578]: 2026-02-01 08:43:38.749650506 +0000 UTC m=+0.099869995 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20260112.1, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:43:38 localhost podman[96578]: unhealthy
Feb 1 03:43:38 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:43:38 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:43:38 localhost systemd[1]: tmp-crun.qwMXdE.mount: Deactivated successfully.
Feb 1 03:43:38 localhost podman[96577]: 2026-02-01 08:43:38.796183567 +0000 UTC m=+0.147891813 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:43:38 localhost podman[96579]: 2026-02-01 08:43:38.84405446 +0000 UTC m=+0.190281266 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:43:38 localhost podman[96579]: 2026-02-01 08:43:38.886312199 +0000 UTC m=+0.232538965 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:43:38 localhost podman[96579]: unhealthy
Feb 1 03:43:38 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:43:38 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:43:39 localhost podman[96577]: 2026-02-01 08:43:39.080592398 +0000 UTC m=+0.432300624 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:43:39 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:43:45 localhost sshd[96646]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:43:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:43:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:43:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:43:50 localhost systemd[1]: tmp-crun.4mQIIU.mount: Deactivated successfully. Feb 1 03:43:50 localhost podman[96650]: 2026-02-01 08:43:50.747086674 +0000 UTC m=+0.095544051 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, container_name=iscsid, vcs-type=git) Feb 1 03:43:50 localhost podman[96650]: 2026-02-01 08:43:50.76246102 +0000 UTC m=+0.110918487 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:43:50 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:43:50 localhost podman[96649]: 2026-02-01 08:43:50.847773073 +0000 UTC m=+0.197955793 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, distribution-scope=public, release=1766032510) Feb 1 03:43:50 localhost podman[96648]: 2026-02-01 08:43:50.905820061 +0000 UTC m=+0.257550539 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, container_name=collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 1 03:43:50 localhost podman[96649]: 2026-02-01 08:43:50.927096701 +0000 UTC m=+0.277279381 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:43:50 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:43:50 localhost podman[96648]: 2026-02-01 08:43:50.942288311 +0000 UTC m=+0.294018829 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, container_name=collectd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Feb 1 03:43:50 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:43:51 localhost systemd[1]: tmp-crun.AsluT7.mount: Deactivated successfully. Feb 1 03:44:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:44:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:44:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:44:00 localhost systemd[1]: tmp-crun.pdmWGD.mount: Deactivated successfully. Feb 1 03:44:00 localhost podman[96790]: 2026-02-01 08:44:00.79364793 +0000 UTC m=+0.151793874 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, release=1766032510, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:44:00 localhost podman[96792]: 2026-02-01 08:44:00.798169529 +0000 UTC m=+0.149221973 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, container_name=ceilometer_agent_ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:44:00 localhost podman[96791]: 2026-02-01 08:44:00.757610603 +0000 UTC m=+0.115400386 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, container_name=logrotate_crond, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:44:00 localhost podman[96790]: 2026-02-01 08:44:00.824717872 +0000 UTC m=+0.182863816 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute) Feb 1 03:44:00 localhost podman[96792]: 2026-02-01 08:44:00.833290567 +0000 UTC m=+0.184343031 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public) Feb 1 03:44:00 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:44:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:44:00 localhost podman[96791]: 2026-02-01 08:44:00.842814722 +0000 UTC m=+0.200604465 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-cron, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 1 03:44:00 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:44:01 localhost systemd[1]: tmp-crun.noaEHc.mount: Deactivated successfully. Feb 1 03:44:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:44:06 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:44:06 localhost recover_tripleo_nova_virtqemud[96870]: 61284 Feb 1 03:44:06 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:44:06 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:44:06 localhost podman[96863]: 2026-02-01 08:44:06.725530567 +0000 UTC m=+0.080826505 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:44:07 localhost podman[96863]: 2026-02-01 08:44:07.153678991 +0000 UTC m=+0.508974949 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true) Feb 1 03:44:07 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:44:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:44:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:44:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:44:09 localhost podman[96890]: 2026-02-01 08:44:09.737154843 +0000 UTC m=+0.086941745 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, tcib_managed=true) Feb 1 03:44:09 localhost systemd[1]: tmp-crun.BaWNCG.mount: Deactivated successfully. 
Feb 1 03:44:09 localhost podman[96889]: 2026-02-01 08:44:09.75641928 +0000 UTC m=+0.106899713 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:44:09 localhost podman[96890]: 2026-02-01 08:44:09.77645146 +0000 UTC m=+0.126238362 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:44:09 localhost podman[96890]: unhealthy Feb 1 03:44:09 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:44:09 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:44:09 localhost podman[96891]: 2026-02-01 08:44:09.855286502 +0000 UTC m=+0.199180711 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 1 03:44:09 localhost podman[96891]: 2026-02-01 08:44:09.895405786 +0000 UTC m=+0.239300055 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 1 03:44:09 localhost podman[96891]: unhealthy Feb 1 03:44:09 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:44:09 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:44:09 localhost podman[96889]: 2026-02-01 08:44:09.934273479 +0000 UTC m=+0.284753932 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:10:14Z, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true) Feb 1 03:44:09 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:44:10 localhost systemd[1]: tmp-crun.SyCtM2.mount: Deactivated successfully. Feb 1 03:44:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:44:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:44:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:44:21 localhost systemd[1]: tmp-crun.3OC7He.mount: Deactivated successfully. Feb 1 03:44:21 localhost podman[96957]: 2026-02-01 08:44:21.797293612 +0000 UTC m=+0.153035951 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_compute, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:44:21 localhost podman[96956]: 2026-02-01 08:44:21.750930567 +0000 UTC m=+0.106807011 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 1 03:44:21 localhost podman[96958]: 2026-02-01 08:44:21.83175983 +0000 UTC m=+0.185988982 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public) Feb 1 03:44:21 localhost podman[96956]: 2026-02-01 08:44:21.835318791 +0000 UTC m=+0.191195195 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Feb 1 03:44:21 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:44:21 localhost podman[96957]: 2026-02-01 08:44:21.854408862 +0000 UTC m=+0.210151231 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:44:21 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:44:21 localhost podman[96958]: 2026-02-01 08:44:21.875947449 +0000 UTC m=+0.230176581 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:44:21 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:44:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:44:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:44:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:44:31 localhost systemd[1]: tmp-crun.RPPTfy.mount: Deactivated successfully. 
Feb 1 03:44:31 localhost podman[97020]: 2026-02-01 08:44:31.740467297 +0000 UTC m=+0.090812965 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible) Feb 1 03:44:31 localhost systemd[1]: tmp-crun.YdvdJq.mount: Deactivated successfully. 
Feb 1 03:44:31 localhost podman[97019]: 2026-02-01 08:44:31.789181566 +0000 UTC m=+0.140852844 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:44:31 localhost podman[97019]: 2026-02-01 08:44:31.798473343 +0000 UTC m=+0.150144631 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:44:31 localhost podman[97018]: 2026-02-01 08:44:31.842794247 +0000 UTC m=+0.198606104 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1) Feb 1 03:44:31 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:44:31 localhost podman[97020]: 2026-02-01 08:44:31.875435968 +0000 UTC m=+0.225781626 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:44:31 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:44:31 localhost podman[97018]: 2026-02-01 08:44:31.926432428 +0000 UTC m=+0.282244285 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:44:31 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:44:37 localhost podman[97092]: 2026-02-01 08:44:37.729449104 +0000 UTC m=+0.092743854 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step4) Feb 1 03:44:38 localhost podman[97092]: 2026-02-01 08:44:38.132236542 +0000 UTC m=+0.495531272 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:44:38 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:44:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:44:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:44:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:44:40 localhost systemd[1]: tmp-crun.gL5exy.mount: Deactivated successfully. 
Feb 1 03:44:40 localhost podman[97115]: 2026-02-01 08:44:40.740583794 +0000 UTC m=+0.096814850 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:44:40 localhost podman[97116]: 2026-02-01 08:44:40.799058306 +0000 UTC m=+0.153728984 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, 
container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510) Feb 1 03:44:40 localhost podman[97116]: 2026-02-01 08:44:40.842331917 +0000 UTC m=+0.197002565 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, build-date=2026-01-12T22:56:19Z, version=17.1.13, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:44:40 localhost podman[97116]: unhealthy Feb 1 03:44:40 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:44:40 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:44:40 localhost podman[97117]: 2026-02-01 08:44:40.847950321 +0000 UTC m=+0.198310205 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, release=1766032510, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:44:40 localhost podman[97117]: 2026-02-01 08:44:40.933468119 +0000 UTC m=+0.283828033 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., container_name=ovn_controller, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:44:40 localhost podman[97117]: unhealthy Feb 1 03:44:40 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:44:40 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:44:40 localhost podman[97115]: 2026-02-01 08:44:40.948854527 +0000 UTC m=+0.305085593 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, 
batch=17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:44:40 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:44:41 localhost systemd[1]: tmp-crun.Inz4vE.mount: Deactivated successfully. Feb 1 03:44:46 localhost sshd[97180]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:44:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:44:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:44:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:44:52 localhost podman[97182]: 2026-02-01 08:44:52.717215828 +0000 UTC m=+0.080936037 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, url=https://www.redhat.com) Feb 1 03:44:52 localhost systemd[1]: tmp-crun.gmolAW.mount: Deactivated successfully. Feb 1 03:44:52 localhost podman[97184]: 2026-02-01 08:44:52.726875268 +0000 UTC m=+0.081320860 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, container_name=iscsid, tcib_managed=true, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc.) 
Feb 1 03:44:52 localhost podman[97182]: 2026-02-01 08:44:52.727721854 +0000 UTC m=+0.091442073 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=) Feb 1 03:44:52 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:44:52 localhost podman[97184]: 2026-02-01 08:44:52.80960515 +0000 UTC m=+0.164050732 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 1 03:44:52 localhost podman[97183]: 2026-02-01 08:44:52.824873044 +0000 UTC m=+0.182086032 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:44:52 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:44:52 localhost podman[97183]: 2026-02-01 08:44:52.851071525 +0000 UTC m=+0.208284463 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:44:52 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:44:54 localhost systemd[1]: tmp-crun.6wpJlE.mount: Deactivated successfully. 
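systemd names each healthcheck's transient unit after the full 64-hex container ID (the b06b2f5f….service and 5995785e….service lines above), while the podman events carry both that ID and a readable name=… label. A sketch that builds the ID-to-name map, with the regex inferred from these lines rather than from any documented podman format:

    import re

    # Pattern inferred from the event lines in this log, e.g.
    # "container exec_died <64 hex chars> (image=..., name=nova_compute, ...".
    EVENT_RE = re.compile(
        r"container \w+ (?P<cid>[0-9a-f]{64}) "
        r"\(image=(?P<image>[^,]+), name=(?P<name>[^,]+)"
    )

    def container_names(lines):
        """Map full container IDs to their name= labels (illustrative helper)."""
        names = {}
        for line in lines:
            for m in EVENT_RE.finditer(line):
                names[m.group("cid")] = m.group("name")
        return names

With that map, a bare "5995785e4c08….service: Deactivated successfully." entry reads as "the nova_compute healthcheck unit exited cleanly".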
Feb 1 03:44:54 localhost podman[97348]: 2026-02-01 08:44:54.412268479 +0000 UTC m=+0.104139218 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Guillaume Abrioux , release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 03:44:54 localhost podman[97348]: 2026-02-01 08:44:54.559827809 +0000 UTC m=+0.251698448 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-12-08T17:28:53Z, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public) Feb 1 03:45:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:45:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:45:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:45:02 localhost systemd[1]: tmp-crun.b72QoJ.mount: Deactivated successfully. 
Feb 1 03:45:02 localhost podman[97496]: 2026-02-01 08:45:02.739434885 +0000 UTC m=+0.094416021 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Feb 1 03:45:02 localhost podman[97496]: 2026-02-01 08:45:02.773317101 +0000 UTC m=+0.128298177 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:45:02 localhost systemd[1]: tmp-crun.q9nYom.mount: Deactivated successfully. Feb 1 03:45:02 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
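The 'volumes' lists in these config_data blocks all use the same source:target[:options] shape, where trailing flags such as ro, rw, z (SELinux relabel) and shared (mount propagation) are bind options. A small illustrative parser, under the assumption that no path in these specs contains a colon (true of every spec in this log):

    def parse_volume(spec):
        """Split a bind spec like '/run/libvirt:/run/libvirt:shared,z'
        into (source, target, [options]); illustrative only."""
        parts = spec.split(":")
        options = parts[2].split(",") if len(parts) > 2 else []
        return parts[0], parts[1], options

For example, '/var/log/containers/ceilometer:/var/log/ceilometer:z' parses to a relabeled read-write bind, while '/dev/log:/dev/log' carries no options at all.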
Feb 1 03:45:02 localhost podman[97494]: 2026-02-01 08:45:02.831932851 +0000 UTC m=+0.188039932 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:45:02 localhost podman[97495]: 2026-02-01 08:45:02.798326134 +0000 UTC m=+0.152750112 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, 
maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true) Feb 1 03:45:02 localhost podman[97495]: 2026-02-01 08:45:02.882526063 +0000 UTC m=+0.236949981 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-type=git, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.13) Feb 1 03:45:02 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:45:02 localhost podman[97494]: 2026-02-01 08:45:02.938373638 +0000 UTC m=+0.294480779 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5) Feb 1 03:45:02 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:45:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:45:08 localhost systemd[1]: tmp-crun.c2j15x.mount: Deactivated successfully. Feb 1 03:45:08 localhost podman[97565]: 2026-02-01 08:45:08.723529914 +0000 UTC m=+0.084196400 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 1 03:45:09 localhost podman[97565]: 2026-02-01 08:45:09.089550287 +0000 UTC m=+0.450216623 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 1 03:45:09 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:45:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:45:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:45:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
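The m=+… value on each podman line is a monotonic-clock offset (Go's time formatting) since that podman process started, so the spread between the first and last events from one podman PID bounds how long that healthcheck invocation took: for nova_migration_target just above, health_status at m=+0.084196400 and exec_died at m=+0.450216623, roughly 0.37 s. A rough sketch keyed on PID, with the line format again inferred from this log:

    import re

    # "podman[PID]: <date> <time> <tz> UTC m=+SECONDS container EVENT ..."
    PODMAN_RE = re.compile(
        r"podman\[(?P<pid>\d+)\]: \S+ \S+ \S+ \S+ "
        r"m=\+(?P<off>[0-9.]+) container (?P<event>\w+)"
    )

    def exec_spans(lines):
        """Yield (pid, seconds) between each podman PID's first and last event."""
        first, last = {}, {}
        for line in lines:
            for m in PODMAN_RE.finditer(line):
                pid, off = m.group("pid"), float(m.group("off"))
                first.setdefault(pid, off)
                last[pid] = off
        for pid, start in first.items():
            yield pid, last[pid] - start

PIDs are reused over a long journal, so this is only safe within a short window like the one shown here.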
Feb 1 03:45:11 localhost podman[97586]: 2026-02-01 08:45:11.723177198 +0000 UTC m=+0.081111225 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.13, vcs-type=git, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 1 03:45:11 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:45:11 localhost systemd[1]: tmp-crun.C0ALgl.mount: Deactivated successfully. Feb 1 03:45:11 localhost recover_tripleo_nova_virtqemud[97640]: 61284 Feb 1 03:45:11 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:45:11 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:45:11 localhost podman[97587]: 2026-02-01 08:45:11.789548508 +0000 UTC m=+0.140903812 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:45:11 localhost podman[97587]: 2026-02-01 08:45:11.800275569 +0000 UTC m=+0.151630883 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 1 03:45:11 localhost podman[97587]: unhealthy Feb 1 03:45:11 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:45:11 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
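Two healthcheck outcomes are visible in this window. A passing check ends with its transient unit logging 'Deactivated successfully'; a failing one, like ovn_metadata_agent here, has podman log health_status=unhealthy, print 'unhealthy', and exit 1, which systemd records as status=1/FAILURE and a result of 'exit-code'. A grep-style sketch that pulls those failure markers together (marker strings copied from the lines above; the helper name is illustrative):

    def failing_checks(lines):
        """Yield journal lines marking a failed healthcheck or its systemd fallout."""
        markers = (
            "health_status=unhealthy",
            "status=1/FAILURE",
            "Failed with result 'exit-code'",
        )
        for line in lines:
            if any(marker in line for marker in markers):
                yield line

Combined with the ID-to-name map sketched earlier, this ties each FAILURE line back to ovn_metadata_agent or ovn_controller rather than to a bare container ID.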
Feb 1 03:45:11 localhost podman[97588]: 2026-02-01 08:45:11.753320429 +0000 UTC m=+0.102562668 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:45:11 localhost podman[97588]: 2026-02-01 08:45:11.88288537 +0000 UTC m=+0.232127639 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller) Feb 1 03:45:11 localhost podman[97588]: unhealthy Feb 1 03:45:11 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:45:11 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:45:11 localhost podman[97586]: 2026-02-01 08:45:11.921444641 +0000 UTC m=+0.279378708 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, 
architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step1) Feb 1 03:45:11 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:45:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:45:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:45:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:45:23 localhost systemd[1]: tmp-crun.fRiW4Q.mount: Deactivated successfully. Feb 1 03:45:23 localhost podman[97652]: 2026-02-01 08:45:23.752513545 +0000 UTC m=+0.105972024 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public) Feb 1 03:45:23 localhost podman[97652]: 2026-02-01 08:45:23.782634615 +0000 UTC m=+0.136093124 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, release=1766032510, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:45:23 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:45:23 localhost podman[97651]: 2026-02-01 08:45:23.84240368 +0000 UTC m=+0.200945826 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:45:23 localhost podman[97651]: 2026-02-01 08:45:23.856266588 +0000 UTC m=+0.214808774 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, container_name=collectd, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 1 03:45:23 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:45:23 localhost podman[97653]: 2026-02-01 08:45:23.79446134 +0000 UTC m=+0.145109332 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Feb 1 03:45:23 localhost podman[97653]: 2026-02-01 08:45:23.925539847 +0000 UTC m=+0.276187849 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:45:23 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:45:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:45:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:45:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:45:33 localhost systemd[1]: tmp-crun.iQmsgc.mount: Deactivated successfully. 
Feb 1 03:45:33 localhost podman[97717]: 2026-02-01 08:45:33.723075869 +0000 UTC m=+0.075859393 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, tcib_managed=true, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 1 03:45:33 localhost podman[97717]: 2026-02-01 08:45:33.735812843 +0000 UTC m=+0.088596387 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team) Feb 1 03:45:33 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:45:33 localhost systemd[1]: tmp-crun.V698cl.mount: Deactivated successfully. 
Feb 1 03:45:33 localhost podman[97718]: 2026-02-01 08:45:33.782226606 +0000 UTC m=+0.133771542 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi) Feb 1 03:45:33 localhost podman[97716]: 2026-02-01 08:45:33.837052179 +0000 UTC m=+0.195467397 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 1 03:45:33 localhost podman[97716]: 2026-02-01 08:45:33.88825168 +0000 UTC m=+0.246666908 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5) Feb 1 03:45:33 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:45:33 localhost podman[97718]: 2026-02-01 08:45:33.940455312 +0000 UTC m=+0.292000288 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:45:33 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:45:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:45:39 localhost systemd[1]: tmp-crun.lccSUc.mount: Deactivated successfully. Feb 1 03:45:39 localhost podman[97785]: 2026-02-01 08:45:39.747026758 +0000 UTC m=+0.111269406 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc.) 
Feb 1 03:45:40 localhost podman[97785]: 2026-02-01 08:45:40.134514854 +0000 UTC m=+0.498757532 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:45:40 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:45:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:45:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:45:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:45:42 localhost podman[97809]: 2026-02-01 08:45:42.72518906 +0000 UTC m=+0.083338355 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z) Feb 1 03:45:42 localhost podman[97808]: 2026-02-01 08:45:42.780209608 +0000 UTC m=+0.142174141 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, 
managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 1 03:45:42 localhost podman[97809]: 2026-02-01 08:45:42.797346897 +0000 UTC m=+0.155496122 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1) Feb 1 03:45:42 localhost podman[97809]: unhealthy Feb 1 03:45:42 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:45:42 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:45:42 localhost podman[97810]: 2026-02-01 08:45:42.881837566 +0000 UTC m=+0.234423349 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4) Feb 1 03:45:42 localhost podman[97810]: 2026-02-01 08:45:42.89977673 +0000 UTC m=+0.252362513 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team) Feb 1 03:45:42 localhost podman[97810]: unhealthy Feb 1 03:45:42 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:45:42 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:45:42 localhost podman[97808]: 2026-02-01 08:45:42.985426274 +0000 UTC m=+0.347390847 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:45:42 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:45:54 localhost podman[97875]: 2026-02-01 08:45:54.722840227 +0000 UTC m=+0.080296790 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, tcib_managed=true, config_id=tripleo_step3) Feb 1 03:45:54 localhost podman[97875]: 2026-02-01 08:45:54.736328194 +0000 UTC m=+0.093784757 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team) Feb 1 03:45:54 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:45:54 localhost systemd[1]: tmp-crun.aYcknp.mount: Deactivated successfully. 
Feb 1 03:45:54 localhost podman[97877]: 2026-02-01 08:45:54.877757021 +0000 UTC m=+0.228717933 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 1 03:45:54 localhost podman[97876]: 2026-02-01 08:45:54.858433554 +0000 UTC m=+0.211745479 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute)
Feb 1 03:45:54 localhost podman[97876]: 2026-02-01 08:45:54.942754968 +0000 UTC m=+0.296066933 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step5, vendor=Red Hat, Inc.)
Feb 1 03:45:54 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:45:54 localhost podman[97877]: 2026-02-01 08:45:54.961449265 +0000 UTC m=+0.312410177 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:45:54 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:45:55 localhost systemd[1]: tmp-crun.Vzz8JM.mount: Deactivated successfully.
Feb 1 03:45:59 localhost sshd[98016]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:46:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:46:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:46:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:46:04 localhost systemd[1]: tmp-crun.zxa3Ix.mount: Deactivated successfully.
Feb 1 03:46:04 localhost podman[98020]: 2026-02-01 08:46:04.737801763 +0000 UTC m=+0.093602831 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 1 03:46:04 localhost podman[98018]: 2026-02-01 08:46:04.720135297 +0000 UTC m=+0.076492252 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 1 03:46:04 localhost podman[98018]: 2026-02-01 08:46:04.799289421 +0000 UTC m=+0.155646376 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public)
Feb 1 03:46:04 localhost podman[98020]: 2026-02-01 08:46:04.809011732 +0000 UTC m=+0.164812800 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, container_name=ceilometer_agent_ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:46:04 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:46:04 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:46:04 localhost podman[98019]: 2026-02-01 08:46:04.770859504 +0000 UTC m=+0.127920672 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=)
Feb 1 03:46:04 localhost podman[98019]: 2026-02-01 08:46:04.854525047 +0000 UTC m=+0.211586235 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, tcib_managed=true, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container)
Feb 1 03:46:04 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:46:05 localhost systemd[1]: tmp-crun.V1Xh4N.mount: Deactivated successfully.
Feb 1 03:46:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:46:10 localhost podman[98088]: 2026-02-01 08:46:10.733237442 +0000 UTC m=+0.093140517 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:46:11 localhost podman[98088]: 2026-02-01 08:46:11.100361429 +0000 UTC m=+0.460264494 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:46:11 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:46:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:46:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:46:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:46:13 localhost systemd[1]: tmp-crun.PC2u0n.mount: Deactivated successfully.
Feb 1 03:46:13 localhost podman[98113]: 2026-02-01 08:46:13.720195615 +0000 UTC m=+0.078185496 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr)
Feb 1 03:46:13 localhost podman[98114]: 2026-02-01 08:46:13.748106247 +0000 UTC m=+0.101474235 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 1 03:46:13 localhost podman[98115]: 2026-02-01 08:46:13.772372025 +0000 UTC m=+0.126053573 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, release=1766032510)
Feb 1 03:46:13 localhost podman[98114]: 2026-02-01 08:46:13.83241712 +0000 UTC m=+0.185785068 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Feb 1 03:46:13 localhost podman[98114]: unhealthy
Feb 1 03:46:13 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:46:13 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:46:13 localhost podman[98115]: 2026-02-01 08:46:13.854346837 +0000 UTC m=+0.208028375 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:46:13 localhost podman[98115]: unhealthy
Feb 1 03:46:13 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:46:13 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:46:13 localhost podman[98113]: 2026-02-01 08:46:13.939602659 +0000 UTC m=+0.297592600 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, vcs-type=git, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:46:13 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:46:25 localhost systemd[1]: tmp-crun.yymFkP.mount: Deactivated successfully.
Feb 1 03:46:25 localhost systemd[1]: tmp-crun.xaKCUy.mount: Deactivated successfully.
Feb 1 03:46:25 localhost podman[98182]: 2026-02-01 08:46:25.784620913 +0000 UTC m=+0.141286243 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute)
Feb 1 03:46:25 localhost podman[98181]: 2026-02-01 08:46:25.743649619 +0000 UTC m=+0.102318101 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, container_name=collectd, tcib_managed=true)
Feb 1 03:46:25 localhost podman[98181]: 2026-02-01 08:46:25.827468887 +0000 UTC m=+0.186137359 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd)
Feb 1 03:46:25 localhost podman[98182]: 2026-02-01 08:46:25.837379643 +0000 UTC m=+0.194044963 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:46:25 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:46:25 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:46:25 localhost podman[98183]: 2026-02-01 08:46:25.844376319 +0000 UTC m=+0.194805987 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:46:25 localhost podman[98183]: 2026-02-01 08:46:25.931822499 +0000 UTC m=+0.282252157 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI,
maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:46:25 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:46:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:46:29 localhost recover_tripleo_nova_virtqemud[98248]: 61284 Feb 1 03:46:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:46:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:46:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:46:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:46:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:46:35 localhost podman[98250]: 2026-02-01 08:46:35.717463663 +0000 UTC m=+0.075379389 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:46:35 localhost systemd[1]: tmp-crun.9hRxmp.mount: Deactivated successfully. 
Feb 1 03:46:35 localhost podman[98249]: 2026-02-01 08:46:35.741399822 +0000 UTC m=+0.097618805 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:46:35 localhost podman[98249]: 2026-02-01 08:46:35.76951237 +0000 UTC m=+0.125731433 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:46:35 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:46:35 localhost podman[98251]: 2026-02-01 08:46:35.784526983 +0000 UTC m=+0.136352751 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 1 03:46:35 localhost podman[98250]: 2026-02-01 08:46:35.814211619 +0000 UTC m=+0.172127345 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:46:35 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:46:35 localhost podman[98251]: 2026-02-01 08:46:35.866087082 +0000 UTC m=+0.217912790 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, architecture=x86_64, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public) Feb 1 03:46:35 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:46:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:46:41 localhost systemd[1]: tmp-crun.Hm7bAK.mount: Deactivated successfully. 
Feb 1 03:46:41 localhost podman[98322]: 2026-02-01 08:46:41.739556425 +0000 UTC m=+0.089227476 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:46:42 localhost podman[98322]: 2026-02-01 08:46:42.105312128 +0000 UTC m=+0.454983169 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target) Feb 1 03:46:42 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:46:44 localhost systemd[1]: tmp-crun.lWLyZw.mount: Deactivated successfully. 
Feb 1 03:46:44 localhost podman[98347]: 2026-02-01 08:46:44.718165159 +0000 UTC m=+0.074470631 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git) Feb 1 03:46:44 localhost podman[98346]: 2026-02-01 08:46:44.73635333 +0000 UTC m=+0.093577050 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:46:44 localhost podman[98347]: 2026-02-01 08:46:44.738457695 +0000 UTC m=+0.094763127 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 1 03:46:44 localhost podman[98347]: unhealthy Feb 1 03:46:44 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:46:44 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:46:44 localhost podman[98346]: 2026-02-01 08:46:44.819375094 +0000 UTC m=+0.176598804 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true) Feb 1 03:46:44 localhost podman[98346]: unhealthy Feb 1 03:46:44 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:46:44 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:46:44 localhost podman[98345]: 2026-02-01 08:46:44.820958973 +0000 UTC m=+0.180787054 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:46:45 localhost podman[98345]: 2026-02-01 08:46:45.010289429 +0000 UTC m=+0.370117490 container exec_died 
5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 1 03:46:45 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:46:45 localhost systemd[1]: tmp-crun.Y5eb2T.mount: Deactivated successfully. Feb 1 03:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:46:56 localhost systemd[1]: tmp-crun.feEnJt.mount: Deactivated successfully. 
Feb 1 03:46:56 localhost podman[98414]: 2026-02-01 08:46:56.731974544 +0000 UTC m=+0.081537669 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, release=1766032510) Feb 1 03:46:56 localhost podman[98415]: 2026-02-01 08:46:56.750056493 +0000 UTC m=+0.093927622 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:46:56 localhost podman[98415]: 2026-02-01 08:46:56.763514489 +0000 UTC m=+0.107385688 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:46:56 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:46:56 localhost podman[98414]: 2026-02-01 08:46:56.78946034 +0000 UTC m=+0.139023475 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:46:56 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:46:56 localhost podman[98413]: 2026-02-01 08:46:56.842950011 +0000 UTC m=+0.190129052 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team) Feb 1 03:46:56 localhost podman[98413]: 2026-02-01 08:46:56.852854817 +0000 UTC m=+0.200033878 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible) Feb 1 03:46:56 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:46:57 localhost systemd[1]: tmp-crun.C3eR6Q.mount: Deactivated successfully. Feb 1 03:47:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:47:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:47:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:47:06 localhost systemd[1]: tmp-crun.8s1aiM.mount: Deactivated successfully. Feb 1 03:47:06 localhost podman[98555]: 2026-02-01 08:47:06.749231481 +0000 UTC m=+0.100267088 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:47:06 localhost systemd[1]: 
tmp-crun.wNUonS.mount: Deactivated successfully. Feb 1 03:47:06 localhost podman[98556]: 2026-02-01 08:47:06.803718764 +0000 UTC m=+0.153015707 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 1 03:47:06 localhost podman[98556]: 2026-02-01 08:47:06.839431986 +0000 UTC m=+0.188728959 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 1 03:47:06 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
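The config_data blobs embedded in the podman events above are Python dict literals written out by tripleo_ansible, not JSON (note True rather than true, and single quotes). A minimal sketch of pulling the healthcheck test and volume mounts out of such a blob, assuming the blob has already been isolated from its journal line; the literal below is an abridged stand-in for the full iscsid entry:

    import ast

    # abridged stand-in for a config_data blob as logged above (Python literal, not JSON)
    config_blob = (
        "{'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, "
        "'healthcheck': {'test': '/openstack/healthcheck'}, "
        "'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', "
        "'net': 'host', 'privileged': True, 'restart': 'always', "
        "'volumes': ['/etc/hosts:/etc/hosts:ro', '/dev:/dev']}"
    )

    config = ast.literal_eval(config_blob)      # handles True, nested dicts, lists
    print(config["healthcheck"]["test"])        # -> /openstack/healthcheck
    for vol in config["volumes"]:               # host:container[:options]
        src, dst, *opts = vol.split(":")
        print(f"{src} -> {dst} ({opts[0] if opts else 'rw'})")

ast.literal_eval only accepts literals, so unlike eval() it is safe to run over untrusted log content.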
Feb 1 03:47:06 localhost podman[98554]: 2026-02-01 08:47:06.8583355 +0000 UTC m=+0.213451202 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64) Feb 1 03:47:06 localhost podman[98555]: 2026-02-01 08:47:06.865686817 +0000 UTC m=+0.216722484 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': 
{'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 1 03:47:06 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:47:06 localhost podman[98554]: 2026-02-01 08:47:06.89332408 +0000 UTC m=+0.248439792 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:47:06 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:47:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:47:12 localhost systemd[1]: tmp-crun.slO7Fc.mount: Deactivated successfully. 
Feb 1 03:47:12 localhost podman[98626]: 2026-02-01 08:47:12.73171956 +0000 UTC m=+0.086918275 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=) Feb 1 03:47:13 localhost podman[98626]: 2026-02-01 08:47:13.108569586 +0000 UTC m=+0.463768291 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team) Feb 1 03:47:13 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:47:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:47:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:47:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:47:15 localhost systemd[1]: tmp-crun.Uh3aTd.mount: Deactivated successfully. 
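Each "Started /usr/bin/podman healthcheck run <id>" line above marks one firing of the transient systemd unit that drives a container's healthcheck. A rough sketch of recovering the per-container cadence from a saved copy of this journal (assumptions: the excerpt is saved as messages.log, and a year is pinned because the syslog prefix carries none). Within this excerpt each unit fires only once, but over a longer capture the gaps expose the configured interval; the collectd/nova_compute/iscsid events at 03:46:56 and 03:47:27 suggest roughly 30 seconds here:

    import re
    from collections import defaultdict
    from datetime import datetime
    from pathlib import Path

    log = Path("messages.log").read_text()   # placeholder path for this excerpt

    pat = re.compile(
        r"(\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) \S+ systemd\[1\]: "
        r"Started /usr/bin/podman healthcheck run ([0-9a-f]{64})"
    )

    runs = defaultdict(list)
    for ts, cid in pat.findall(log):
        # the syslog prefix carries no year; pin one so stamps can be differenced
        stamp = datetime.strptime("2026 " + " ".join(ts.split()), "%Y %b %d %H:%M:%S")
        runs[cid].append(stamp)

    for cid, times in sorted(runs.items()):
        gaps = [round((b - a).total_seconds()) for a, b in zip(times, times[1:])]
        print(cid[:12], gaps if gaps else "one run in this excerpt")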
Feb 1 03:47:15 localhost podman[98649]: 2026-02-01 08:47:15.747187732 +0000 UTC m=+0.105167548 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:10:14Z) Feb 1 03:47:15 localhost systemd[1]: tmp-crun.rUHwav.mount: Deactivated successfully. 
Feb 1 03:47:15 localhost podman[98650]: 2026-02-01 08:47:15.797703052 +0000 UTC m=+0.153690577 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:47:15 localhost podman[98651]: 2026-02-01 08:47:15.847707536 +0000 UTC m=+0.196778377 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com) Feb 1 03:47:15 localhost podman[98650]: 2026-02-01 08:47:15.866144226 +0000 UTC m=+0.222131741 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:56:19Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5) Feb 1 03:47:15 localhost podman[98651]: 2026-02-01 08:47:15.866358202 +0000 UTC m=+0.215429013 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4) Feb 1 
03:47:15 localhost podman[98650]: unhealthy Feb 1 03:47:15 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:47:15 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:47:15 localhost podman[98651]: unhealthy Feb 1 03:47:15 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:47:15 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:47:15 localhost podman[98649]: 2026-02-01 08:47:15.993538189 +0000 UTC m=+0.351517975 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 1 03:47:16 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. 
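ovn_metadata_agent and ovn_controller are the two checks that come back unhealthy in this window: podman prints the verdict, the transient healthcheck unit exits 1, and systemd records the .service as Failed with result 'exit-code', while the container itself keeps running. A sketch that pairs each unhealthy verdict with the matching unit failure (assumptions: the excerpt is saved as messages.log, and the label order image=, name=, health_status= matches what podman prints in these lines; podman does not guarantee label ordering in general):

    import re
    from pathlib import Path

    log = Path("messages.log").read_text()   # placeholder path for this excerpt

    # label order (image=, name=, health_status=) follows the lines above
    unhealthy = re.findall(
        r"container health_status ([0-9a-f]{64}) "
        r"\(image=[^,]+, name=([^,]+), health_status=unhealthy",
        log,
    )
    failed = set(
        re.findall(r"([0-9a-f]{64})\.service: Failed with result 'exit-code'", log)
    )

    for cid, name in unhealthy:
        verdict = "transient unit failed" if cid in failed else "no unit failure seen"
        print(f"{name}: unhealthy at least once; {verdict}")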
Feb 1 03:47:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:47:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:47:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:47:27 localhost podman[98715]: 2026-02-01 08:47:27.719550239 +0000 UTC m=+0.082065965 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:47:27 localhost systemd[1]: tmp-crun.XoXMI6.mount: Deactivated successfully. 
Feb 1 03:47:27 localhost podman[98716]: 2026-02-01 08:47:27.785084432 +0000 UTC m=+0.139302882 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z) Feb 1 03:47:27 localhost podman[98715]: 2026-02-01 08:47:27.800866319 +0000 UTC m=+0.163381975 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, container_name=collectd, vendor=Red Hat, Inc.) 
Feb 1 03:47:27 localhost podman[98722]: 2026-02-01 08:47:27.756068226 +0000 UTC m=+0.104361083 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:47:27 localhost podman[98716]: 2026-02-01 08:47:27.813301983 +0000 UTC m=+0.167520353 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:47:27 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:47:27 localhost podman[98722]: 2026-02-01 08:47:27.83488506 +0000 UTC m=+0.183177917 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Feb 1 03:47:27 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:47:27 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:47:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:47:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:47:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:47:37 localhost podman[98781]: 2026-02-01 08:47:37.730966584 +0000 UTC m=+0.086834942 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 1 03:47:37 localhost systemd[1]: tmp-crun.nM0rXv.mount: Deactivated successfully. 
Feb 1 03:47:37 localhost podman[98781]: 2026-02-01 08:47:37.795376423 +0000 UTC m=+0.151244741 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 1 03:47:37 localhost podman[98782]: 2026-02-01 08:47:37.80663816 +0000 UTC m=+0.158931288 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, container_name=logrotate_crond, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
version=17.1.13, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:47:37 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:47:37 localhost podman[98783]: 2026-02-01 08:47:37.853944031 +0000 UTC m=+0.201746781 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z) Feb 1 03:47:37 localhost podman[98782]: 2026-02-01 08:47:37.867341824 +0000 UTC m=+0.219634952 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 1 03:47:37 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:47:37 localhost podman[98783]: 2026-02-01 08:47:37.883213305 +0000 UTC m=+0.231016085 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510) Feb 1 03:47:37 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:47:42 localhost sshd[98853]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:47:43 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:47:43 localhost recover_tripleo_nova_virtqemud[98859]: 61284 Feb 1 03:47:43 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:47:43 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:47:43 localhost podman[98855]: 2026-02-01 08:47:43.628860431 +0000 UTC m=+0.100630869 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git) Feb 1 03:47:44 localhost podman[98855]: 2026-02-01 08:47:44.061320664 +0000 UTC m=+0.533091092 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:47:44 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:47:46 localhost podman[98882]: 2026-02-01 08:47:46.728096759 +0000 UTC m=+0.079727362 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller) Feb 1 03:47:46 localhost podman[98882]: 2026-02-01 08:47:46.740871314 +0000 UTC m=+0.092501917 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:36:40Z) Feb 1 03:47:46 localhost systemd[1]: tmp-crun.KS5JoN.mount: Deactivated successfully. Feb 1 03:47:46 localhost podman[98882]: unhealthy Feb 1 03:47:46 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:47:46 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:47:46 localhost podman[98880]: 2026-02-01 08:47:46.841284625 +0000 UTC m=+0.196740277 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 1 03:47:46 localhost podman[98881]: 2026-02-01 08:47:46.798146022 +0000 UTC m=+0.150122276 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
container_name=ovn_metadata_agent) Feb 1 03:47:46 localhost podman[98881]: 2026-02-01 08:47:46.878124432 +0000 UTC m=+0.230100686 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 1 03:47:46 localhost podman[98881]: unhealthy Feb 1 03:47:46 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:47:46 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:47:47 localhost podman[98880]: 2026-02-01 08:47:47.059267765 +0000 UTC m=+0.414723477 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:47:47 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:47:47 localhost systemd[1]: session-28.scope: Deactivated successfully. Feb 1 03:47:47 localhost systemd[1]: session-28.scope: Consumed 7min 24.366s CPU time. Feb 1 03:47:47 localhost systemd-logind[759]: Session 28 logged out. Waiting for processes to exit. Feb 1 03:47:47 localhost systemd-logind[759]: Removed session 28. Feb 1 03:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:47:57 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 1 03:47:57 localhost systemd[35572]: Activating special unit Exit the Session... Feb 1 03:47:57 localhost systemd[35572]: Removed slice User Background Tasks Slice. Feb 1 03:47:57 localhost systemd[35572]: Stopped target Main User Target. Feb 1 03:47:57 localhost systemd[35572]: Stopped target Basic System. Feb 1 03:47:57 localhost systemd[35572]: Stopped target Paths. Feb 1 03:47:57 localhost systemd[35572]: Stopped target Sockets. Feb 1 03:47:57 localhost systemd[35572]: Stopped target Timers. Feb 1 03:47:57 localhost systemd[35572]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 1 03:47:57 localhost systemd[35572]: Stopped Daily Cleanup of User's Temporary Directories. Feb 1 03:47:57 localhost systemd[35572]: Closed D-Bus User Message Bus Socket. Feb 1 03:47:57 localhost systemd[35572]: Stopped Create User's Volatile Files and Directories. Feb 1 03:47:57 localhost systemd[35572]: Removed slice User Application Slice. Feb 1 03:47:57 localhost systemd[35572]: Reached target Shutdown. Feb 1 03:47:57 localhost systemd[35572]: Finished Exit the Session. Feb 1 03:47:57 localhost systemd[35572]: Reached target Exit the Session. Feb 1 03:47:57 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 1 03:47:57 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 1 03:47:57 localhost systemd[1]: user@1003.service: Consumed 5.450s CPU time, read 0B from disk, written 7.0K to disk. Feb 1 03:47:57 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 1 03:47:57 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 1 03:47:57 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 1 03:47:57 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 1 03:47:57 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 1 03:47:57 localhost systemd[1]: user-1003.slice: Consumed 7min 29.846s CPU time. 
Feb 1 03:47:58 localhost podman[98952]: 2026-02-01 08:47:58.002098072 +0000 UTC m=+0.099155643 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container) Feb 1 03:47:58 localhost podman[98952]: 2026-02-01 08:47:58.039532178 +0000 UTC m=+0.136589719 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 1 03:47:58 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:47:58 localhost podman[98951]: 2026-02-01 08:47:58.059791063 +0000 UTC m=+0.162887851 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Feb 1 03:47:58 localhost podman[98953]: 2026-02-01 08:47:58.096643241 +0000 UTC m=+0.189255435 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:47:58 localhost podman[98953]: 2026-02-01 08:47:58.108271501 +0000 UTC m=+0.200883705 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:47:58 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:47:58 localhost podman[98951]: 2026-02-01 08:47:58.148977947 +0000 UTC m=+0.252074695 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container) Feb 1 03:47:58 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:47:58 localhost systemd[1]: tmp-crun.bcWs4y.mount: Deactivated successfully. Feb 1 03:48:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:48:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:48:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:48:08 localhost podman[99098]: 2026-02-01 08:48:08.769762108 +0000 UTC m=+0.073104898 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=logrotate_crond, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:48:08 localhost podman[99098]: 2026-02-01 08:48:08.781375548 +0000 UTC m=+0.084718328 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 1 03:48:08 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:48:08 localhost podman[99099]: 2026-02-01 08:48:08.840533495 +0000 UTC m=+0.141386238 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:48:08 localhost podman[99099]: 2026-02-01 08:48:08.8743925 +0000 UTC m=+0.175245253 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, tcib_managed=true) Feb 1 03:48:08 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:48:08 localhost podman[99097]: 2026-02-01 08:48:08.900530147 +0000 UTC m=+0.203984330 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:07:47Z, tcib_managed=true, release=1766032510) Feb 1 03:48:08 localhost podman[99097]: 2026-02-01 08:48:08.964642046 +0000 UTC m=+0.268096189 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Feb 1 03:48:08 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:48:14 localhost podman[99170]: 2026-02-01 08:48:14.733209099 +0000 UTC m=+0.089369260 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Feb 1 03:48:15 localhost podman[99170]: 2026-02-01 08:48:15.109118287 +0000 UTC m=+0.465278448 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true) Feb 1 03:48:15 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:48:16 localhost sshd[99193]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:48:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:48:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:48:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:48:17 localhost podman[99196]: 2026-02-01 08:48:17.17701525 +0000 UTC m=+0.095825170 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller) Feb 1 03:48:17 localhost podman[99196]: 2026-02-01 08:48:17.219812552 +0000 UTC m=+0.138622502 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:48:17 localhost systemd[1]: tmp-crun.kEtq2F.mount: Deactivated successfully. Feb 1 03:48:17 localhost podman[99196]: unhealthy Feb 1 03:48:17 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:48:17 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:48:17 localhost podman[99225]: 2026-02-01 08:48:17.275486722 +0000 UTC m=+0.091377663 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, distribution-scope=public) Feb 1 03:48:17 localhost podman[99195]: 2026-02-01 08:48:17.238883791 +0000 UTC m=+0.159323550 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64) Feb 1 03:48:17 localhost podman[99195]: 2026-02-01 08:48:17.32434494 +0000 UTC m=+0.244784699 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, architecture=x86_64) Feb 1 03:48:17 localhost podman[99195]: unhealthy Feb 1 03:48:17 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:48:17 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:48:17 localhost podman[99225]: 2026-02-01 08:48:17.476521368 +0000 UTC m=+0.292412299 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=) Feb 1 03:48:17 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:48:18 localhost systemd[1]: tmp-crun.NDFxsS.mount: Deactivated successfully. Feb 1 03:48:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:48:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:48:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:48:28 localhost systemd[1]: tmp-crun.laKwlg.mount: Deactivated successfully. 
Feb 1 03:48:28 localhost podman[99266]: 2026-02-01 08:48:28.726964253 +0000 UTC m=+0.080631969 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Feb 1 03:48:28 localhost podman[99266]: 2026-02-01 08:48:28.766400021 +0000 UTC m=+0.120067757 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible) Feb 1 03:48:28 localhost podman[99268]: 2026-02-01 08:48:28.78192005 +0000 UTC m=+0.130449588 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:48:28 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:48:28 localhost podman[99268]: 2026-02-01 08:48:28.819383987 +0000 UTC m=+0.167913475 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.buildah.version=1.41.5, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc.) Feb 1 03:48:28 localhost podman[99267]: 2026-02-01 08:48:28.846497914 +0000 UTC m=+0.197476387 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 1 03:48:28 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:48:28 localhost podman[99267]: 2026-02-01 08:48:28.905414204 +0000 UTC m=+0.256392707 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, 
build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64) Feb 1 03:48:28 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:48:29 localhost systemd[1]: tmp-crun.YD8TIG.mount: Deactivated successfully. Feb 1 03:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:48:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:48:39 localhost podman[99331]: 2026-02-01 08:48:39.71838787 +0000 UTC m=+0.073451519 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 1 03:48:39 localhost systemd[1]: tmp-crun.55Xuij.mount: Deactivated 
successfully. Feb 1 03:48:39 localhost podman[99329]: 2026-02-01 08:48:39.790021402 +0000 UTC m=+0.148206267 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:48:39 localhost podman[99331]: 2026-02-01 08:48:39.796842862 +0000 UTC m=+0.151906521 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) Feb 1 03:48:39 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:48:39 localhost podman[99329]: 2026-02-01 08:48:39.827586981 +0000 UTC m=+0.185771786 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, container_name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:48:39 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:48:39 localhost podman[99330]: 2026-02-01 08:48:39.888704319 +0000 UTC m=+0.244478981 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public) Feb 1 03:48:39 localhost podman[99330]: 2026-02-01 08:48:39.924460842 +0000 UTC m=+0.280235514 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.component=openstack-cron-container) Feb 1 03:48:39 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:48:40 localhost systemd[1]: tmp-crun.9y9jzr.mount: Deactivated successfully. Feb 1 03:48:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:48:45 localhost systemd[1]: tmp-crun.5PkP8L.mount: Deactivated successfully. 
Feb 1 03:48:45 localhost podman[99401]: 2026-02-01 08:48:45.73296229 +0000 UTC m=+0.092190668 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:48:46 localhost podman[99401]: 2026-02-01 08:48:46.100596102 +0000 UTC m=+0.459824480 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, batch=17.1_20260112.1) Feb 1 03:48:46 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:48:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:48:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:48:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:48:47 localhost podman[99427]: 2026-02-01 08:48:47.715410095 +0000 UTC m=+0.066166234 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git) Feb 1 03:48:47 localhost podman[99427]: 2026-02-01 08:48:47.751478788 +0000 UTC m=+0.102234907 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible) Feb 1 03:48:47 localhost podman[99427]: unhealthy Feb 1 03:48:47 localhost systemd[1]: tmp-crun.lZzOct.mount: Deactivated successfully. Feb 1 03:48:47 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:48:47 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:48:47 localhost podman[99426]: 2026-02-01 08:48:47.769476524 +0000 UTC m=+0.126354462 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13) Feb 1 03:48:47 localhost podman[99426]: 2026-02-01 08:48:47.782416234 +0000 UTC m=+0.139294192 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 1 03:48:47 localhost podman[99426]: unhealthy Feb 1 03:48:47 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:48:47 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:48:47 localhost podman[99425]: 2026-02-01 08:48:47.83086506 +0000 UTC m=+0.192647730 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:48:47 localhost 
podman[99425]: 2026-02-01 08:48:47.989387424 +0000 UTC m=+0.351170144 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5) Feb 1 03:48:48 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:48:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:48:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:48:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:48:59 localhost systemd[1]: tmp-crun.EE7bkA.mount: Deactivated successfully. 
Feb 1 03:48:59 localhost podman[99491]: 2026-02-01 08:48:59.740165009 +0000 UTC m=+0.095385707 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, version=17.1.13, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:48:59 localhost podman[99491]: 2026-02-01 08:48:59.778384509 +0000 UTC m=+0.133605137 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=collectd, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:48:59 localhost systemd[1]: tmp-crun.oLhn8g.mount: Deactivated successfully. 
Feb 1 03:48:59 localhost podman[99493]: 2026-02-01 08:48:59.787445809 +0000 UTC m=+0.135873628 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:48:59 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:48:59 localhost podman[99493]: 2026-02-01 08:48:59.799298184 +0000 UTC m=+0.147725983 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:48:59 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:48:59 localhost podman[99492]: 2026-02-01 08:48:59.844591163 +0000 UTC m=+0.198328376 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc.) 
Feb 1 03:48:59 localhost podman[99492]: 2026-02-01 08:48:59.86519483 +0000 UTC m=+0.218932053 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, version=17.1.13, container_name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 1 03:48:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:49:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:49:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:49:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:49:10 localhost systemd[1]: tmp-crun.cnCIN0.mount: Deactivated successfully. Feb 1 03:49:10 localhost podman[99634]: 2026-02-01 08:49:10.750051915 +0000 UTC m=+0.100810944 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container) Feb 1 03:49:10 localhost podman[99635]: 2026-02-01 08:49:10.788644047 +0000 UTC m=+0.136233708 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, 
name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:49:10 localhost podman[99635]: 2026-02-01 08:49:10.802236437 +0000 UTC m=+0.149826068 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.expose-services=) Feb 1 03:49:10 localhost podman[99634]: 2026-02-01 08:49:10.802579247 +0000 UTC m=+0.153338266 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64) Feb 1 03:49:10 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:49:10 localhost podman[99636]: 2026-02-01 08:49:10.848211306 +0000 UTC m=+0.191135133 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:49:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:49:10 localhost podman[99636]: 2026-02-01 08:49:10.904486133 +0000 UTC m=+0.247409940 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, distribution-scope=public) Feb 1 03:49:10 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:49:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:49:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:49:16 localhost recover_tripleo_nova_virtqemud[99711]: 61284 Feb 1 03:49:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:49:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:49:16 localhost podman[99705]: 2026-02-01 08:49:16.734201125 +0000 UTC m=+0.087580325 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:49:17 localhost podman[99705]: 2026-02-01 08:49:17.141497892 +0000 UTC m=+0.494877082 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public) Feb 1 03:49:17 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:49:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:49:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:49:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:49:18 localhost podman[99732]: 2026-02-01 08:49:18.686382855 +0000 UTC m=+0.049834850 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=ovn_metadata_agent, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 1 03:49:18 localhost podman[99732]: 2026-02-01 08:49:18.729343002 +0000 UTC m=+0.092795017 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:49:18 localhost podman[99732]: unhealthy Feb 1 03:49:18 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:49:18 localhost podman[99733]: 2026-02-01 08:49:18.740059633 +0000 UTC m=+0.094929603 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com) Feb 1 03:49:18 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:49:18 localhost podman[99733]: 2026-02-01 08:49:18.781539574 +0000 UTC m=+0.136409624 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:49:18 localhost podman[99733]: unhealthy Feb 1 03:49:18 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:49:18 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:49:18 localhost podman[99731]: 2026-02-01 08:49:18.783185835 +0000 UTC m=+0.143422920 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:49:19 localhost podman[99731]: 2026-02-01 08:49:19.013418174 +0000 UTC m=+0.373655279 container 
exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 1 03:49:19 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:49:19 localhost systemd[1]: tmp-crun.eM8htV.mount: Deactivated successfully. Feb 1 03:49:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:49:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:49:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
Feb 1 03:49:30 localhost podman[99799]: 2026-02-01 08:49:30.732729776 +0000 UTC m=+0.084928793 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 1 03:49:30 localhost podman[99799]: 2026-02-01 08:49:30.747471102 +0000 UTC m=+0.099670069 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, 
tcib_managed=true, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, vcs-type=git) Feb 1 03:49:30 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:49:30 localhost systemd[1]: tmp-crun.iJhqwb.mount: Deactivated successfully. 
Feb 1 03:49:30 localhost podman[99800]: 2026-02-01 08:49:30.851327869 +0000 UTC m=+0.199062618 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, config_id=tripleo_step5, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:49:30 localhost podman[99801]: 2026-02-01 08:49:30.900566438 +0000 UTC m=+0.244174460 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 1 03:49:30 localhost podman[99800]: 2026-02-01 08:49:30.911519957 +0000 UTC m=+0.259254656 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:49:30 localhost podman[99801]: 2026-02-01 08:49:30.912099094 +0000 UTC m=+0.255707096 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:49:30 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:49:30 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:49:31 localhost systemd[1]: tmp-crun.G7YF8y.mount: Deactivated successfully. Feb 1 03:49:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:49:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:49:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:49:41 localhost podman[99864]: 2026-02-01 08:49:41.735981798 +0000 UTC m=+0.097352537 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Feb 1 03:49:41 localhost podman[99864]: 2026-02-01 08:49:41.765528791 +0000 UTC m=+0.126899500 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 1 03:49:41 localhost systemd[1]: tmp-crun.DTb2s0.mount: Deactivated successfully. Feb 1 03:49:41 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
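Because every timer tick reprints the full label set verbatim, the signal is easier to see as the most recent verdict per container name. A sketch that collapses a dump like this down to that table, under the same one-record-per-line assumption (again reading stdin; nothing here is tooling named by the log):

    import re
    import sys
    from collections import OrderedDict

    STATUS_RE = re.compile(
        r"container health_status [0-9a-f]{64} "
        r"\(image=[^,]+, name=(?P<name>[^,]+), health_status=(?P<verdict>\w+)"
    )

    def latest_verdicts(lines):
        """Map container name -> most recent health_status; later events overwrite earlier ones."""
        latest = OrderedDict()
        for line in lines:
            m = STATUS_RE.search(line)
            if m:
                latest[m.group("name")] = m.group("verdict")
        return latest

    if __name__ == "__main__":
        for name, verdict in latest_verdicts(sys.stdin).items():
            print(f"{name:28s} {verdict}")

Run over this stretch of the journal it would report collectd, nova_compute, iscsid, ceilometer_agent_compute, logrotate_crond, ceilometer_agent_ipmi, nova_migration_target and metrics_qdr as healthy, and ovn_controller and ovn_metadata_agent as unhealthy.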
Feb 1 03:49:41 localhost podman[99865]: 2026-02-01 08:49:41.790980927 +0000 UTC m=+0.148308401 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true) Feb 1 03:49:41 localhost podman[99866]: 2026-02-01 08:49:41.834086598 +0000 UTC m=+0.187139570 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=) Feb 1 03:49:41 localhost podman[99865]: 2026-02-01 08:49:41.853428025 +0000 UTC m=+0.210755499 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron) Feb 1 03:49:41 localhost podman[99866]: 2026-02-01 08:49:41.862975189 +0000 UTC m=+0.216028141 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:49:41 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:49:41 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:49:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:49:47 localhost systemd[1]: tmp-crun.5SIHq6.mount: Deactivated successfully. Feb 1 03:49:47 localhost podman[99936]: 2026-02-01 08:49:47.724852134 +0000 UTC m=+0.087144492 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public) Feb 1 03:49:48 localhost podman[99936]: 2026-02-01 08:49:48.141469279 +0000 UTC m=+0.503761647 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, release=1766032510, build-date=2026-01-12T23:32:04Z) Feb 1 03:49:48 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:49:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:49:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:49:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
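Each podman line also carries a monotonic offset ("m=+0.087144492", seconds since that podman process started), so pairing the health_status and exec_died events from the same PID gives the wall-clock cost of a check; the nova_migration_target probe above, for instance, ran for roughly 0.42 s (m=+0.087 to m=+0.504). A sketch of that pairing, under the same format assumptions:

    import re
    import sys

    # "podman[99936]: 2026-02-01 08:49:47.724852134 +0000 UTC m=+0.087144492 container health_status ..."
    LINE_RE = re.compile(
        r"podman\[(?P<pid>\d+)\]: \S+ \S+ \S+ \S+ m=\+(?P<mono>[0-9.]+) "
        r"container (?P<event>health_status|exec_died)"
    )

    def check_durations(lines):
        """Yield (pid, seconds) per completed check by pairing events from the same PID."""
        started = {}
        for line in lines:
            m = LINE_RE.search(line)
            if not m:
                continue
            pid, mono = m.group("pid"), float(m.group("mono"))
            if m.group("event") == "health_status":
                started[pid] = mono
            elif pid in started:
                yield pid, mono - started.pop(pid)

    if __name__ == "__main__":
        for pid, seconds in check_durations(sys.stdin):
            print(f"podman[{pid}] healthcheck took {seconds:.3f}s")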
Feb 1 03:49:49 localhost podman[99959]: 2026-02-01 08:49:49.729658089 +0000 UTC m=+0.091908779 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-type=git, container_name=metrics_qdr, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:49:49 localhost podman[99961]: 2026-02-01 08:49:49.780248041 +0000 UTC m=+0.135978220 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1) Feb 1 03:49:49 localhost podman[99961]: 2026-02-01 08:49:49.797954348 +0000 UTC m=+0.153684517 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:49:49 localhost podman[99961]: unhealthy Feb 1 03:49:49 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:49:49 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:49:49 localhost podman[99960]: 2026-02-01 08:49:49.892390554 +0000 UTC m=+0.250895449 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:49:49 localhost podman[99960]: 2026-02-01 
08:49:49.912473544 +0000 UTC m=+0.270978469 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:49:49 localhost podman[99960]: unhealthy Feb 1 03:49:49 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:49:49 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
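The two failures above follow a fixed sequence: the health_status event carries health_status=unhealthy, podman prints a bare "unhealthy" to stdout, and the transient unit exits 1, so systemd logs "Main process exited, code=exited, status=1/FAILURE" and "Failed with result 'exit-code'" instead of "Deactivated successfully". A sketch that ties each failure back to a container name by remembering which ID reported unhealthy (same one-record-per-line assumption; the function is illustrative):

    import re
    import sys

    UNHEALTHY_RE = re.compile(
        r"container health_status (?P<cid>[0-9a-f]{64}) "
        r"\(image=[^,]+, name=(?P<name>[^,]+), health_status=unhealthy"
    )
    FAILED_RE = re.compile(r"(?P<cid>[0-9a-f]{64})\.service: Failed with result")

    def failed_checks(lines):
        """Yield (container name, short id) when a unit fails after reporting unhealthy."""
        names = {}  # container ID -> name, learned from the unhealthy events
        for line in lines:
            m = UNHEALTHY_RE.search(line)
            if m:
                names[m.group("cid")] = m.group("name")
                continue
            m = FAILED_RE.search(line)
            if m and m.group("cid") in names:
                yield names[m.group("cid")], m.group("cid")[:12]

    if __name__ == "__main__":
        for name, cid in failed_checks(sys.stdin):
            print(f"{name} ({cid}) healthcheck failed")

On this section it flags ovn_controller (f8afd85cf176) and ovn_metadata_agent (f3e307f3efbd).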
Feb 1 03:49:49 localhost podman[99959]: 2026-02-01 08:49:49.949472576 +0000 UTC m=+0.311723246 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 1 03:49:49 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:50:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:50:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:50:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:50:01 localhost systemd[1]: tmp-crun.yRs9Ah.mount: Deactivated successfully. Feb 1 03:50:01 localhost systemd[1]: tmp-crun.pbuBaY.mount: Deactivated successfully. 
Feb 1 03:50:01 localhost podman[100028]: 2026-02-01 08:50:01.764851856 +0000 UTC m=+0.123201385 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64) Feb 1 03:50:01 localhost podman[100029]: 2026-02-01 08:50:01.715027447 +0000 UTC m=+0.073726027 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13) Feb 1 03:50:01 localhost podman[100030]: 2026-02-01 08:50:01.741214275 +0000 UTC m=+0.091338340 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, container_name=iscsid, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:50:01 localhost podman[100028]: 2026-02-01 08:50:01.796719099 +0000 UTC m=+0.155068608 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:50:01 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:50:01 localhost podman[100030]: 2026-02-01 08:50:01.826322084 +0000 UTC m=+0.176446159 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=iscsid, config_id=tripleo_step3, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1) Feb 1 03:50:01 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:50:01 localhost podman[100029]: 2026-02-01 08:50:01.845190596 +0000 UTC m=+0.203889166 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5) Feb 1 03:50:01 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:50:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:50:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:50:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:50:12 localhost systemd[1]: tmp-crun.kyn7CH.mount: Deactivated successfully. Feb 1 03:50:12 localhost podman[100170]: 2026-02-01 08:50:12.718702584 +0000 UTC m=+0.077214992 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron) Feb 1 03:50:12 localhost podman[100171]: 2026-02-01 08:50:12.741470963 +0000 UTC m=+0.095281276 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:50:12 localhost podman[100170]: 2026-02-01 08:50:12.800275627 +0000 UTC m=+0.158788075 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, container_name=logrotate_crond, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:50:12 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:50:12 localhost podman[100169]: 2026-02-01 08:50:12.771093651 +0000 UTC m=+0.128857936 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:50:12 localhost podman[100171]: 2026-02-01 08:50:12.826337347 +0000 UTC m=+0.180147640 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 1 03:50:12 localhost systemd[1]: 
96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:50:12 localhost podman[100169]: 2026-02-01 08:50:12.855530694 +0000 UTC m=+0.213295009 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=) Feb 1 03:50:12 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:50:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:50:18 localhost podman[100241]: 2026-02-01 08:50:18.736617458 +0000 UTC m=+0.090289302 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=nova_migration_target, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 1 03:50:19 localhost podman[100241]: 2026-02-01 08:50:19.126477464 +0000 UTC m=+0.480149308 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 1 03:50:19 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:50:20 localhost podman[100265]: 2026-02-01 08:50:20.725114591 +0000 UTC m=+0.082608377 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64) Feb 1 03:50:20 localhost podman[100265]: 2026-02-01 08:50:20.745482856 +0000 UTC m=+0.102976642 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13) Feb 1 03:50:20 localhost podman[100265]: unhealthy Feb 1 03:50:20 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:50:20 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:50:20 localhost systemd[1]: tmp-crun.XE3SmM.mount: Deactivated successfully. 
Feb 1 03:50:20 localhost podman[100264]: 2026-02-01 08:50:20.840594495 +0000 UTC m=+0.200762373 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, release=1766032510, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1) Feb 1 03:50:20 localhost podman[100266]: 2026-02-01 08:50:20.892670734 +0000 UTC m=+0.246855459 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 1 03:50:20 localhost podman[100266]: 2026-02-01 08:50:20.909262783 +0000 UTC m=+0.263447518 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, 
release=1766032510, build-date=2026-01-12T22:36:40Z) Feb 1 03:50:20 localhost podman[100266]: unhealthy Feb 1 03:50:20 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:50:20 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:50:21 localhost podman[100264]: 2026-02-01 08:50:21.029781542 +0000 UTC m=+0.389949380 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:50:21 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:50:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:50:29 localhost recover_tripleo_nova_virtqemud[100333]: 61284 Feb 1 03:50:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:50:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:50:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:50:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:50:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:50:32 localhost systemd[1]: tmp-crun.LcW6jo.mount: Deactivated successfully. Feb 1 03:50:32 localhost systemd[1]: tmp-crun.HJ6WFD.mount: Deactivated successfully. Feb 1 03:50:32 localhost podman[100335]: 2026-02-01 08:50:32.733046387 +0000 UTC m=+0.090838449 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, distribution-scope=public) Feb 1 03:50:32 localhost podman[100334]: 2026-02-01 08:50:32.704448509 +0000 UTC m=+0.065635095 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:50:32 localhost podman[100336]: 2026-02-01 08:50:32.7670398 +0000 UTC m=+0.122486110 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, 
container_name=iscsid, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z) Feb 1 03:50:32 localhost podman[100334]: 2026-02-01 08:50:32.787766517 +0000 UTC m=+0.148953083 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, 
build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true) Feb 1 03:50:32 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:50:32 localhost podman[100336]: 2026-02-01 08:50:32.797390932 +0000 UTC m=+0.152837312 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:50:32 localhost podman[100335]: 2026-02-01 08:50:32.806337886 +0000 UTC m=+0.164129958 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vendor=Red Hat, 
Inc., io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git) Feb 1 03:50:32 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:50:32 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:50:36 localhost sshd[100402]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:50:38 localhost sshd[100404]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:50:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:50:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:50:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:50:43 localhost systemd[1]: tmp-crun.WtSPpZ.mount: Deactivated successfully. Feb 1 03:50:43 localhost podman[100406]: 2026-02-01 08:50:43.712211016 +0000 UTC m=+0.072736253 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:50:43 localhost systemd[1]: tmp-crun.E176vw.mount: Deactivated successfully. Feb 1 03:50:43 localhost podman[100407]: 2026-02-01 08:50:43.740575588 +0000 UTC m=+0.094315336 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.expose-services=) Feb 1 03:50:43 localhost podman[100407]: 2026-02-01 08:50:43.7504401 +0000 UTC m=+0.104179878 
container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, version=17.1.13) Feb 1 03:50:43 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
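The records above trace one complete podman healthcheck cycle for the logrotate_crond container: systemd starts a transient unit running /usr/bin/podman healthcheck run <container-id>, podman logs a health_status event with the verdict, the exec session ends (exec_died), and the unit deactivates with the check's exit status. A minimal sketch of replaying the same check by hand, assuming shell access on this compute node and the container name shown in the records above (note the inspect field is .State.Health on current podman releases and .State.Healthcheck on older ones):

    # Run the container's configured healthcheck once; exit 0 = healthy, 1 = unhealthy
    podman healthcheck run logrotate_crond; echo "exit=$?"
    # Show the stored health state (field name varies by podman version)
    podman inspect --format '{{json .State.Health}}' logrotate_crond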
Feb 1 03:50:43 localhost podman[100406]: 2026-02-01 08:50:43.795852194 +0000 UTC m=+0.156377471 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:50:43 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
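Each health_status record also embeds the TripleO metadata for its container: config_id names the deployment step that created it (tripleo_step1 through tripleo_step5 appear in this log), while config_data carries the start_order and depends_on values that sequence startup (nova_compute, for instance, depends on tripleo_nova_libvirt.target with start_order 3). Since these are ordinary container labels, they can be used to group the containers on this host by step; a sketch, assuming podman's label filter:

    # List containers created by TripleO step 4 via their config_id label
    podman ps -a --filter label=config_id=tripleo_step4 --format '{{.Names}}'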
Feb 1 03:50:43 localhost podman[100413]: 2026-02-01 08:50:43.800416994 +0000 UTC m=+0.146539758 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.) 
Feb 1 03:50:43 localhost podman[100413]: 2026-02-01 08:50:43.883752622 +0000 UTC m=+0.229875456 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc.) Feb 1 03:50:43 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:50:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:50:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.1 total, 600.0 interval
Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:50:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:50:49 localhost podman[100476]: 2026-02-01 08:50:49.722611771 +0000 UTC m=+0.085013830 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 1 03:50:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 03:50:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4200.1 total, 600.0 interval
Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 03:50:50 localhost podman[100476]: 2026-02-01 08:50:50.109417823 +0000 UTC m=+0.471819822 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:50:50 localhost systemd[1]:
d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:50:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:50:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:50:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:50:51 localhost systemd[1]: tmp-crun.bFsQeL.mount: Deactivated successfully. Feb 1 03:50:51 localhost podman[100500]: 2026-02-01 08:50:51.758150407 +0000 UTC m=+0.118720105 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 1 03:50:51 localhost podman[100499]: 2026-02-01 08:50:51.713006471 +0000 UTC m=+0.071469134 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_id=tripleo_step1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public) Feb 1 03:50:51 localhost podman[100501]: 2026-02-01 08:50:51.734623134 +0000 UTC m=+0.087013171 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z) Feb 1 03:50:51 localhost podman[100500]: 2026-02-01 08:50:51.800256359 +0000 UTC m=+0.160826057 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 1 03:50:51 localhost podman[100500]: unhealthy Feb 1 03:50:51 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:50:51 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:50:51 localhost podman[100501]: 2026-02-01 08:50:51.817360594 +0000 UTC m=+0.169750631 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, 
vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:50:51 localhost podman[100501]: unhealthy Feb 1 03:50:51 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:50:51 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:50:51 localhost podman[100499]: 2026-02-01 08:50:51.926490174 +0000 UTC m=+0.284952867 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=metrics_qdr, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git) Feb 1 03:50:51 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:51:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:51:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. 
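The 03:50:51 cycle shows the failure path: ovn_metadata_agent and ovn_controller report health_status=unhealthy, podman prints "unhealthy" and exits nonzero, and systemd marks their transient units failed with status=1/FAILURE. The config_data above shows ovn_controller's check is '/openstack/healthcheck 6642', so one plausible first step is to rerun that script inside the container and read the unit's journal; a sketch, assuming shell access on the node:

    # Rerun the configured healthcheck inside the failing container
    podman exec ovn_controller /openstack/healthcheck 6642; echo "exit=$?"
    # The transient unit is named after the container ID, so its journal can be read directly
    journalctl -u f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service --no-pager | tail -n 20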
Feb 1 03:51:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:51:03 localhost podman[100567]: 2026-02-01 08:51:03.723412044 +0000 UTC m=+0.066003897 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.buildah.version=1.41.5, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3) Feb 1 03:51:03 localhost podman[100567]: 2026-02-01 08:51:03.739298371 +0000 UTC m=+0.081890234 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 1 03:51:03 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
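Note: the `config_data={...}` payload in these events is printed as a Python literal (single-quoted strings, capitalized True/False), so it can be recovered with `ast.literal_eval` once the balanced-brace span is cut out. A sketch, assuming (as in every record here) that no brace appears inside a quoted string:

    import ast

    def extract_config_data(s: str) -> dict:
        """Pull the balanced {...} after 'config_data=' out of a label dump."""
        start = s.index("config_data=") + len("config_data=")
        assert s[start] == "{"
        depth = 0
        for i in range(start, len(s)):
            if s[i] == "{":
                depth += 1
            elif s[i] == "}":
                depth -= 1
                if depth == 0:
                    # Assumes no '{' or '}' inside quoted values, true for
                    # the records in this log.
                    return ast.literal_eval(s[start:i + 1])
        raise ValueError("unbalanced config_data literal")

    # Fragment shaped like the iscsid record above:
    sample = "config_data={'healthcheck': {'test': '/openstack/healthcheck'}, 'privileged': True}"
    cfg = extract_config_data(sample)
    print(cfg["healthcheck"]["test"])   # -> /openstack/healthcheck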
Feb 1 03:51:03 localhost podman[100565]: 2026-02-01 08:51:03.797695373 +0000 UTC m=+0.145563058 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, architecture=x86_64) Feb 1 03:51:03 localhost podman[100565]: 2026-02-01 08:51:03.806503024 +0000 UTC m=+0.154370639 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=) Feb 1 03:51:03 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
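Note: each run emits a `container health_status` event followed milliseconds later by `container exec_died` for the same ID, and the `m=+<seconds>` field is podman's monotonic offset since process start, so subtracting the two gives the healthcheck's execution time. A sketch over text shaped like the collectd pair above:

    import re

    EVENT = re.compile(
        r"m=\+(?P<mono>[0-9.]+) container (?P<kind>health_status|exec_died) "
        r"(?P<cid>[0-9a-f]{64})"
    )

    def exec_times(text):
        """Yield (container-id, seconds) from paired health_status/exec_died events."""
        started = {}
        for m in EVENT.finditer(text):
            cid, mono = m.group("cid"), float(m.group("mono"))
            if m.group("kind") == "health_status":
                started[cid] = mono
            elif cid in started:
                yield cid, mono - started.pop(cid)

    sample = (
        "m=+0.145563058 container health_status "
        "02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (...) "
        "m=+0.154370639 container exec_died "
        "02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (...)"
    )
    for cid, dt in exec_times(sample):
        print(f"{cid[:12]} healthcheck exec took {dt * 1000:.1f} ms")   # ~8.8 ms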
Feb 1 03:51:03 localhost podman[100566]: 2026-02-01 08:51:03.910246618 +0000 UTC m=+0.256976488 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:51:03 localhost podman[100566]: 2026-02-01 08:51:03.966124543 +0000 UTC m=+0.312854373 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:51:03 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:51:04 localhost systemd[1]: tmp-crun.8MLFN1.mount: Deactivated successfully. Feb 1 03:51:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. 
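Note: per container, the "Started /usr/bin/podman healthcheck run <id>" records recur roughly every 31 s in this window (metrics_qdr at 03:50:51 and 03:51:22, iscsid at 03:51:03 and 03:51:34), consistent with podman's default 30 s healthcheck interval plus timer jitter; the configured interval itself is not shown in these label dumps, so that reading is an inference. A sketch that measures the gaps from the syslog timestamps (assumes one record per line, as journald emits them, and a known year since syslog timestamps omit it):

    import re
    from datetime import datetime

    STARTED = re.compile(
        r"^(?P<ts>\w{3} +\d+ [\d:]{8}) \S+ systemd\[1\]: "
        r"Started /usr/bin/podman healthcheck run (?P<cid>[0-9a-f]{64})",
        re.M,
    )

    def intervals(text, year=2026):
        """Yield (container-id, seconds-since-previous-run) per container."""
        last = {}
        for m in STARTED.finditer(text):
            ts = datetime.strptime(f"{year} {m.group('ts')}", "%Y %b %d %H:%M:%S")
            cid = m.group("cid")
            if cid in last:
                yield cid, (ts - last[cid]).total_seconds()
            last[cid] = ts

    # for cid, dt in intervals(open("journal.txt").read()): print(cid[:12], dt)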
Feb 1 03:51:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:51:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:51:14 localhost systemd[1]: tmp-crun.HRFoN2.mount: Deactivated successfully. Feb 1 03:51:14 localhost podman[100709]: 2026-02-01 08:51:14.782222947 +0000 UTC m=+0.131508208 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:51:14 localhost podman[100708]: 2026-02-01 08:51:14.753830535 +0000 UTC m=+0.106201370 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, 
build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:51:14 localhost podman[100708]: 2026-02-01 08:51:14.837415811 +0000 UTC m=+0.189786636 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:51:14 localhost podman[100709]: 2026-02-01 08:51:14.847611114 +0000 UTC m=+0.196896455 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510) Feb 1 03:51:14 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:51:14 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:51:14 localhost podman[100707]: 2026-02-01 08:51:14.840546667 +0000 UTC m=+0.189603061 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 1 03:51:14 localhost podman[100707]: 2026-02-01 08:51:14.92144457 +0000 UTC m=+0.270501014 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Feb 1 03:51:14 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:51:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
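Note: the 'volumes' lists in these config_data payloads encode bind-mount options after the second colon: ':ro' read-only, ':z' requests an SELinux shared-content relabel, and 'shared' (often 'shared,z', as on /run/libvirt and /var/lib/nova above) sets shared mount propagation so mounts made on one side stay visible on the other. A small parser splitting each spec into source, destination, and option set:

    def parse_volume(spec: str):
        """'src:dst[:opts]' -> (src, dst, set-of-options); opts default to empty."""
        parts = spec.split(":")
        src, dst = parts[0], parts[1]
        opts = set(parts[2].split(",")) if len(parts) > 2 else set()
        return src, dst, opts

    # Specs taken from the records above:
    for v in [
        "/etc/hosts:/etc/hosts:ro",
        "/run/libvirt:/run/libvirt:shared,z",
        "/var/log/containers/ceilometer:/var/log/ceilometer:z",
    ]:
        src, dst, opts = parse_volume(v)
        print(f"{src} -> {dst} {sorted(opts)}")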
Feb 1 03:51:20 localhost podman[100780]: 2026-02-01 08:51:20.70458905 +0000 UTC m=+0.064699147 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:51:21 localhost podman[100780]: 2026-02-01 08:51:21.067653283 +0000 UTC m=+0.427763360 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64) Feb 1 03:51:21 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:51:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:51:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:51:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:51:22 localhost systemd[1]: tmp-crun.2QsQ7m.mount: Deactivated successfully. 
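Note: since the transient unit name is nothing but the full container ID plus ".service", mapping a failed unit back to a container name takes a single inspect call. A sketch shelling out to podman (assumes podman is on PATH and the container still exists on the host):

    import subprocess

    def unit_to_name(unit: str) -> str:
        """Map '<64-hex>.service' back to the podman container name."""
        cid = unit.removesuffix(".service")
        out = subprocess.run(
            ["podman", "inspect", "--format", "{{.Name}}", cid],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    # For the failing unit above this should print "ovn_controller":
    # unit_to_name("f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service")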
Feb 1 03:51:22 localhost podman[100805]: 2026-02-01 08:51:22.732131011 +0000 UTC m=+0.083114312 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:51:22 localhost podman[100805]: 2026-02-01 08:51:22.789323537 +0000 UTC m=+0.140306838 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, container_name=ovn_controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:51:22 localhost podman[100805]: unhealthy Feb 1 03:51:22 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:51:22 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:51:22 localhost podman[100804]: 2026-02-01 08:51:22.845081308 +0000 UTC m=+0.199701370 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, architecture=x86_64) Feb 1 03:51:22 localhost podman[100803]: 2026-02-01 08:51:22.795050833 +0000 UTC m=+0.154866045 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true) Feb 1 03:51:22 
localhost podman[100804]: 2026-02-01 08:51:22.863496823 +0000 UTC m=+0.218116865 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:56:19Z, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 1 03:51:22 localhost podman[100804]: unhealthy Feb 1 03:51:22 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:51:22 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
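Note: both OVN containers go unhealthy in the same sweep, and the ovn_controller healthcheck test carries a port argument ('/openstack/healthcheck 6642'; 6642 is the conventional OVN southbound OVSDB port), which suggests the failing step is a TCP reachability probe toward the southbound DB. The actual /openstack/healthcheck script inside the image is not shown in this log, so the following is only a hypothetical stand-in for what such a probe amounts to:

    import socket

    def tcp_probe(host: str, port: int, timeout: float = 5.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Hypothetical equivalent of the failing check: can this node reach the
    # OVN southbound DB endpoint it is configured against?
    print(tcp_probe("127.0.0.1", 6642))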
Feb 1 03:51:23 localhost podman[100803]: 2026-02-01 08:51:23.016477149 +0000 UTC m=+0.376292351 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step1, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Feb 1 03:51:23 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:51:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:51:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:51:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
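Note: the config_id labels (tripleo_step1 for metrics_qdr, tripleo_step3 for collectd and iscsid, tripleo_step4 for the OVN and ceilometer agents, tripleo_step5 for nova_compute) record the TripleO deployment step each container belongs to, and start_order orders containers within a step. A sketch reconstructing that ordering from parsed labels; the values come from the records above, with start_order assumed to default to 0 where the config omits it (as collectd's does):

    containers = [
        {"name": "metrics_qdr", "config_id": "tripleo_step1", "start_order": 1},
        {"name": "collectd", "config_id": "tripleo_step3", "start_order": 0},
        {"name": "iscsid", "config_id": "tripleo_step3", "start_order": 2},
        {"name": "ovn_controller", "config_id": "tripleo_step4", "start_order": 1},
        {"name": "nova_compute", "config_id": "tripleo_step5", "start_order": 3},
    ]

    def step(c):
        # "tripleo_step4" -> 4
        return int(c["config_id"].removeprefix("tripleo_step"))

    for c in sorted(containers, key=lambda c: (step(c), c["start_order"])):
        print(f"step {step(c)} order {c['start_order']}: {c['name']}")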
Feb 1 03:51:34 localhost podman[100869]: 2026-02-01 08:51:34.726190187 +0000 UTC m=+0.085338151 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, release=1766032510) Feb 1 03:51:34 localhost podman[100869]: 2026-02-01 08:51:34.740380852 +0000 UTC m=+0.099528806 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, container_name=collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible) Feb 1 03:51:34 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:51:34 localhost podman[100870]: 2026-02-01 08:51:34.78201357 +0000 UTC m=+0.138713049 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, architecture=x86_64, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible) Feb 1 03:51:34 localhost podman[100871]: 2026-02-01 08:51:34.835296785 +0000 UTC m=+0.190413945 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container) Feb 1 03:51:34 localhost podman[100870]: 2026-02-01 08:51:34.842512886 +0000 UTC m=+0.199212445 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:51:34 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:51:34 localhost podman[100871]: 2026-02-01 08:51:34.881098941 +0000 UTC m=+0.236216131 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:51:34 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:51:35 localhost systemd[1]: tmp-crun.RgZYnu.mount: Deactivated successfully. Feb 1 03:51:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:51:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:51:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:51:45 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:51:45 localhost recover_tripleo_nova_virtqemud[100951]: 61284 Feb 1 03:51:45 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 1 03:51:45 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:51:45 localhost podman[100933]: 2026-02-01 08:51:45.742958898 +0000 UTC m=+0.091067516 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 1 03:51:45 localhost podman[100933]: 2026-02-01 08:51:45.754218193 +0000 UTC m=+0.102326801 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, 
version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:51:45 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
Feb 1 03:51:45 localhost podman[100934]: 2026-02-01 08:51:45.853151001 +0000 UTC m=+0.198835404 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com) Feb 1 03:51:45 localhost podman[100932]: 2026-02-01 08:51:45.88735431 +0000 UTC m=+0.238781510 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, distribution-scope=public, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 1 03:51:45 localhost podman[100932]: 2026-02-01 08:51:45.910379737 +0000 UTC m=+0.261807007 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, 
build-date=2026-01-12T23:07:47Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:51:45 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:51:45 localhost podman[100934]: 2026-02-01 08:51:45.924000175 +0000 UTC m=+0.269684608 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13) Feb 1 03:51:45 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:51:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:51:51 localhost podman[101007]: 2026-02-01 08:51:51.731837332 +0000 UTC m=+0.088376514 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git) Feb 1 03:51:52 localhost podman[101007]: 2026-02-01 08:51:52.104071897 +0000 UTC m=+0.460611089 container exec_died 
d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:51:52 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:51:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:51:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:51:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:51:53 localhost systemd[1]: tmp-crun.0rD7pS.mount: Deactivated successfully. 
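Annotation (not part of the captured journal): the records above repeat one fixed shape. A transient systemd unit runs /usr/bin/podman healthcheck run <id>; podman emits a health_status event whose parenthesized label list carries name=<container> and health_status=<result>; an exec_died event follows; systemd then deactivates the unit. A minimal Python sketch for pulling (id, name, status) out of an exported journal follows; the regexes are assumptions about this exact line shape, and it expects one record per line, as journalctl emits them:

    import re
    import sys

    # A podman health_status record looks like (one line in the raw journal):
    #   Feb  1 03:51:53 localhost podman[101032]: 2026-02-01 08:51:53... container
    #   health_status f8afd85c... (image=..., name=ovn_controller, ...,
    #   health_status=unhealthy, ...)
    RECORD = re.compile(
        r"container health_status (?P<cid>[0-9a-f]{64})"   # full container ID
        r".*?\bname=(?P<name>[^,)]+)"                       # first label after image=: container name
        r".*?\bhealth_status=(?P<status>[^,)]+)"            # probe result
    )

    def health_events(lines):
        """Yield (short_id, container_name, status) for each health_status record."""
        for line in lines:
            m = RECORD.search(line)
            if m:
                yield m["cid"][:12], m["name"], m["status"]

    if __name__ == "__main__":
        for cid, name, status in health_events(sys.stdin):
            print(f"{cid}  {name:<24} {status}")

Fed the stream above, it would report, for example, "5cdaa48ba3e7  metrics_qdr  healthy". It only handles health_status events; exec_died records carry no health_status= label and are skipped.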
Feb 1 03:51:53 localhost podman[101031]: 2026-02-01 08:51:53.720219331 +0000 UTC m=+0.080468151 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:51:53 localhost systemd[1]: tmp-crun.siubFg.mount: Deactivated successfully. 
Feb 1 03:51:53 localhost podman[101032]: 2026-02-01 08:51:53.732664432 +0000 UTC m=+0.086780405 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, distribution-scope=public) Feb 1 03:51:53 localhost podman[101031]: 2026-02-01 08:51:53.736326915 +0000 UTC m=+0.096575675 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64) Feb 1 03:51:53 localhost podman[101031]: unhealthy Feb 1 03:51:53 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:51:53 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
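Annotation (not part of the captured journal): both "unhealthy" verdicts here are just the exit status of the probe. The transient unit runs /usr/bin/podman healthcheck run <id>; podman prints "unhealthy" and exits 1, and systemd records status=1/FAILURE. Per the config_data labels, the probe inside ovn_metadata_agent is '/openstack/healthcheck' (and '/openstack/healthcheck 6642' for ovn_controller), so the same check can be replayed by hand. A sketch, assuming podman is on PATH and the shell has the same privileges the units run with:

    import subprocess

    # Replay the probe the failed transient units ran; podman exits non-zero
    # (printing "unhealthy", as captured above) when the container's configured
    # healthcheck command fails.
    for name in ("ovn_metadata_agent", "ovn_controller"):
        r = subprocess.run(
            ["podman", "healthcheck", "run", name],
            capture_output=True, text=True,
        )
        verdict = "healthy" if r.returncode == 0 else "unhealthy"
        print(f"{name}: {verdict} (rc={r.returncode})")

On the host, journalctl -u <full-container-id>.service shows the same failure history that systemd logged here for these units.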
Feb 1 03:51:53 localhost podman[101032]: 2026-02-01 08:51:53.767665837 +0000 UTC m=+0.121781760 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:51:53 localhost podman[101032]: unhealthy Feb 1 03:51:53 localhost podman[101030]: 2026-02-01 08:51:53.777249651 +0000 UTC m=+0.137291495 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510) Feb 1 03:51:53 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:51:53 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:51:53 localhost podman[101030]: 2026-02-01 08:51:53.951792937 +0000 UTC m=+0.311834821 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:51:53 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:52:05 localhost podman[101096]: 2026-02-01 08:52:05.741136535 +0000 UTC m=+0.098033030 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 1 03:52:05 localhost podman[101096]: 2026-02-01 08:52:05.7497772 +0000 UTC m=+0.106673725 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) 
Feb 1 03:52:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:52:05 localhost systemd[1]: tmp-crun.fYwNFN.mount: Deactivated successfully. Feb 1 03:52:05 localhost podman[101097]: 2026-02-01 08:52:05.846167608 +0000 UTC m=+0.198234005 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, distribution-scope=public) Feb 1 03:52:05 
localhost podman[101097]: 2026-02-01 08:52:05.879040677 +0000 UTC m=+0.231107134 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=) Feb 1 03:52:05 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:52:05 localhost podman[101098]: 2026-02-01 08:52:05.902184067 +0000 UTC m=+0.251234841 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:52:05 localhost podman[101098]: 2026-02-01 08:52:05.940517554 +0000 UTC m=+0.289568298 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1766032510, build-date=2026-01-12T22:34:43Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 1 03:52:05 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:52:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:52:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:52:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
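[Annotation] Each podman journal entry above follows one shape: a timestamp, the emitting process (podman[PID]), an event type (container health_status or container exec_died), the 64-hex-digit container ID, and a parenthesized dump of the image labels plus the TripleO config_data. A minimal Python sketch for pulling the interesting fields out of such a line; the regex, helper name, and field choices are illustrative assumptions, not a podman-defined format:

    import re

    # One podman event as logged above, e.g.:
    #   "... container health_status 02310e3a... (image=..., name=collectd, health_status=healthy, ...)"
    EVENT_RE = re.compile(
        r"container (?P<event>health_status|exec_died) "
        r"(?P<cid>[0-9a-f]{64}) \((?P<labels>.*)\)"
    )

    def parse_event(line: str):
        """Return event type, container ID, and selected labels, or None."""
        m = EVENT_RE.search(line)
        if m is None:
            return None
        labels = m.group("labels")
        # In these entries the first name= label is the running container's
        # name; values that embed commas (config_data) make a full key=value
        # split unreliable, so only simple labels are picked out here.
        name = re.search(r"\bname=([^,)]+)", labels)
        health = re.search(r"\bhealth_status=([^,)]+)", labels)
        return {
            "event": m.group("event"),
            "container_id": m.group("cid"),
            "name": name.group(1) if name else None,
            "health_status": health.group(1) if health else None,
        }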
Feb 1 03:52:16 localhost podman[101237]: 2026-02-01 08:52:16.745662415 +0000 UTC m=+0.088772556 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:52:16 localhost podman[101237]: 2026-02-01 08:52:16.786420036 +0000 UTC m=+0.129530177 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat 
OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:52:16 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:52:16 localhost systemd[1]: tmp-crun.fHfva1.mount: Deactivated successfully. 
Feb 1 03:52:16 localhost podman[101236]: 2026-02-01 08:52:16.861932134 +0000 UTC m=+0.212812083 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 1 03:52:16 localhost podman[101236]: 2026-02-01 08:52:16.896439233 +0000 UTC m=+0.247319182 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64) Feb 1 03:52:16 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:52:16 localhost podman[101238]: 2026-02-01 08:52:16.812420124 +0000 UTC m=+0.153901195 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible) Feb 1 03:52:16 localhost podman[101238]: 2026-02-01 08:52:16.944851648 +0000 UTC m=+0.286332679 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:52:16 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:52:17 localhost systemd[1]: tmp-crun.vWbpCO.mount: Deactivated successfully. Feb 1 03:52:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
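[Annotation] The systemd lines pairing "Started /usr/bin/podman healthcheck run <id>" with either "Deactivated successfully" or "Main process exited, code=exited, status=1/FAILURE" reflect podman's healthcheck exit convention: the transient unit runs the container's configured test command (here /openstack/healthcheck) and exits 0 when healthy, non-zero when not. The same check can be triggered by hand; a small sketch (the helper name is ours, the podman subcommand is real):

    import subprocess

    def run_healthcheck(container: str) -> bool:
        """Trigger the same check systemd schedules above.

        `podman healthcheck run` executes the container's configured test
        and exits 0 when healthy, non-zero otherwise -- the
        status=1/FAILURE lines in this log are that non-zero exit
        propagating through the transient unit.
        """
        result = subprocess.run(["podman", "healthcheck", "run", container])
        return result.returncode == 0

    # Usage: accepts a container name or ID,
    # e.g. run_healthcheck("ovn_controller")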
Feb 1 03:52:22 localhost podman[101309]: 2026-02-01 08:52:22.708635705 +0000 UTC m=+0.067804613 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:52:23 localhost podman[101309]: 2026-02-01 08:52:23.083789259 +0000 UTC m=+0.442958137 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container) Feb 1 03:52:23 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:52:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:52:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:52:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:52:24 localhost podman[101336]: 2026-02-01 08:52:24.72817486 +0000 UTC m=+0.074067495 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true) Feb 1 03:52:24 localhost podman[101336]: 2026-02-01 08:52:24.775774821 +0000 UTC m=+0.121667456 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:52:24 localhost systemd[1]: tmp-crun.up0yAG.mount: Deactivated successfully. Feb 1 03:52:24 localhost podman[101336]: unhealthy Feb 1 03:52:24 localhost podman[101335]: 2026-02-01 08:52:24.788093878 +0000 UTC m=+0.137510871 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, konflux.additional-tags=17.1.13 
17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:52:24 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:52:24 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:52:24 localhost podman[101335]: 2026-02-01 08:52:24.802069397 +0000 UTC m=+0.151486390 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:52:24 localhost podman[101335]: unhealthy Feb 1 03:52:24 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:52:24 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:52:24 localhost podman[101334]: 2026-02-01 08:52:24.902565382 +0000 UTC m=+0.251906893 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, distribution-scope=public, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:52:25 localhost podman[101334]: 2026-02-01 08:52:25.140893117 
+0000 UTC m=+0.490234698 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=metrics_qdr, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:52:25 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:52:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:52:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:52:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
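[Annotation] The config_data label embedded in each event is a Python-style dict literal describing how tripleo_ansible launched the container (environment, healthcheck test, image, network mode, volumes, and so on). For illustration only, a sketch that parses such a dump and maps a few common keys onto podman run flags; config_to_podman_args is a hypothetical helper covering only a subset of keys, not TripleO's deployment code:

    import ast

    def config_to_podman_args(config_data: str) -> list:
        """Illustrative only: map a config_data dump onto podman run flags."""
        cfg = ast.literal_eval(config_data)  # the label is a Python dict literal
        args = ["podman", "run", "--detach"]
        if cfg.get("net"):
            args += ["--network", cfg["net"]]
        if cfg.get("privileged"):
            args.append("--privileged")
        if cfg.get("user"):
            args += ["--user", cfg["user"]]
        for key, value in cfg.get("environment", {}).items():
            args += ["--env", f"{key}={value}"]
        for volume in cfg.get("volumes", []):
            args += ["--volume", volume]
        args.append(cfg["image"])
        return args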
Feb 1 03:52:36 localhost podman[101401]: 2026-02-01 08:52:36.733944117 +0000 UTC m=+0.085874067 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:52:36 localhost podman[101401]: 2026-02-01 08:52:36.74479911 +0000 UTC m=+0.096729060 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, container_name=collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 1 03:52:36 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:52:36 localhost systemd[1]: tmp-crun.Hp4nqF.mount: Deactivated successfully. 
Feb 1 03:52:36 localhost podman[101403]: 2026-02-01 08:52:36.789682377 +0000 UTC m=+0.139191573 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, config_id=tripleo_step3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1) Feb 1 03:52:36 localhost podman[101403]: 2026-02-01 08:52:36.845577253 +0000 UTC m=+0.195086429 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Feb 1 03:52:36 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:52:36 localhost podman[101402]: 2026-02-01 08:52:36.851494525 +0000 UTC m=+0.200219866 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container) Feb 1 03:52:36 localhost podman[101402]: 2026-02-01 08:52:36.935429991 +0000 UTC m=+0.284155332 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:52:36 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:52:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:52:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. 
Feb 1 03:52:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:52:47 localhost podman[101467]: 2026-02-01 08:52:47.727642844 +0000 UTC m=+0.089000103 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, container_name=logrotate_crond, distribution-scope=public, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible) Feb 1 03:52:47 localhost podman[101468]: 2026-02-01 08:52:47.741701885 +0000 UTC m=+0.097353249 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:52:47 localhost podman[101467]: 2026-02-01 08:52:47.764460264 +0000 UTC m=+0.125817503 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:52:47 localhost systemd[1]: tmp-crun.BQF3MD.mount: Deactivated successfully. Feb 1 03:52:47 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:52:47 localhost podman[101468]: 2026-02-01 08:52:47.791385619 +0000 UTC m=+0.147036953 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, distribution-scope=public) Feb 1 03:52:47 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:52:47 localhost podman[101466]: 2026-02-01 08:52:47.776196024 +0000 UTC m=+0.135333035 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container) Feb 1 03:52:47 localhost podman[101466]: 2026-02-01 08:52:47.858424367 +0000 UTC m=+0.217561348 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, 
release=1766032510, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 1 03:52:47 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:52:53 localhost sshd[101539]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:52:53 localhost systemd[1]: tmp-crun.7684ta.mount: Deactivated successfully. 
Feb 1 03:52:53 localhost podman[101541]: 2026-02-01 08:52:53.561612373 +0000 UTC m=+0.098938738 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:52:53 localhost podman[101541]: 2026-02-01 08:52:53.983560264 +0000 UTC m=+0.520886639 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, container_name=nova_migration_target, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 1 03:52:54 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:52:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:52:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:52:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:52:55 localhost podman[101564]: 2026-02-01 08:52:55.713210132 +0000 UTC m=+0.076269172 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 1 03:52:55 localhost podman[101566]: 2026-02-01 08:52:55.754165359 +0000 UTC m=+0.117123866 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, tcib_managed=true, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public) Feb 1 03:52:55 localhost podman[101566]: 2026-02-01 08:52:55.796442337 +0000 UTC m=+0.159400824 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, config_id=tripleo_step4, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller) Feb 1 03:52:55 localhost podman[101566]: unhealthy Feb 1 03:52:55 localhost podman[101565]: 2026-02-01 08:52:55.80665317 +0000 UTC m=+0.169677719 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Feb 1 03:52:55 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:52:55 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
Feb 1 03:52:55 localhost podman[101565]: 2026-02-01 08:52:55.847414011 +0000 UTC m=+0.210438570 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Feb 1 03:52:55 localhost podman[101565]: unhealthy Feb 1 03:52:55 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:52:55 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:52:55 localhost podman[101564]: 2026-02-01 08:52:55.990498933 +0000 UTC m=+0.353557973 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container) Feb 1 03:52:56 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:53:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:53:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:53:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:53:07 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:53:07 localhost recover_tripleo_nova_virtqemud[101647]: 61284 Feb 1 03:53:07 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:53:07 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 1 03:53:07 localhost systemd[1]: tmp-crun.YNxwI0.mount: Deactivated successfully. Feb 1 03:53:07 localhost podman[101634]: 2026-02-01 08:53:07.733925167 +0000 UTC m=+0.089765786 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5) Feb 1 03:53:07 localhost podman[101635]: 2026-02-01 08:53:07.742432538 +0000 UTC m=+0.092514280 container health_status 
b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 1 03:53:07 localhost systemd[1]: tmp-crun.YbtQnR.mount: Deactivated successfully. 
Feb 1 03:53:07 localhost podman[101633]: 2026-02-01 08:53:07.791378031 +0000 UTC m=+0.150282354 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 1 03:53:07 localhost podman[101635]: 2026-02-01 08:53:07.806061181 +0000 UTC m=+0.156142913 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:53:07 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:53:07 localhost podman[101634]: 2026-02-01 08:53:07.818268496 +0000 UTC m=+0.174109135 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, vcs-type=git) Feb 1 03:53:07 localhost podman[101633]: 2026-02-01 08:53:07.826595081 +0000 UTC m=+0.185499454 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13) Feb 1 03:53:07 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:53:07 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:53:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:53:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:53:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:53:18 localhost systemd[1]: tmp-crun.udG5N3.mount: Deactivated successfully. 
Feb 1 03:53:18 localhost podman[101829]: 2026-02-01 08:53:18.736156072 +0000 UTC m=+0.090289072 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:53:18 localhost podman[101828]: 2026-02-01 08:53:18.786057364 +0000 UTC m=+0.141313358 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, architecture=x86_64, 
summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4) Feb 1 03:53:18 localhost podman[101828]: 2026-02-01 08:53:18.793298566 +0000 UTC m=+0.148554540 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z) Feb 1 03:53:18 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:53:18 localhost podman[101829]: 2026-02-01 08:53:18.843725114 +0000 UTC m=+0.197858174 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 1 03:53:18 localhost podman[101827]: 
2026-02-01 08:53:18.856123264 +0000 UTC m=+0.214320868 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:53:18 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:53:18 localhost podman[101827]: 2026-02-01 08:53:18.88529735 +0000 UTC m=+0.243494904 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4) Feb 1 03:53:18 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:53:24 localhost podman[101901]: 2026-02-01 08:53:24.705152166 +0000 UTC m=+0.070867166 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, version=17.1.13, distribution-scope=public) Feb 1 03:53:25 localhost podman[101901]: 2026-02-01 08:53:25.064051631 +0000 UTC m=+0.429766681 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 1 03:53:25 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:53:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:53:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:53:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:53:26 localhost systemd[1]: tmp-crun.ubhbZF.mount: Deactivated successfully. 
Feb 1 03:53:26 localhost podman[101925]: 2026-02-01 08:53:26.752919107 +0000 UTC m=+0.102242449 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:53:26 localhost podman[101925]: 2026-02-01 08:53:26.796834845 +0000 UTC m=+0.146158127 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, tcib_managed=true, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z) Feb 1 03:53:26 localhost podman[101925]: unhealthy Feb 1 03:53:26 localhost podman[101924]: 2026-02-01 08:53:26.807942196 +0000 UTC m=+0.160818518 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:53:26 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:53:26 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:53:26 localhost podman[101926]: 2026-02-01 08:53:26.857754365 +0000 UTC m=+0.202975222 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:53:26 localhost podman[101926]: 2026-02-01 08:53:26.902854439 +0000 UTC m=+0.248075276 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:53:26 localhost podman[101926]: unhealthy Feb 1 03:53:26 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:53:26 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:53:27 localhost podman[101924]: 2026-02-01 08:53:27.036595173 +0000 UTC m=+0.389471495 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true) Feb 1 03:53:27 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:53:27 localhost systemd[1]: tmp-crun.xdLVD2.mount: Deactivated successfully. Feb 1 03:53:36 localhost sshd[101993]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:53:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:53:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:53:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:53:38 localhost systemd[1]: tmp-crun.YGEJQS.mount: Deactivated successfully. Feb 1 03:53:38 localhost podman[101995]: 2026-02-01 08:53:38.111958183 +0000 UTC m=+0.098682850 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:53:38 localhost podman[101995]: 2026-02-01 08:53:38.128260543 +0000 UTC m=+0.114985210 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, version=17.1.13, container_name=collectd, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd) Feb 1 03:53:38 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
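The healthcheck cycle recorded above is driven by systemd transient units: each "Started /usr/bin/podman healthcheck run <id>" line is a one-shot service wrapping a single check of the named container, and the check command itself is the 'healthcheck' entry in the container's config_data. A minimal sketch of the same invocation, assuming a host with podman and a running container named collectd as in this log:

    import subprocess

    def run_healthcheck(container: str) -> bool:
        # "podman healthcheck run" executes the container's configured check
        # once: exit 0 means healthy, exit 1 means unhealthy (podman also
        # prints the bare word "unhealthy", as in the journal lines above).
        result = subprocess.run(
            ["podman", "healthcheck", "run", container],
            capture_output=True,
            text=True,
        )
        return result.returncode == 0

    if __name__ == "__main__":
        # Container name taken from this log; any running container works.
        print("healthy" if run_healthcheck("collectd") else "unhealthy")

That exit-code convention is why the failing runs elsewhere in this log end with the transient unit reporting "status=1/FAILURE".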
Feb 1 03:53:38 localhost systemd[1]: tmp-crun.80gbU5.mount: Deactivated successfully. Feb 1 03:53:38 localhost podman[101996]: 2026-02-01 08:53:38.218264396 +0000 UTC m=+0.203486787 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 1 03:53:38 localhost podman[101997]: 2026-02-01 08:53:38.27478874 +0000 UTC m=+0.251904022 container health_status 
b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 1 03:53:38 localhost podman[101997]: 2026-02-01 08:53:38.289336287 +0000 UTC m=+0.266451629 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:53:38 localhost podman[101996]: 2026-02-01 08:53:38.301271354 +0000 UTC m=+0.286493735 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:53:38 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:53:38 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. Feb 1 03:53:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:53:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:53:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
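Each podman event line above follows a fixed shape: syslog prefix, podman's own UTC timestamp, event type (health_status or exec_died), the 64-hex container ID, then the parenthesized label dump. A parsing sketch for these lines; only the leading image/name fields are relied on, since label order varies between events:

    import re

    EVENT_RE = re.compile(
        r"podman\[(?P<pid>\d+)\]: (?P<ts>[\d-]+ [\d:.]+ \+0000 UTC) "
        r"m=\+[\d.]+ container (?P<event>health_status|exec_died) "
        r"(?P<cid>[0-9a-f]{64}) \(image=(?P<image>[^,]+), name=(?P<name>[^,]+)"
    )
    HEALTH_RE = re.compile(r"health_status=(?P<status>healthy|unhealthy)")

    def parse_event(line: str):
        # Returns None for non-event lines (systemd, sshd, bare "unhealthy").
        m = EVENT_RE.search(line)
        if not m:
            return None
        h = HEALTH_RE.search(line)
        return {
            "time": m.group("ts"),
            "event": m.group("event"),
            "container": m.group("name"),
            "image": m.group("image"),
            "status": h.group("status") if h else None,
        }

    # Abbreviated sample built from a line in this log:
    sample = ("Feb 1 03:53:38 localhost podman[101995]: 2026-02-01 "
              "08:53:38.111958183 +0000 UTC m=+0.098682850 container "
              "health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c5"
              "9fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/"
              "openstack-collectd:17.1, name=collectd, health_status=healthy)")
    print(parse_event(sample))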
Feb 1 03:53:49 localhost podman[102060]: 2026-02-01 08:53:49.711854416 +0000 UTC m=+0.071690892 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team) Feb 1 03:53:49 localhost podman[102061]: 2026-02-01 08:53:49.76380007 +0000 UTC m=+0.116798066 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, version=17.1.13, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:53:49 localhost podman[102060]: 2026-02-01 08:53:49.789287822 +0000 UTC m=+0.149124328 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:53:49 localhost podman[102061]: 2026-02-01 08:53:49.797163944 +0000 UTC m=+0.150161910 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:53:49 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:53:49 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:53:49 localhost podman[102062]: 2026-02-01 08:53:49.880501872 +0000 UTC m=+0.231923170 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 1 03:53:49 localhost podman[102062]: 2026-02-01 08:53:49.937768209 +0000 UTC m=+0.289189477 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5) Feb 1 03:53:49 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:53:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
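The config_data blobs repeated in these events all mount a host-side JSON (e.g. /var/lib/kolla/config_files/ceilometer-agent-ipmi.json above) to /var/lib/kolla/config_files/config.json inside the container, and KOLLA_CONFIG_STRATEGY=COPY_ALWAYS makes kolla's entrypoint re-stage configuration from the src mount on every start. The actual file contents are not part of this log; the sketch below only illustrates the conventional kolla config.json layout, and the command line in it is an assumption:

    import json

    # Illustrative only -- not the real RHOSP file. Conventional kolla keys:
    # "command" is the process started after staging, "config_files" lists
    # copy operations performed by the entrypoint before exec'ing it.
    example_config = {
        "command": "/usr/bin/ceilometer-polling --polling-namespaces ipmi",
        "config_files": [
            {
                # With COPY_ALWAYS this copy is repeated on every start.
                "source": "/var/lib/kolla/config_files/src/*",
                "dest": "/",
                "merge": True,
                "preserve_properties": True,
            }
        ],
    }
    print(json.dumps(example_config, indent=2))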
Feb 1 03:53:55 localhost podman[102133]: 2026-02-01 08:53:55.714932902 +0000 UTC m=+0.078084898 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, url=https://www.redhat.com) Feb 1 03:53:56 localhost podman[102133]: 2026-02-01 08:53:56.071373952 +0000 UTC m=+0.434525938 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, container_name=nova_migration_target, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:53:56 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:53:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:53:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:53:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:53:57 localhost systemd[1]: tmp-crun.yN8T3w.mount: Deactivated successfully. 
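Two of the checks in this run take a port argument: '/openstack/healthcheck 5672' for nova_compute and '/openstack/healthcheck 6642' for ovn_controller (the OVN southbound DB port). With a port given, the TripleO healthcheck verifies an established connection to that port, which is what flips ovn_controller to unhealthy below even though its exec_died event still fires normally. A rough stand-in for that port test; the real script is stricter and ties the socket to the container's own process:

    import subprocess
    import sys

    def has_established_conn(port: int) -> bool:
        # List established TCP connections; a match on the peer port is a
        # crude proxy for "the agent is connected to its database/broker".
        out = subprocess.run(
            ["ss", "-tn", "state", "established"],
            capture_output=True, text=True, check=True,
        ).stdout
        return any(line.rstrip().endswith(f":{port}")
                   for line in out.splitlines())

    if __name__ == "__main__":
        port = int(sys.argv[1]) if len(sys.argv) > 1 else 6642
        sys.exit(0 if has_established_conn(port) else 1)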
Feb 1 03:53:57 localhost podman[102159]: 2026-02-01 08:53:57.728392992 +0000 UTC m=+0.073759555 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 03:53:57 localhost podman[102159]: 2026-02-01 08:53:57.746313673 +0000 UTC m=+0.091680256 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container) Feb 1 03:53:57 localhost podman[102159]: unhealthy Feb 1 03:53:57 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:53:57 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:53:57 localhost podman[102158]: 2026-02-01 08:53:57.730848927 +0000 UTC m=+0.083387500 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:53:57 localhost podman[102158]: 2026-02-01 08:53:57.814714882 +0000 UTC m=+0.167253505 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com) Feb 1 03:53:57 localhost podman[102158]: unhealthy Feb 1 03:53:57 localhost podman[102157]: 2026-02-01 08:53:57.823368837 +0000 UTC m=+0.176970272 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, architecture=x86_64, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z) Feb 1 03:53:57 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:53:57 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
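The failure pattern above belongs to the transient unit, not the container: podman healthcheck run exits 1, podman emits the bare "unhealthy" line, and systemd marks the one-shot service failed with result 'exit-code'. The containers themselves ('restart': 'always' in config_data) keep running with their recorded health state flipped, which can be confirmed afterwards. A sketch, trying both field spellings used across podman versions:

    import json
    import subprocess

    def health_state(container: str) -> str:
        # "podman inspect" returns a JSON array with one object per name.
        raw = subprocess.run(
            ["podman", "inspect", container],
            capture_output=True, text=True, check=True,
        ).stdout
        state = json.loads(raw)[0].get("State", {})
        # Newer podman exposes Docker-style "Health"; older uses "Healthcheck".
        health = state.get("Health") or state.get("Healthcheck") or {}
        return health.get("Status", "unknown")

    print(health_state("ovn_metadata_agent"))  # e.g. "unhealthy"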
Feb 1 03:53:58 localhost podman[102157]: 2026-02-01 08:53:58.046377263 +0000 UTC m=+0.399978698 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 1 03:53:58 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
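The rounds run on a 30-second cadence per container: metrics_qdr checks land at 08:53:27 and 08:53:57 UTC, collectd at 08:53:38 and 08:54:08, and the "Started ... healthcheck run" lines just above open the next round. A self-contained sketch of that arithmetic, using timestamps copied from this log (truncated to whole seconds):

    from collections import defaultdict
    from datetime import datetime

    events = [
        ("metrics_qdr", "2026-02-01 08:53:27"),
        ("collectd", "2026-02-01 08:53:38"),
        ("metrics_qdr", "2026-02-01 08:53:57"),
        ("collectd", "2026-02-01 08:54:08"),
    ]

    by_name = defaultdict(list)
    for name, ts in events:
        by_name[name].append(datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"))

    for name, times in by_name.items():
        # Gap between consecutive checks of the same container.
        gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
        print(name, gaps)  # both print [30.0]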
Feb 1 03:54:08 localhost podman[102225]: 2026-02-01 08:54:08.711056603 +0000 UTC m=+0.067977547 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, container_name=collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.13, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container) Feb 1 03:54:08 localhost systemd[1]: tmp-crun.xotXve.mount: Deactivated successfully. 
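The tmp-crun.XXXXXX.mount lines scattered through this run (xdLVD2, YGEJQS, yN8T3w, 80gbU5, xotXve) are short-lived mounts the crun runtime creates per exec; the random six-character suffix makes every unit name unique, so their "Deactivated successfully" messages are routine cleanup noise. A filtering sketch for reading logs like this one with those lines dropped; the input path is an assumption:

    import re

    NOISE = re.compile(
        r"systemd\[1\]: tmp-crun\.[A-Za-z0-9]{6}\.mount: "
        r"Deactivated successfully\."
    )

    def interesting(line: str) -> bool:
        # Keep everything except the per-exec crun mount cleanup records.
        return not NOISE.search(line)

    with open("/var/log/messages") as fh:  # path assumed; adjust to your syslog
        for line in fh:
            if interesting(line):
                print(line, end="")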
Feb 1 03:54:08 localhost podman[102226]: 2026-02-01 08:54:08.739038172 +0000 UTC m=+0.087995053 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:54:08 localhost podman[102232]: 2026-02-01 08:54:08.765806103 +0000 UTC m=+0.113332779 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:54:08 localhost podman[102226]: 2026-02-01 08:54:08.771472327 +0000 UTC m=+0.120429208 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step5, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container) Feb 1 03:54:08 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully. 
Feb 1 03:54:08 localhost podman[102225]: 2026-02-01 08:54:08.794298998 +0000 UTC m=+0.151219932 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:54:08 localhost podman[102232]: 2026-02-01 08:54:08.802318083 +0000 UTC m=+0.149844759 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, container_name=iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.expose-services=) Feb 1 03:54:08 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:54:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:54:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:54:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:54:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:54:20 localhost podman[102366]: 2026-02-01 08:54:20.736793121 +0000 UTC m=+0.093984736 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:54:20 localhost podman[102368]: 2026-02-01 08:54:20.789947372 +0000 UTC m=+0.142644239 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, release=1766032510, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:54:20 localhost systemd[1]: tmp-crun.AJgYB0.mount: Deactivated successfully. 
Feb 1 03:54:20 localhost podman[102367]: 2026-02-01 08:54:20.838948786 +0000 UTC m=+0.193902642 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container) Feb 1 03:54:20 localhost podman[102367]: 2026-02-01 08:54:20.848408516 +0000 UTC m=+0.203362372 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:54:20 localhost podman[102366]: 2026-02-01 08:54:20.857914638 +0000 UTC m=+0.215106253 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z) Feb 1 03:54:20 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:54:20 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:54:20 localhost podman[102368]: 2026-02-01 08:54:20.899256647 +0000 UTC m=+0.251953554 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:54:20 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:54:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:54:26 localhost podman[102439]: 2026-02-01 08:54:26.716686736 +0000 UTC m=+0.078047497 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, version=17.1.13, release=1766032510, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 1 03:54:27 localhost podman[102439]: 2026-02-01 08:54:27.178504391 +0000 UTC m=+0.539865172 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:54:27 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:54:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:54:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:54:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
Feb 1 03:54:28 localhost podman[102464]: 2026-02-01 08:54:28.731081706 +0000 UTC m=+0.081489692 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 1 03:54:28 localhost podman[102464]: 2026-02-01 08:54:28.748534671 +0000 UTC m=+0.098942657 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z) Feb 1 03:54:28 localhost podman[102464]: unhealthy Feb 1 03:54:28 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:54:28 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:54:28 localhost podman[102463]: 2026-02-01 08:54:28.834347815 +0000 UTC m=+0.188975040 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 1 03:54:28 localhost podman[102462]: 2026-02-01 08:54:28.71069673 +0000 UTC m=+0.071828466 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:54:28 localhost podman[102463]: 2026-02-01 08:54:28.877431357 +0000 
UTC m=+0.232058552 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:54:28 localhost podman[102463]: unhealthy Feb 1 03:54:28 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:54:28 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:54:28 localhost podman[102462]: 2026-02-01 08:54:28.978603254 +0000 UTC m=+0.339734970 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd) Feb 1 03:54:28 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:54:39 localhost systemd[1]: tmp-crun.rU6l1d.mount: Deactivated successfully. 
Feb 1 03:54:39 localhost podman[102531]: 2026-02-01 08:54:39.784911739 +0000 UTC m=+0.135195080 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 1 03:54:39 localhost podman[102531]: 2026-02-01 08:54:39.841520586 +0000 UTC m=+0.191803927 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, release=1766032510, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:54:39 localhost podman[102532]: 2026-02-01 08:54:39.847690636 +0000 UTC m=+0.193227031 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64)
Feb 1 03:54:39 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Deactivated successfully.
Feb 1 03:54:39 localhost podman[102530]: 2026-02-01 08:54:39.759082826 +0000 UTC m=+0.112384640 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, vendor=Red Hat, Inc.) 
Feb 1 03:54:39 localhost podman[102532]: 2026-02-01 08:54:39.887402395 +0000 UTC m=+0.232938720 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.buildah.version=1.41.5, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:54:39 localhost podman[102530]: 2026-02-01 08:54:39.893705829 +0000 UTC m=+0.247007693 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 1 03:54:39 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully.
Feb 1 03:54:39 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully.
Feb 1 03:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.
Feb 1 03:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.
Feb 1 03:54:51 localhost systemd[1]: tmp-crun.qteEAT.mount: Deactivated successfully.
Feb 1 03:54:51 localhost podman[102597]: 2026-02-01 08:54:51.71888909 +0000 UTC m=+0.078088218 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Feb 1 03:54:51 localhost podman[102597]: 2026-02-01 08:54:51.724293026 +0000 UTC m=+0.083492154 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true)
Feb 1 03:54:51 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully.
Feb 1 03:54:51 localhost podman[102603]: 2026-02-01 08:54:51.805296343 +0000 UTC m=+0.157972020 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z)
Feb 1 03:54:51 localhost podman[102603]: 2026-02-01 08:54:51.828429413 +0000 UTC m=+0.181105060 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:54:51 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully.
Feb 1 03:54:51 localhost podman[102596]: 2026-02-01 08:54:51.905635853 +0000 UTC m=+0.268901625 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64)
Feb 1 03:54:51 localhost podman[102596]: 2026-02-01 08:54:51.929923298 +0000 UTC m=+0.293189090 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_compute)
Feb 1 03:54:51 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully.
Feb 1 03:54:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:54:57 localhost podman[102669]: 2026-02-01 08:54:57.719322577 +0000 UTC m=+0.081094810 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510)
Feb 1 03:54:58 localhost podman[102669]: 2026-02-01 08:54:58.048956825 +0000 UTC m=+0.410728978 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public)
Feb 1 03:54:58 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:54:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:54:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:54:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:54:59 localhost systemd[1]: tmp-crun.pKP2zC.mount: Deactivated successfully.
Feb 1 03:54:59 localhost podman[102693]: 2026-02-01 08:54:59.729735716 +0000 UTC m=+0.085876307 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, managed_by=tripleo_ansible)
Feb 1 03:54:59 localhost systemd[1]: tmp-crun.FaZarY.mount: Deactivated successfully.
Feb 1 03:54:59 localhost podman[102695]: 2026-02-01 08:54:59.785431296 +0000 UTC m=+0.136122820 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, container_name=ovn_controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:54:59 localhost podman[102695]: 2026-02-01 08:54:59.823416681 +0000 UTC m=+0.174108155 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, distribution-scope=public, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 1 03:54:59 localhost podman[102695]: unhealthy
Feb 1 03:54:59 localhost podman[102694]: 2026-02-01 08:54:59.83513466 +0000 UTC m=+0.187406943 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:54:59 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:54:59 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:54:59 localhost podman[102694]: 2026-02-01 08:54:59.870872128 +0000 UTC m=+0.223144431 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 1 03:54:59 localhost podman[102694]: unhealthy
Feb 1 03:54:59 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:54:59 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:54:59 localhost podman[102693]: 2026-02-01 08:54:59.955383841 +0000 UTC m=+0.311524512 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 03:54:59 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:55:09 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 03:55:09 localhost recover_tripleo_nova_virtqemud[102763]: 61284
Feb 1 03:55:09 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 03:55:09 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 03:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:55:10 localhost podman[102765]: 2026-02-01 08:55:10.732087383 +0000 UTC m=+0.085370521 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc.)
Feb 1 03:55:10 localhost systemd[1]: tmp-crun.4YyTS3.mount: Deactivated successfully.
Feb 1 03:55:10 localhost podman[102764]: 2026-02-01 08:55:10.788943848 +0000 UTC m=+0.143000621 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1)
Feb 1 03:55:10 localhost podman[102765]: 2026-02-01 08:55:10.80626504 +0000 UTC m=+0.159548138 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container)
Feb 1 03:55:10 localhost podman[102765]: unhealthy
Feb 1 03:55:10 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:55:10 localhost systemd[1]:
5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. Feb 1 03:55:10 localhost podman[102764]: 2026-02-01 08:55:10.823798728 +0000 UTC m=+0.177855501 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 1 03:55:10 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:55:10 localhost podman[102766]: 2026-02-01 08:55:10.895214019 +0000 UTC m=+0.243140443 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=iscsid, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 1 03:55:10 localhost podman[102766]: 2026-02-01 08:55:10.909173488 +0000 UTC m=+0.257099892 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:55:10 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:55:12 localhost sshd[102822]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:55:15 localhost systemd[1]: tmp-crun.RKesAT.mount: Deactivated successfully. 
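Each podman event above embeds the container's full TripleO definition as a config_data=... field, printed as a Python dict literal: environment, the healthcheck test command, and volumes in src:dst[:options] form, where ro is read-only, z requests a shared SELinux relabel, and shared sets bind propagation. Because the blob is a valid literal, it can be recovered mechanically. A minimal sketch, assuming one event has been saved to a file and contains a single config_data field with balanced braces:

    import ast

    def extract_config_data(entry: str) -> dict:
        """Slice the balanced {...} after 'config_data=' and parse it."""
        start = entry.index("config_data=") + len("config_data=")
        depth = 0
        for i, ch in enumerate(entry[start:], start):
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:
                    # Safe for literals like these: only str/int/bool/list/dict.
                    return ast.literal_eval(entry[start : i + 1])
        raise ValueError("unbalanced config_data literal")

    # Hypothetical usage against one saved iscsid event from this log:
    cfg = extract_config_data(open("iscsid_event.txt").read())
    print(cfg["healthcheck"]["test"])   # -> /openstack/healthcheck
    print(cfg["volumes"][0])            # -> /etc/hosts:/etc/hosts:ro

ast.literal_eval is the right tool here rather than eval: the blob comes from a log and should never be executed, only parsed as data.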
Feb 1 03:55:15 localhost podman[102927]: 2026-02-01 08:55:15.284060922 +0000 UTC m=+0.099193965 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, vcs-type=git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhceph, build-date=2025-12-08T17:28:53Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 03:55:15 localhost podman[102927]: 2026-02-01 08:55:15.403452727 +0000 UTC m=+0.218585770 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, ceph=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=) Feb 1 03:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. 
Feb 1 03:55:22 localhost podman[103073]: 2026-02-01 08:55:22.734807438 +0000 UTC m=+0.092973156 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:55:22 localhost podman[103074]: 2026-02-01 08:55:22.779759997 +0000 UTC m=+0.134212560 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 1 03:55:22 localhost podman[103072]: 2026-02-01 08:55:22.836468328 +0000 UTC m=+0.195419310 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public) Feb 1 03:55:22 localhost podman[103074]: 2026-02-01 08:55:22.841405739 +0000 UTC m=+0.195858282 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-ipmi) Feb 1 03:55:22 localhost podman[103073]: 2026-02-01 08:55:22.849943751 +0000 UTC m=+0.208109469 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:55:22 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:55:22 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
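Note that the failing transient units in this log are named after the 64-character container ID, not the container name, so a line like "f3e307f3....service: Failed with result 'exit-code'" does not by itself say which agent is unhealthy. The health_status and exec_died events carry both the ID and the name= label, which lets the failures be attributed. A sketch under that assumption; the helper names are illustrative:

    import re

    # Any podman container event: "container <verb> <64-hex id> (image=..., name=<name>, ..."
    EVENT = re.compile(r"container \w+ ([0-9a-f]{64}) \(image=[^,]+, name=([^,]+)")
    # systemd failure lines for the transient healthcheck units.
    FAIL = re.compile(r"([0-9a-f]{64})\.service: Failed with result")

    def attribute_failures(log_text: str) -> dict[str, str]:
        """Map each failed unit's container ID to its container name."""
        names = dict(EVENT.findall(log_text))
        return {cid: names.get(cid, "<unknown>") for cid in FAIL.findall(log_text)}

Applied to this section it would resolve f3e307f3... to ovn_metadata_agent, f8afd85c... to ovn_controller, and 5995785e... to nova_compute.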
Feb 1 03:55:22 localhost podman[103072]: 2026-02-01 08:55:22.868208861 +0000 UTC m=+0.227159803 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:55:22 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:55:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:55:28 localhost podman[103145]: 2026-02-01 08:55:28.723267452 +0000 UTC m=+0.082200073 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 1 03:55:29 localhost podman[103145]: 2026-02-01 08:55:29.070052387 +0000 UTC m=+0.428984928 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 1 03:55:29 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:55:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:55:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:55:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:55:30 localhost systemd[1]: tmp-crun.l9Wiso.mount: Deactivated successfully. 
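The config_data blobs also record startup ordering: 'depends_on' names systemd targets that must be up first (nova_compute and ceilometer_agent_compute wait on tripleo_nova_libvirt.target, ovn_controller on openvswitch.service), while 'start_order' sequences containers within one deployment step (the config_id values tripleo_step1 through tripleo_step5 in these entries). An illustrative ordering pass over such definitions, ignoring the per-step grouping for brevity; this is a sketch, not the tripleo_ansible implementation:

    # Hypothetical subset of the definitions logged above.
    defs = {
        "metrics_qdr":  {"config_id": "tripleo_step1", "start_order": 1},
        "iscsid":       {"config_id": "tripleo_step3", "start_order": 2},
        "nova_compute": {"config_id": "tripleo_step5", "start_order": 3,
                         "depends_on": ["tripleo_nova_libvirt.target"]},
    }

    # Order by (step, start_order); depends_on is shown as an "after" hint.
    for name, d in sorted(defs.items(),
                          key=lambda kv: (kv[1]["config_id"], kv[1].get("start_order", 0))):
        after = ", ".join(d.get("depends_on", [])) or "-"
        print(f"{d['config_id']}/{d.get('start_order', 0)}: {name} (after: {after})")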
Feb 1 03:55:30 localhost podman[103169]: 2026-02-01 08:55:30.71543409 +0000 UTC m=+0.075161037 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13) Feb 1 03:55:30 localhost podman[103168]: 2026-02-01 08:55:30.775068721 +0000 UTC m=+0.134487769 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 1 03:55:30 localhost podman[103170]: 2026-02-01 08:55:30.740739708 +0000 UTC m=+0.094650006 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, tcib_managed=true) Feb 1 03:55:30 localhost podman[103169]: 2026-02-01 08:55:30.804347379 +0000 UTC m=+0.164074336 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2026-01-12T22:56:19Z, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:55:30 localhost podman[103169]: unhealthy Feb 1 03:55:30 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:55:30 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:55:30 localhost podman[103170]: 2026-02-01 08:55:30.820905678 +0000 UTC m=+0.174816006 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 1 03:55:30 localhost podman[103170]: unhealthy Feb 1 03:55:30 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:55:30 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
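Most containers above use the bare '/openstack/healthcheck' test, but nova_compute passes 5672 and ovn_controller passes 6642, ports that correspond to the services these agents must reach (the AMQP broker and the OVN southbound database, respectively). When that check fails, the run ends exactly as logged here: "unhealthy", then "status=1/FAILURE". A simplified stand-in for the port-checking behaviour; the real TripleO script inspects the service's own connection state, whereas this sketch merely probes that the port answers:

    import socket
    import sys

    def probe(port: int, host: str = "127.0.0.1", timeout: float = 3.0) -> int:
        """Return 0 (healthy) if the port accepts a TCP connection, else 1."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return 0
        except OSError:
            return 1  # nonzero exit -> systemd logs status=1/FAILURE, as above

    if __name__ == "__main__":
        # e.g. `python probe.py 6642` to mimic the ovn_controller test argument
        sys.exit(probe(int(sys.argv[1]) if len(sys.argv) > 1 else 5672))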
Feb 1 03:55:30 localhost podman[103168]: 2026-02-01 08:55:30.970170259 +0000 UTC m=+0.329589287 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z) Feb 1 03:55:30 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:55:41 localhost systemd[1]: tmp-crun.k7azYB.mount: Deactivated successfully. 
Feb 1 03:55:41 localhost podman[103241]: 2026-02-01 08:55:41.743388182 +0000 UTC m=+0.093301875 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3) Feb 1 03:55:41 localhost podman[103239]: 2026-02-01 08:55:41.718292741 +0000 UTC m=+0.078172880 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=collectd, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Feb 1 03:55:41 localhost podman[103241]: 2026-02-01 08:55:41.78049969 +0000 UTC m=+0.130413373 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Feb 1 03:55:41 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:55:41 localhost podman[103240]: 2026-02-01 08:55:41.833191608 +0000 UTC m=+0.186474084 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public) Feb 1 03:55:41 localhost podman[103239]: 2026-02-01 08:55:41.853910695 +0000 UTC m=+0.213790824 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 1 03:55:41 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:55:41 localhost podman[103240]: 2026-02-01 08:55:41.909281343 +0000 UTC m=+0.262563819 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13) Feb 1 03:55:41 localhost podman[103240]: unhealthy Feb 1 03:55:41 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:55:41 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. Feb 1 03:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:55:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:55:53 localhost podman[103297]: 2026-02-01 08:55:53.728229793 +0000 UTC m=+0.081185133 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Feb 1 03:55:53 localhost podman[103297]: 2026-02-01 08:55:53.78742912 +0000 UTC m=+0.140384420 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 1 03:55:53 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
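The failing units in this log are named after the full container ID (for example f3e307f3...service belongs to the container whose labels show container_name=ovn_metadata_agent). A short sketch, assuming podman is available on the host, that maps those hash-named units back to readable container names:

import json
import subprocess

# List all containers as JSON; "Id" is the full hash that systemd uses
# as "<Id>.service" for the transient healthcheck units in this log,
# and "Names" holds the human-readable name (a list in podman's JSON).
out = subprocess.run(
    ["podman", "ps", "--all", "--format", "json"],
    capture_output=True, text=True, check=True,
).stdout

for ctr in json.loads(out):
    print(ctr["Id"][:12], "->", ctr["Names"][0])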
Feb 1 03:55:53 localhost podman[103295]: 2026-02-01 08:55:53.787250744 +0000 UTC m=+0.145800756 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5) Feb 1 03:55:53 localhost podman[103296]: 2026-02-01 08:55:53.845183963 +0000 UTC m=+0.200509746 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 1 03:55:53 localhost podman[103296]: 2026-02-01 08:55:53.856321505 +0000 UTC m=+0.211647278 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 1 03:55:53 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:55:53 localhost podman[103295]: 2026-02-01 08:55:53.872576654 +0000 UTC m=+0.231126626 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:55:53 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. Feb 1 03:55:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13153 DF PROTO=TCP SPT=53526 DPT=9882 SEQ=2486765448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE36D640000000001030307) Feb 1 03:55:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13154 DF PROTO=TCP SPT=53526 DPT=9882 SEQ=2486765448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE371780000000001030307) Feb 1 03:55:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13155 DF PROTO=TCP SPT=53526 DPT=9882 SEQ=2486765448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE379780000000001030307) Feb 1 03:55:57 localhost sshd[103367]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:55:57 localhost systemd-logind[759]: New session 36 of user zuul. Feb 1 03:55:57 localhost systemd[1]: Started Session 36 of User zuul. Feb 1 03:55:58 localhost python3.9[103462]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 03:55:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:55:59 localhost podman[103557]: 2026-02-01 08:55:59.275675876 +0000 UTC m=+0.092076606 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, container_name=nova_migration_target, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:55:59 localhost python3.9[103556]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 03:55:59 localhost podman[103557]: 2026-02-01 08:55:59.642511896 +0000 UTC m=+0.458912626 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, distribution-scope=public, container_name=nova_migration_target, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1766032510, build-date=2026-01-12T23:32:04Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:55:59 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. 
Feb 1 03:56:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26157 DF PROTO=TCP SPT=34362 DPT=9101 SEQ=2178277529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE385390000000001030307) Feb 1 03:56:00 localhost python3.9[103673]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 03:56:00 localhost python3.9[103767]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 03:56:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26158 DF PROTO=TCP SPT=34362 DPT=9101 SEQ=2178277529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE389390000000001030307) Feb 1 03:56:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13156 DF PROTO=TCP SPT=53526 DPT=9882 SEQ=2486765448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE389390000000001030307) Feb 1 03:56:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:56:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:56:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:56:01 localhost systemd[1]: tmp-crun.WOpWm5.mount: Deactivated successfully. Feb 1 03:56:01 localhost systemd[1]: tmp-crun.vOrdk5.mount: Deactivated successfully. 
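The ansible-ansible.legacy.command records above run a configparser one-liner against the puppet-generated config files to read each service's hostname. An expanded, readable form of that same one-liner (paths copied from the log; strict=False mirrors the logged command and tolerates duplicate sections or options in generated files):

import configparser

for path in (
    "/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf",
    "/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf",
):
    parser = configparser.ConfigParser(strict=False)
    parser.read(path)  # silently skips files that do not exist
    # The logged one-liner prints p['DEFAULT']['host']; .get avoids a
    # KeyError when the option is absent.
    print(path, "->", parser["DEFAULT"].get("host"))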
Feb 1 03:56:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16014 DF PROTO=TCP SPT=51598 DPT=9105 SEQ=3322006709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE38AE00000000001030307) Feb 1 03:56:01 localhost podman[103861]: 2026-02-01 08:56:01.504598701 +0000 UTC m=+0.146552639 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:56:01 localhost podman[103862]: 2026-02-01 08:56:01.460430125 +0000 UTC m=+0.099884386 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 03:56:01 localhost python3.9[103860]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 03:56:01 localhost podman[103862]: 2026-02-01 08:56:01.542558147 +0000 UTC m=+0.182012428 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Feb 1 03:56:01 localhost podman[103862]: unhealthy Feb 1 03:56:01 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:56:01 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 03:56:01 localhost podman[103863]: 2026-02-01 08:56:01.600521545 +0000 UTC m=+0.236726306 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public) Feb 1 03:56:01 localhost podman[103863]: 2026-02-01 08:56:01.639497642 +0000 UTC m=+0.275702413 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:56:01 localhost podman[103863]: unhealthy Feb 1 03:56:01 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:56:01 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:56:01 localhost podman[103861]: 2026-02-01 08:56:01.704363012 +0000 UTC m=+0.346316910 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:14Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step1) Feb 1 03:56:01 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:56:02 localhost python3.9[104023]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Feb 1 03:56:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16015 DF PROTO=TCP SPT=51598 DPT=9105 SEQ=3322006709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE38EF80000000001030307) Feb 1 03:56:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26159 DF PROTO=TCP SPT=34362 DPT=9101 SEQ=2178277529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE391380000000001030307) Feb 1 03:56:04 localhost python3.9[104113]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 03:56:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16016 DF PROTO=TCP SPT=51598 DPT=9105 SEQ=3322006709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE396F80000000001030307) Feb 1 03:56:04 localhost python3.9[104205]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Feb 1 03:56:05 localhost python3.9[104295]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 03:56:06 localhost python3.9[104343]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 03:56:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26160 DF PROTO=TCP SPT=34362 DPT=9101 SEQ=2178277529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A0F80000000001030307) Feb 1 03:56:07 localhost systemd[1]: session-36.scope: Deactivated successfully. Feb 1 03:56:07 localhost systemd[1]: session-36.scope: Consumed 4.839s CPU time. Feb 1 03:56:07 localhost systemd-logind[759]: Session 36 logged out. Waiting for processes to exit. Feb 1 03:56:07 localhost systemd-logind[759]: Removed session 36. 
Feb 1 03:56:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17792 DF PROTO=TCP SPT=40122 DPT=9100 SEQ=1193686031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A1370000000001030307) Feb 1 03:56:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38069 DF PROTO=TCP SPT=58598 DPT=9102 SEQ=318061118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A2D10000000001030307) Feb 1 03:56:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17793 DF PROTO=TCP SPT=40122 DPT=9100 SEQ=1193686031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A5380000000001030307) Feb 1 03:56:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16017 DF PROTO=TCP SPT=51598 DPT=9105 SEQ=3322006709 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A6B80000000001030307) Feb 1 03:56:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38070 DF PROTO=TCP SPT=58598 DPT=9102 SEQ=318061118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A6F80000000001030307) Feb 1 03:56:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13157 DF PROTO=TCP SPT=53526 DPT=9882 SEQ=2486765448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3A9B80000000001030307) Feb 1 03:56:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17794 DF PROTO=TCP SPT=40122 DPT=9100 SEQ=1193686031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3AD380000000001030307) Feb 1 03:56:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38071 DF PROTO=TCP SPT=58598 DPT=9102 SEQ=318061118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3AEF80000000001030307) Feb 1 03:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. 
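[Editor's note] The repeated kernel "DROPPING:" lines above are firewall LOG-rule output on br-ex: the same SEQ values recur for ports 9100/9101/9102/9105/9882, i.e. these are TCP SYN retransmissions of blocked monitoring scrapes, not distinct connection attempts. A parsing sketch to tally them per flow; the field names (SRC, DST, SPT, DPT) mirror the kernel log format above, and the journalctl invocation is an assumption about where this log is read back from.

    #!/usr/bin/env python3
    # Sketch: count "DROPPING:" firewall log entries per (src, dst, dport).
    import re
    import subprocess
    from collections import Counter

    PATTERN = re.compile(
        r"DROPPING:.*?SRC=(?P<src>\S+) DST=(?P<dst>\S+).*?"
        r"SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+)"
    )

    # Assumption: the kernel ring buffer is available via journalctl -k.
    log = subprocess.run(
        ["journalctl", "-k", "--grep", "DROPPING", "--no-pager"],
        capture_output=True, text=True,
    ).stdout

    drops = Counter()
    for match in PATTERN.finditer(log):
        drops[(match["src"], match["dst"], match["dpt"])] += 1

    for (src, dst, dpt), count in drops.most_common():
        print(f"{src} -> {dst}:{dpt}  dropped {count} SYN(s)")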
Feb 1 03:56:12 localhost podman[104361]: 2026-02-01 08:56:12.738049582 +0000 UTC m=+0.088663773 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:56:12 localhost podman[104361]: 2026-02-01 08:56:12.774431939 +0000 UTC m=+0.125046100 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vcs-type=git, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:56:12 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:56:12 localhost podman[104360]: 2026-02-01 08:56:12.791517453 +0000 UTC m=+0.146297541 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:56:12 localhost podman[104360]: 2026-02-01 08:56:12.841340793 +0000 UTC m=+0.196120861 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 1 03:56:12 localhost systemd[1]: tmp-crun.jPlWHq.mount: Deactivated successfully. 
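[Editor's note] The nova_compute config_data above shows its healthcheck invoked as '/openstack/healthcheck 5672', i.e. with a port argument (5672 is the AMQP port). A rough stand-in sketch: treating the check as a plain TCP connect test on that port is an assumption, since the real /openstack/healthcheck script does more than probe a socket.

    #!/usr/bin/env python3
    # Rough stand-in for the healthcheck's port probe (assumption: the real
    # /openstack/healthcheck script is not just a TCP connect test).
    import socket
    import sys

    def port_open(host: str, port: int, timeout: float = 5.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # 5672 comes from the 'test': '/openstack/healthcheck 5672' label above.
    port = int(sys.argv[1]) if len(sys.argv) > 1 else 5672
    sys.exit(0 if port_open("127.0.0.1", port) else 1)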
Feb 1 03:56:12 localhost podman[104360]: unhealthy Feb 1 03:56:12 localhost podman[104359]: 2026-02-01 08:56:12.858401636 +0000 UTC m=+0.214261547 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Feb 1 03:56:12 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:56:12 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
Feb 1 03:56:12 localhost podman[104359]: 2026-02-01 08:56:12.869326082 +0000 UTC m=+0.225185993 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, config_id=tripleo_step3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1) Feb 1 03:56:12 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
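[Editor's note] Each health_status/exec_died event above dumps the container's full label set, including the TripleO config_data — note it is a Python-literal dict (single quotes, True/False), not JSON. A sketch recovering it live from a container instead of scraping the journal; the Go-template label lookup is standard podman inspect syntax, and "collectd" is just one of the container names from this log.

    #!/usr/bin/env python3
    # Sketch: read the tripleo config_data label back from a container.
    # The label value is a Python literal, so ast.literal_eval parses it.
    import ast
    import subprocess

    def config_data(container: str) -> dict:
        raw = subprocess.run(
            ["podman", "inspect", "--format",
             '{{ index .Config.Labels "config_data" }}', container],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        return ast.literal_eval(raw)

    cfg = config_data("collectd")
    print(cfg["healthcheck"]["test"])   # -> /openstack/healthcheck
    print(cfg["volumes"][0])            # first bind mount from the label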
Feb 1 03:56:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17795 DF PROTO=TCP SPT=40122 DPT=9100 SEQ=1193686031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3BCF80000000001030307) Feb 1 03:56:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38072 DF PROTO=TCP SPT=58598 DPT=9102 SEQ=318061118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3BEB80000000001030307) Feb 1 03:56:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26161 DF PROTO=TCP SPT=34362 DPT=9101 SEQ=2178277529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3C1B90000000001030307) Feb 1 03:56:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:56:17 localhost recover_tripleo_nova_virtqemud[104434]: 61284 Feb 1 03:56:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:56:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:56:22 localhost sshd[104496]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:56:22 localhost systemd-logind[759]: New session 37 of user zuul. Feb 1 03:56:22 localhost systemd[1]: Started Session 37 of User zuul. Feb 1 03:56:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17796 DF PROTO=TCP SPT=40122 DPT=9100 SEQ=1193686031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3DDB80000000001030307) Feb 1 03:56:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38073 DF PROTO=TCP SPT=58598 DPT=9102 SEQ=318061118 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3DFB80000000001030307) Feb 1 03:56:23 localhost python3.9[104591]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 03:56:23 localhost systemd[1]: Reloading. Feb 1 03:56:23 localhost systemd-sysv-generator[104621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:56:23 localhost systemd-rc-local-generator[104618]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:56:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:56:23 localhost systemd[1]: tmp-crun.7xy5Fo.mount: Deactivated successfully. 
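[Editor's note] systemd warns above (and again later for insights-client.service) that insights-client-boot.service still uses the deprecated MemoryLimit= directive. The supported fix is a drop-in that clears it and sets MemoryMax=, leaving the vendor unit file untouched. A sketch, assuming root access; the 512M value is a placeholder and should match whatever MemoryLimit= the vendor unit currently sets.

    #!/usr/bin/env python3
    # Sketch: replace deprecated MemoryLimit= via a drop-in (run as root).
    import pathlib
    import subprocess

    unit = "insights-client-boot.service"
    dropin_dir = pathlib.Path(f"/etc/systemd/system/{unit}.d")
    dropin_dir.mkdir(parents=True, exist_ok=True)
    (dropin_dir / "memorymax.conf").write_text(
        "[Service]\n"
        "MemoryLimit=\n"    # empty assignment clears the deprecated directive
        "MemoryMax=512M\n"  # assumed value; copy the original limit here
    )
    subprocess.run(["systemctl", "daemon-reload"], check=True)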
Feb 1 03:56:23 localhost podman[104628]: 2026-02-01 08:56:23.915393381 +0000 UTC m=+0.077746888 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc.) Feb 1 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. 
Feb 1 03:56:23 localhost podman[104628]: 2026-02-01 08:56:23.949694783 +0000 UTC m=+0.112048270 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git) Feb 1 03:56:23 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. 
Feb 1 03:56:24 localhost podman[104662]: 2026-02-01 08:56:24.012951515 +0000 UTC m=+0.075515089 container health_status 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, architecture=x86_64, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 1 03:56:24 localhost podman[104662]: 2026-02-01 08:56:24.057851733 +0000 UTC m=+0.120415317 container exec_died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com) Feb 1 03:56:24 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Deactivated successfully. 
Feb 1 03:56:24 localhost podman[104663]: 2026-02-01 08:56:24.058605646 +0000 UTC m=+0.119284211 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:56:24 localhost podman[104663]: 2026-02-01 08:56:24.138584291 +0000 UTC m=+0.199262876 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-cron-container) Feb 1 03:56:24 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:56:24 localhost python3.9[104789]: ansible-ansible.builtin.service_facts Invoked Feb 1 03:56:24 localhost network[104806]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 03:56:24 localhost network[104807]: 'network-scripts' will be removed from distribution in near future. Feb 1 03:56:24 localhost network[104808]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 03:56:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42304 DF PROTO=TCP SPT=40696 DPT=9882 SEQ=12702202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3E6B80000000001030307) Feb 1 03:56:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
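[Editor's note] The network[...] lines above are the deprecated 'network' SysV service (network-scripts) warning that it will be removed and advising a switch to NetworkManager. A quick audit sketch: it checks whether NetworkManager is active and lists any remaining legacy ifcfg files; the paths are the RHEL defaults, assumed unchanged on this node.

    #!/usr/bin/env python3
    # Sketch: audit the deprecated network-scripts setup flagged above.
    import glob
    import subprocess

    nm_state = subprocess.run(
        ["systemctl", "is-active", "NetworkManager"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"NetworkManager: {nm_state}")

    # Legacy configs that would need migrating to NetworkManager profiles.
    for cfg in sorted(glob.glob("/etc/sysconfig/network-scripts/ifcfg-*")):
        print(f"legacy config still present: {cfg}")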
Feb 1 03:56:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42305 DF PROTO=TCP SPT=40696 DPT=9882 SEQ=12702202 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3EEB80000000001030307) Feb 1 03:56:30 localhost python3.9[105005]: ansible-ansible.builtin.service_facts Invoked Feb 1 03:56:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49504 DF PROTO=TCP SPT=36206 DPT=9101 SEQ=579797799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE3FA6A0000000001030307) Feb 1 03:56:30 localhost network[105022]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 03:56:30 localhost network[105023]: 'network-scripts' will be removed from distribution in near future. Feb 1 03:56:30 localhost network[105024]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 03:56:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:56:30 localhost podman[105030]: 2026-02-01 08:56:30.263318095 +0000 UTC m=+0.142717981 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:56:30 localhost podman[105030]: 2026-02-01 08:56:30.617440614 +0000 UTC m=+0.496840510 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 1 03:56:30 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:56:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 03:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:56:31 localhost podman[105098]: 2026-02-01 08:56:31.680422452 +0000 UTC m=+0.087101545 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 03:56:31 localhost podman[105098]: 2026-02-01 08:56:31.729335343 +0000 UTC m=+0.136014416 container exec_died 
f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public) Feb 1 03:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. 
Feb 1 03:56:31 localhost podman[105118]: 2026-02-01 08:56:31.780268196 +0000 UTC m=+0.089517378 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:56:31 localhost podman[105118]: 2026-02-01 08:56:31.797400913 +0000 UTC m=+0.106650085 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git) Feb 1 03:56:31 localhost podman[105118]: unhealthy Feb 1 03:56:31 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:56:31 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:56:31 localhost podman[105098]: unhealthy Feb 1 03:56:31 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:56:31 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:56:31 localhost podman[105136]: 2026-02-01 08:56:31.890204751 +0000 UTC m=+0.139167653 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true) Feb 1 03:56:32 localhost podman[105136]: 2026-02-01 08:56:32.094322656 +0000 UTC m=+0.343285548 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 1 03:56:32 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated 
successfully. Feb 1 03:56:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49506 DF PROTO=TCP SPT=36206 DPT=9101 SEQ=579797799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE406780000000001030307) Feb 1 03:56:34 localhost python3.9[105315]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:56:34 localhost systemd[1]: Reloading. Feb 1 03:56:34 localhost systemd-rc-local-generator[105338]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:56:34 localhost systemd-sysv-generator[105346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:56:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:56:34 localhost systemd[1]: Stopping ceilometer_agent_compute container... Feb 1 03:56:35 localhost sshd[105368]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:56:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49507 DF PROTO=TCP SPT=36206 DPT=9101 SEQ=579797799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE416380000000001030307) Feb 1 03:56:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17797 DF PROTO=TCP SPT=40122 DPT=9100 SEQ=1193686031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE41DB90000000001030307) Feb 1 03:56:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:56:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:56:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:56:43 localhost systemd[1]: tmp-crun.O220ub.mount: Deactivated successfully. 
Feb 1 03:56:43 localhost podman[105377]: 2026-02-01 08:56:43.009979752 +0000 UTC m=+0.098474904 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:56:43 localhost podman[105377]: 2026-02-01 08:56:43.018449992 +0000 UTC m=+0.106945154 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, vcs-type=git, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, io.openshift.expose-services=, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3) Feb 1 03:56:43 localhost systemd[1]: tmp-crun.ySHeiX.mount: Deactivated successfully. Feb 1 03:56:43 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:56:43 localhost podman[105370]: 2026-02-01 08:56:43.032477182 +0000 UTC m=+0.130946750 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 1 03:56:43 localhost podman[105370]: 2026-02-01 08:56:43.045075689 +0000 UTC m=+0.143545297 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:56:43 localhost podman[105370]: unhealthy Feb 1 03:56:43 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:56:43 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
Feb 1 03:56:43 localhost podman[105371]: 2026-02-01 08:56:43.076417931 +0000 UTC m=+0.166945285 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc.) 
Feb 1 03:56:43 localhost podman[105371]: 2026-02-01 08:56:43.097618871 +0000 UTC m=+0.188146235 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, tcib_managed=true, container_name=iscsid, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team) Feb 1 03:56:43 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:56:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3148 DF PROTO=TCP SPT=59440 DPT=9100 SEQ=4136378926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE432390000000001030307) Feb 1 03:56:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49508 DF PROTO=TCP SPT=36206 DPT=9101 SEQ=579797799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE435B80000000001030307) Feb 1 03:56:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3149 DF PROTO=TCP SPT=59440 DPT=9100 SEQ=4136378926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE451B80000000001030307) Feb 1 03:56:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17871 DF PROTO=TCP SPT=49052 DPT=9102 SEQ=3110520854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE453B80000000001030307) Feb 1 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:56:54 localhost podman[105429]: 2026-02-01 08:56:54.224063929 +0000 UTC m=+0.086764164 container health_status 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64) Feb 1 03:56:54 localhost podman[105428]: Error: container 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 is not running Feb 1 03:56:54 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Main process exited, code=exited, status=125/n/a Feb 1 03:56:54 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Failed with result 'exit-code'. Feb 1 03:56:54 localhost podman[105429]: 2026-02-01 08:56:54.253532464 +0000 UTC m=+0.116232659 container exec_died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:56:54 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Deactivated successfully. Feb 1 03:56:54 localhost podman[105460]: 2026-02-01 08:56:54.315535927 +0000 UTC m=+0.083832145 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 1 03:56:54 localhost podman[105460]: 2026-02-01 08:56:54.324944995 +0000 UTC m=+0.093241213 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red 
Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 1 03:56:54 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:56:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=902 DF PROTO=TCP SPT=58990 DPT=9882 SEQ=2579374085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE45BB80000000001030307) Feb 1 03:56:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=903 DF PROTO=TCP SPT=58990 DPT=9882 SEQ=2579374085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE463B80000000001030307) Feb 1 03:57:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31389 DF PROTO=TCP SPT=39324 DPT=9101 SEQ=190709003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE46F9B0000000001030307) Feb 1 03:57:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. 
Feb 1 03:57:00 localhost podman[105487]: 2026-02-01 08:57:00.974319023 +0000 UTC m=+0.085004051 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 1 03:57:01 localhost podman[105487]: 2026-02-01 08:57:01.302611209 +0000 UTC m=+0.413296297 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:57:01 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:57:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:57:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:57:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:57:02 localhost systemd[1]: tmp-crun.kGKLWT.mount: Deactivated successfully. 
Feb 1 03:57:02 localhost podman[105511]: 2026-02-01 08:57:02.704903102 +0000 UTC m=+0.066159152 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com) Feb 1 03:57:02 localhost podman[105512]: 2026-02-01 08:57:02.720083517 +0000 UTC m=+0.077471179 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 1 03:57:02 localhost podman[105513]: 2026-02-01 08:57:02.748053886 +0000 UTC m=+0.105417017 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 1 03:57:02 localhost podman[105512]: 2026-02-01 08:57:02.758237069 +0000 UTC m=+0.115624741 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z) Feb 1 03:57:02 localhost podman[105512]: unhealthy Feb 1 03:57:02 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:57:02 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 03:57:02 localhost podman[105513]: 2026-02-01 08:57:02.782918065 +0000 UTC m=+0.140281206 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller) Feb 1 03:57:02 localhost podman[105513]: unhealthy Feb 1 03:57:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:57:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
Feb 1 03:57:02 localhost podman[105511]: 2026-02-01 08:57:02.907651744 +0000 UTC m=+0.268907844 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, version=17.1.13, container_name=metrics_qdr, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:57:02 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:57:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26975 DF PROTO=TCP SPT=48952 DPT=9105 SEQ=2212905655 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE47BB80000000001030307) Feb 1 03:57:03 localhost systemd[1]: tmp-crun.LCMQeC.mount: Deactivated successfully. 
Feb 1 03:57:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31392 DF PROTO=TCP SPT=39324 DPT=9101 SEQ=190709003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE48B780000000001030307) Feb 1 03:57:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=905 DF PROTO=TCP SPT=58990 DPT=9882 SEQ=2579374085 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE493B80000000001030307) Feb 1 03:57:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:57:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:57:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:57:13 localhost podman[105579]: 2026-02-01 08:57:13.511433585 +0000 UTC m=+0.071711073 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, container_name=iscsid, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 1 03:57:13 localhost podman[105579]: 2026-02-01 08:57:13.523233797 +0000 UTC m=+0.083511315 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:57:13 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. 
Feb 1 03:57:13 localhost podman[105577]: 2026-02-01 08:57:13.557729156 +0000 UTC m=+0.128917858 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, container_name=collectd) Feb 1 03:57:13 localhost podman[105577]: 2026-02-01 08:57:13.568462045 +0000 UTC m=+0.139650757 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, container_name=collectd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, version=17.1.13) Feb 1 03:57:13 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. 
Feb 1 03:57:13 localhost podman[105578]: 2026-02-01 08:57:13.616347686 +0000 UTC m=+0.183238046 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, container_name=nova_compute, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:57:13 localhost podman[105578]: 2026-02-01 08:57:13.666519075 +0000 UTC m=+0.233409475 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_compute, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute) Feb 1 03:57:13 localhost podman[105578]: unhealthy Feb 1 03:57:13 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:57:13 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
Feb 1 03:57:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5641 DF PROTO=TCP SPT=42714 DPT=9100 SEQ=2561254508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4A7780000000001030307) Feb 1 03:57:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31393 DF PROTO=TCP SPT=39324 DPT=9101 SEQ=190709003 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4ABB90000000001030307) Feb 1 03:57:16 localhost podman[105355]: time="2026-02-01T08:57:16Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Feb 1 03:57:16 localhost systemd[1]: libpod-857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.scope: Deactivated successfully. Feb 1 03:57:16 localhost systemd[1]: libpod-857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.scope: Consumed 6.390s CPU time. Feb 1 03:57:16 localhost podman[105355]: 2026-02-01 08:57:16.777883564 +0000 UTC m=+42.088829827 container died 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, 
Inc., container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:57:16 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.timer: Deactivated successfully. Feb 1 03:57:16 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233. Feb 1 03:57:16 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Failed to open /run/systemd/transient/857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: No such file or directory Feb 1 03:57:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233-userdata-shm.mount: Deactivated successfully. Feb 1 03:57:16 localhost systemd[1]: var-lib-containers-storage-overlay-7d44c44bcfe4704b931236c6c12b33c33d5721caf0abe0082648700db793c9ca-merged.mount: Deactivated successfully. Feb 1 03:57:16 localhost podman[105355]: 2026-02-01 08:57:16.832706747 +0000 UTC m=+42.143653010 container cleanup 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, 
com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team) Feb 1 03:57:16 localhost podman[105355]: ceilometer_agent_compute Feb 1 03:57:16 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.timer: Failed to open /run/systemd/transient/857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.timer: No such file or directory Feb 1 03:57:16 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Failed to open /run/systemd/transient/857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: No such file or directory Feb 1 03:57:16 localhost podman[105635]: 2026-02-01 08:57:16.90382986 +0000 UTC m=+0.115565868 container cleanup 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13) Feb 1 03:57:16 localhost systemd[1]: libpod-conmon-857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.scope: Deactivated successfully. Feb 1 03:57:17 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.timer: Failed to open /run/systemd/transient/857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.timer: No such file or directory Feb 1 03:57:17 localhost systemd[1]: 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: Failed to open /run/systemd/transient/857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233.service: No such file or directory Feb 1 03:57:17 localhost podman[105650]: 2026-02-01 08:57:17.014463806 +0000 UTC m=+0.075015404 container cleanup 857e69bc164464828a039d5f099473fe615657340aa94125e713e78ccd2da233 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 1 03:57:17 localhost podman[105650]: ceilometer_agent_compute Feb 1 03:57:17 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Feb 1 03:57:17 localhost systemd[1]: Stopped ceilometer_agent_compute container. Feb 1 03:57:17 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.117s CPU time, no IO. Feb 1 03:57:17 localhost python3.9[105755]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:57:17 localhost systemd[1]: Reloading. Feb 1 03:57:18 localhost systemd-rc-local-generator[105784]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:57:18 localhost systemd-sysv-generator[105787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:57:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:57:18 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... Feb 1 03:57:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:a1:06:ee MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=60280 SEQ=3413246832 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 1 03:57:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5642 DF PROTO=TCP SPT=42714 DPT=9100 SEQ=2561254508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4C7B80000000001030307) Feb 1 03:57:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:57:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:57:24 localhost podman[105888]: Error: container 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 is not running Feb 1 03:57:24 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Main process exited, code=exited, status=125/n/a Feb 1 03:57:24 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Failed with result 'exit-code'. 
Feb 1 03:57:24 localhost podman[105887]: 2026-02-01 08:57:24.54277644 +0000 UTC m=+0.148009134 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:57:24 localhost podman[105887]: 2026-02-01 08:57:24.584870652 +0000 UTC m=+0.190103296 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com) Feb 1 03:57:24 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. Feb 1 03:57:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54927 DF PROTO=TCP SPT=48156 DPT=9882 SEQ=1912977415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4D0F80000000001030307) Feb 1 03:57:25 localhost sshd[105918]: main: sshd: ssh-rsa algorithm is disabled Feb 1 03:57:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54928 DF PROTO=TCP SPT=48156 DPT=9882 SEQ=1912977415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4D8F80000000001030307) Feb 1 03:57:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59288 DF PROTO=TCP SPT=54172 DPT=9101 SEQ=3557695151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4E4CA0000000001030307) Feb 1 03:57:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:57:31 localhost systemd[1]: tmp-crun.7H1PcN.mount: Deactivated successfully. 
Feb 1 03:57:31 localhost podman[105920]: 2026-02-01 08:57:31.4943362 +0000 UTC m=+0.095991337 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 1 03:57:31 localhost podman[105920]: 2026-02-01 08:57:31.880952717 +0000 UTC m=+0.482607854 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 03:57:31 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully. Feb 1 03:57:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59290 DF PROTO=TCP SPT=54172 DPT=9101 SEQ=3557695151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE4F0B90000000001030307) Feb 1 03:57:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7. Feb 1 03:57:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:57:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. 
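Each "Started /usr/bin/podman healthcheck run <id>" line is a systemd transient unit executing the container's configured healthcheck; podman conveys the result through its exit status, which is why a failing check surfaces twice further down, once as podman's bare "unhealthy" and once as systemd's status=1/FAILURE. A small wrapper sketch around that mechanism (the subprocess use is illustrative; the container id is the one from the log):

    import subprocess

    def run_healthcheck(container_id: str) -> bool:
        """Run the container's configured healthcheck via podman.

        `podman healthcheck run` exits 0 when the check passes; on failure
        it exits non-zero and prints "unhealthy", matching the journal
        lines in this excerpt.
        """
        proc = subprocess.run(
            ["podman", "healthcheck", "run", container_id],
            capture_output=True, text=True,
        )
        return proc.returncode == 0

    # healthy = run_healthcheck(
    #     "d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f")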
Feb 1 03:57:33 localhost podman[105941]: 2026-02-01 08:57:33.7647986 +0000 UTC m=+0.121365286 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:57:33 localhost podman[105942]: 2026-02-01 08:57:33.829562448 +0000 UTC m=+0.183685489 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 1 03:57:33 localhost systemd[1]: tmp-crun.4yM5he.mount: Deactivated successfully. 
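The config_data=... blob podman attaches to these events is a Python-literal dict (single-quoted strings, True/False, nested lists), so it can be recovered from a journal line with ast.literal_eval rather than a bespoke parser. A sketch, assuming braces inside the dict are balanced, which holds for the records here:

    import ast

    def extract_config_data(event_line: str) -> dict:
        """Pull the config_data={...} label out of a podman event line."""
        start = event_line.index("config_data=") + len("config_data=")
        depth, end = 0, start
        for i, ch in enumerate(event_line[start:], start):
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:
                    end = i + 1
                    break
        return ast.literal_eval(event_line[start:end])

    # cfg = extract_config_data(line)
    # cfg['healthcheck']['test']  -> '/openstack/healthcheck'
    # cfg['net'], cfg['pid']      -> 'host', 'host' for ovn_metadata_agent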
Feb 1 03:57:33 localhost podman[105942]: 2026-02-01 08:57:33.874475717 +0000 UTC m=+0.228598768 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 1 03:57:33 localhost podman[105942]: unhealthy Feb 1 03:57:33 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:57:33 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
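Read together, the last few records show the full signature of a failing check: health_status=unhealthy in the podman event, the bare "unhealthy" on stdout, then the transient service's exit-code failure. A tallying sketch over a captured journal (the regex is mine and leans on the label order seen in these events, name= before health_status=):

    import re
    from collections import defaultdict

    EVENT = re.compile(
        r"container health_status .*?name=([\w-]+).*?health_status=(\w+)")

    def health_by_container(journal_text: str) -> dict:
        """Map container_name -> health_status values, in log order."""
        seen = defaultdict(list)
        for m in EVENT.finditer(journal_text):
            name, status = m.groups()
            seen[name].append(status)
        return dict(seen)

    # For this excerpt: {'nova_migration_target': ['healthy'],
    #                    'metrics_qdr': ['healthy'],
    #                    'ovn_metadata_agent': ['unhealthy'], ...}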
Feb 1 03:57:33 localhost podman[105943]: 2026-02-01 08:57:33.874133626 +0000 UTC m=+0.223597804 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, release=1766032510, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, container_name=ovn_controller, vcs-type=git) Feb 1 03:57:33 localhost podman[105943]: 2026-02-01 08:57:33.960942011 +0000 UTC m=+0.310406179 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, container_name=ovn_controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, 
com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4) Feb 1 03:57:33 localhost podman[105943]: unhealthy Feb 1 03:57:33 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:57:33 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:57:33 localhost podman[105941]: 2026-02-01 08:57:33.986593348 +0000 UTC m=+0.343160084 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, 
container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container) Feb 1 03:57:34 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully. Feb 1 03:57:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59291 DF PROTO=TCP SPT=54172 DPT=9101 SEQ=3557695151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE500790000000001030307) Feb 1 03:57:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5643 DF PROTO=TCP SPT=42714 DPT=9100 SEQ=2561254508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE507B80000000001030307) Feb 1 03:57:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:a1:06:ee MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=60280 SEQ=3413246832 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 1 03:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7. Feb 1 03:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe. Feb 1 03:57:43 localhost podman[106005]: 2026-02-01 08:57:43.719686868 +0000 UTC m=+0.076954812 container health_status 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, container_name=collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 1 03:57:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:57:43 localhost podman[106006]: 2026-02-01 08:57:43.768309671 +0000 UTC m=+0.124667218 container health_status b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, 
konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 1 03:57:43 localhost podman[106006]: 2026-02-01 08:57:43.782398614 +0000 UTC m=+0.138756131 container exec_died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 03:57:43 localhost podman[106005]: 2026-02-01 08:57:43.790758371 +0000 UTC m=+0.148026285 container exec_died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public) Feb 1 03:57:43 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Deactivated successfully. Feb 1 03:57:43 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Deactivated successfully. Feb 1 03:57:43 localhost systemd[1]: tmp-crun.clj5gP.mount: Deactivated successfully. 
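A passing check, by contrast, leaves the four-step trace visible for collectd and iscsid above: Started, health_status=healthy, exec_died, then "Deactivated successfully" for the transient unit. A pairing sketch that classifies each run by its terminal systemd message (assumes one record per line, which a real journal delivers even though this flattened excerpt does not):

    import re

    STARTED = re.compile(r"Started /usr/bin/podman healthcheck run ([0-9a-f]{64})")
    RESULT = re.compile(r"([0-9a-f]{64})\.service: "
                        r"(Deactivated successfully|Failed with result 'exit-code')")

    def pair_healthcheck_runs(lines):
        """Yield (container_id, 'ok' | 'failed') per completed healthcheck unit."""
        pending = set()
        for line in lines:
            m = STARTED.search(line)
            if m:
                pending.add(m.group(1))
                continue
            m = RESULT.search(line)
            if m and m.group(1) in pending:
                pending.discard(m.group(1))
                yield m.group(1), "ok" if "Deactivated" in m.group(2) else "failed"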
Feb 1 03:57:43 localhost podman[106037]: 2026-02-01 08:57:43.889247153 +0000 UTC m=+0.148546390 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 1 03:57:43 localhost podman[106037]: 2026-02-01 08:57:43.916314705 +0000 UTC m=+0.175613922 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 1 03:57:43 localhost podman[106037]: unhealthy Feb 1 03:57:43 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:57:43 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'. 
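nova_compute's configured test, '/openstack/healthcheck 5672', passes a TCP port to the shared healthcheck script (5672 is the conventional AMQP/RabbitMQ port; ovn_controller's variant above uses 6642, the OVN southbound DB). The script itself is not shown in the log; a probe in the same spirit, purely as an approximation of what such a check does, would be:

    import socket
    import sys

    def probe(port: int, host: str = "127.0.0.1", timeout: float = 5.0) -> int:
        """Return 0 if a TCP connect to host:port succeeds, 1 otherwise."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return 0
        except OSError:
            return 1

    if __name__ == "__main__":
        # `python probe.py 5672` mirrors the shape of '/openstack/healthcheck 5672'
        sys.exit(probe(int(sys.argv[1])))

An exit status of 1 here is exactly what systemd reports above as status=1/FAILURE.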
Feb 1 03:57:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59292 DF PROTO=TCP SPT=54172 DPT=9101 SEQ=3557695151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE521B90000000001030307) Feb 1 03:57:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1588 DF PROTO=TCP SPT=50126 DPT=9102 SEQ=2636278809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE53DB80000000001030307) Feb 1 03:57:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58109 DF PROTO=TCP SPT=43614 DPT=9100 SEQ=119634173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE53DB80000000001030307) Feb 1 03:57:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf. Feb 1 03:57:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:57:54 localhost podman[106066]: 2026-02-01 08:57:54.734358212 +0000 UTC m=+0.093684506 container health_status 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:57:54 localhost podman[106067]: Error: container 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 is not running Feb 1 03:57:54 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Main process exited, code=exited, status=125/n/a Feb 1 03:57:54 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Failed with result 'exit-code'. Feb 1 03:57:54 localhost podman[106066]: 2026-02-01 08:57:54.770927244 +0000 UTC m=+0.130253478 container exec_died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, container_name=logrotate_crond, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 1 03:57:54 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Deactivated successfully. 
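The status=125/n/a failure for the 96d81524... healthcheck unit is a different animal from the unhealthy checks: "Error: container ... is not running" means podman could not run the check at all (the container was mid-shutdown), and 125 is podman's own-error exit code by the docker/podman convention, not a verdict from the healthcheck script. A small decoder (the labels are mine):

    def describe_podman_exit(code: int) -> str:
        """Rough decoding of podman run/exec exit statuses."""
        if code == 125:
            return "podman itself failed (e.g. container not running)"
        if code == 126:
            return "contained command could not be invoked"
        if code == 127:
            return "contained command not found"
        return f"exit status {code} from the contained command"

    # describe_podman_exit(125) -> "podman itself failed (...)"
    # describe_podman_exit(1)   -> "exit status 1 from the contained command"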
Feb 1 03:57:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2913 DF PROTO=TCP SPT=39540 DPT=9882 SEQ=3259200298 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE546380000000001030307) Feb 1 03:57:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2914 DF PROTO=TCP SPT=39540 DPT=9882 SEQ=3259200298 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE54E380000000001030307) Feb 1 03:57:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 1 03:57:57 localhost recover_tripleo_nova_virtqemud[106097]: 61284 Feb 1 03:57:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 1 03:57:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 1 03:58:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31259 DF PROTO=TCP SPT=42194 DPT=9101 SEQ=960335357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE559FB0000000001030307) Feb 1 03:58:00 localhost podman[105795]: time="2026-02-01T08:58:00Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Feb 1 03:58:00 localhost systemd[1]: tmp-crun.8m4g24.mount: Deactivated successfully. Feb 1 03:58:00 localhost systemd[1]: libpod-96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.scope: Deactivated successfully. Feb 1 03:58:00 localhost systemd[1]: libpod-96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.scope: Consumed 6.553s CPU time. 
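The warning from podman[105795] documents the stop path: send the container's StopSignal (SIGTERM), wait out the grace period (42 seconds here), then escalate to SIGKILL, after which the libpod scope deactivates. The same terminate-then-kill pattern for an arbitrary child process, as a sketch:

    import subprocess

    def stop_with_grace(proc: subprocess.Popen, grace_seconds: float = 42.0) -> int:
        """SIGTERM the process, wait up to grace_seconds, then SIGKILL."""
        proc.terminate()                    # SIGTERM, the container's StopSignal
        try:
            return proc.wait(timeout=grace_seconds)
        except subprocess.TimeoutExpired:
            proc.kill()                     # SIGKILL, podman's fallback here
            return proc.wait()

With podman itself the equivalent knob is the stop timeout passed to `podman stop -t`.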
Feb 1 03:58:00 localhost podman[105795]: 2026-02-01 08:58:00.425341661 +0000 UTC m=+42.105864931 container died 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-type=git) Feb 1 03:58:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.timer: Deactivated successfully. Feb 1 03:58:00 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1. Feb 1 03:58:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Failed to open /run/systemd/transient/96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: No such file or directory Feb 1 03:58:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1-userdata-shm.mount: Deactivated successfully. 
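The "Failed to open /run/systemd/transient/<id>.service: No such file or directory" complaints are a benign ordering race: the healthcheck .service/.timer pair are transient units whose files live under /run/systemd/transient only while the container exists, and they had already been garbage-collected when these lookups ran. A check sketch following that naming (paths as seen in the log):

    from pathlib import Path

    TRANSIENT = Path("/run/systemd/transient")

    def healthcheck_units_present(container_id: str) -> dict:
        """Report whether the per-container transient unit files still exist."""
        return {suffix: (TRANSIENT / f"{container_id}{suffix}").exists()
                for suffix in (".service", ".timer")}

    # After the container above died, both values are False, which is
    # precisely why systemd logs "Failed to open ... No such file or directory".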
Feb 1 03:58:00 localhost podman[105795]: 2026-02-01 08:58:00.47420658 +0000 UTC m=+42.154729840 container cleanup 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vcs-type=git, release=1766032510, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64) Feb 1 03:58:00 localhost podman[105795]: ceilometer_agent_ipmi Feb 1 03:58:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.timer: Failed to open /run/systemd/transient/96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.timer: No such file or directory Feb 1 03:58:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Failed to open /run/systemd/transient/96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: No such file or directory Feb 1 03:58:00 localhost podman[106099]: 2026-02-01 08:58:00.505027806 +0000 UTC m=+0.072925649 container cleanup 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 1 03:58:00 localhost systemd[1]: libpod-conmon-96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.scope: Deactivated successfully. 
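For a single container the teardown reads as an ordered event stream: died at 08:58:00.425 (logged by the stopping podman process, 105795), then cleanup logged by 105795 and again by the follow-up invocations 106099 and 106115, then the conmon scope deactivates. A timeline-extraction sketch (the verb list is taken from the events in this excerpt; the timestamp regex is mine):

    import re

    VERBS = ("health_status", "exec_died", "died", "cleanup")

    def container_timeline(lines, container_id: str):
        """Yield (syslog timestamp, event verb) for lines mentioning the id."""
        stamp = re.compile(r"^(\w+ +\d+ [\d:]+)")
        for line in lines:
            if container_id not in line:
                continue
            for verb in VERBS:
                if f" container {verb} " in line:
                    m = stamp.match(line)
                    yield (m.group(1) if m else "?", verb)
                    break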
Feb 1 03:58:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.timer: Failed to open /run/systemd/transient/96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.timer: No such file or directory
Feb 1 03:58:00 localhost systemd[1]: 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: Failed to open /run/systemd/transient/96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1.service: No such file or directory
Feb 1 03:58:00 localhost podman[106115]: 2026-02-01 08:58:00.60449585 +0000 UTC m=+0.066152522 container cleanup 96d815240aaef07bb2870974d8772a95ab8b5480aa8f17308761d7796ea9a7d1 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '56f18c3ee04e8cd5761527c0820290d2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:58:00 localhost podman[106115]: ceilometer_agent_ipmi
Feb 1 03:58:00 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Feb 1 03:58:00 localhost systemd[1]: Stopped ceilometer_agent_ipmi container.
Feb 1 03:58:01 localhost systemd[1]: var-lib-containers-storage-overlay-51d00c3910eefc6cf51faff5d5baa51463b7218ae9021f1d456ee4b881969174-merged.mount: Deactivated successfully.
Feb 1 03:58:01 localhost python3.9[106219]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:01 localhost systemd[1]: Reloading.
Feb 1 03:58:01 localhost systemd-rc-local-generator[106242]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:58:01 localhost systemd-sysv-generator[106245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:58:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:58:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:58:02 localhost systemd[1]: Stopping collectd container...
Feb 1 03:58:02 localhost podman[106259]: 2026-02-01 08:58:02.135152772 +0000 UTC m=+0.099587889 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510)
Feb 1 03:58:02 localhost podman[106259]: 2026-02-01 08:58:02.561409785 +0000 UTC m=+0.525844902 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 1 03:58:02 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:58:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3347 DF PROTO=TCP SPT=38022 DPT=9105 SEQ=1306842807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE565B90000000001030307)
Feb 1 03:58:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:58:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:58:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:58:04 localhost systemd[1]: tmp-crun.zzYDdw.mount: Deactivated successfully.
Feb 1 03:58:04 localhost podman[106298]: 2026-02-01 08:58:04.251189051 +0000 UTC m=+0.094626605 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.13, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:58:04 localhost podman[106296]: 2026-02-01 08:58:04.282708929 +0000 UTC m=+0.135151130 container health_status 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:58:04 localhost podman[106297]: 2026-02-01 08:58:04.341677969 +0000 UTC m=+0.193411268 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 03:58:04 localhost podman[106297]: 2026-02-01 08:58:04.36032249 +0000 UTC m=+0.212055859 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 1 03:58:04 localhost podman[106297]: unhealthy
Feb 1 03:58:04 localhost podman[106298]: 2026-02-01 08:58:04.367595914 +0000 UTC m=+0.211033448 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 1 03:58:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:58:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:58:04 localhost podman[106298]: unhealthy
Feb 1 03:58:04 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:58:04 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:58:04 localhost podman[106296]: 2026-02-01 08:58:04.461442324 +0000 UTC m=+0.313884535 container exec_died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 1 03:58:04 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Deactivated successfully.
Feb 1 03:58:05 localhost systemd[1]: libpod-02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.scope: Deactivated successfully.
Feb 1 03:58:05 localhost systemd[1]: libpod-02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.scope: Consumed 2.266s CPU time.
Feb 1 03:58:05 localhost podman[106260]: 2026-02-01 08:58:05.722250973 +0000 UTC m=+3.681223022 container died 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Feb 1 03:58:05 localhost systemd[1]: tmp-crun.qFBX3I.mount: Deactivated successfully.
Feb 1 03:58:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.timer: Deactivated successfully.
Feb 1 03:58:05 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.
Feb 1 03:58:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Failed to open /run/systemd/transient/02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: No such file or directory
Feb 1 03:58:05 localhost podman[106260]: 2026-02-01 08:58:05.787769695 +0000 UTC m=+3.746741724 container cleanup 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 1 03:58:05 localhost podman[106260]: collectd
Feb 1 03:58:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.timer: Failed to open /run/systemd/transient/02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.timer: No such file or directory
Feb 1 03:58:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Failed to open /run/systemd/transient/02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: No such file or directory
Feb 1 03:58:05 localhost podman[106363]: 2026-02-01 08:58:05.835130589 +0000 UTC m=+0.102505958 container cleanup 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public)
Feb 1 03:58:05 localhost systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:58:05 localhost systemd[1]: libpod-conmon-02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.scope: Deactivated successfully.
Feb 1 03:58:05 localhost podman[106392]: error opening file `/run/crun/02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7/status`: No such file or directory
Feb 1 03:58:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.timer: Failed to open /run/systemd/transient/02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.timer: No such file or directory
Feb 1 03:58:05 localhost systemd[1]: 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: Failed to open /run/systemd/transient/02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7.service: No such file or directory
Feb 1 03:58:05 localhost podman[106380]: 2026-02-01 08:58:05.954735879 +0000 UTC m=+0.078371255 container cleanup 02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 1 03:58:05 localhost podman[106380]: collectd
Feb 1 03:58:05 localhost systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Feb 1 03:58:05 localhost systemd[1]: Stopped collectd container.
Feb 1 03:58:06 localhost systemd[1]: tmp-crun.S5Nfh0.mount: Deactivated successfully.
Feb 1 03:58:06 localhost systemd[1]: var-lib-containers-storage-overlay-21a54a15634d928a79fc839f0a57c8a326674339b247c00b2db3239d9aad6a92-merged.mount: Deactivated successfully.
Feb 1 03:58:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02310e3ab1f83877297cead532e690ea4df1944184a5223c59fef13b06322cb7-userdata-shm.mount: Deactivated successfully.
Feb 1 03:58:06 localhost python3.9[106486]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:a1:06:ee MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=60294 SEQ=3728137527 ACK=0 WINDOW=0 RES=0x00 RST URGP=0
Feb 1 03:58:07 localhost systemd[1]: Reloading.
Feb 1 03:58:07 localhost systemd-rc-local-generator[106514]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:58:07 localhost systemd-sysv-generator[106517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:58:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:58:08 localhost systemd[1]: Stopping iscsid container...
Feb 1 03:58:08 localhost systemd[1]: tmp-crun.PRAeQh.mount: Deactivated successfully.
Feb 1 03:58:08 localhost systemd[1]: libpod-b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.scope: Deactivated successfully.
Feb 1 03:58:08 localhost systemd[1]: libpod-b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.scope: Consumed 1.127s CPU time.
Feb 1 03:58:08 localhost podman[106528]: 2026-02-01 08:58:08.269481779 +0000 UTC m=+0.077832750 container died b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, architecture=x86_64, com.redhat.component=openstack-iscsid-container, container_name=iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 1 03:58:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.timer: Deactivated successfully.
Feb 1 03:58:08 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.
Feb 1 03:58:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Failed to open /run/systemd/transient/b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: No such file or directory
Feb 1 03:58:08 localhost systemd[1]: tmp-crun.2YVddh.mount: Deactivated successfully.
Feb 1 03:58:08 localhost podman[106528]: 2026-02-01 08:58:08.318281456 +0000 UTC m=+0.126632387 container cleanup b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 1 03:58:08 localhost podman[106528]: iscsid
Feb 1 03:58:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.timer: Failed to open /run/systemd/transient/b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.timer: No such file or directory
Feb 1 03:58:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Failed to open /run/systemd/transient/b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: No such file or directory
Feb 1 03:58:08 localhost podman[106542]: 2026-02-01 08:58:08.335121553 +0000 UTC m=+0.060213709 container cleanup b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git)
Feb 1 03:58:08 localhost systemd[1]: libpod-conmon-b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.scope: Deactivated successfully.
Feb 1 03:58:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.timer: Failed to open /run/systemd/transient/b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.timer: No such file or directory
Feb 1 03:58:08 localhost systemd[1]: b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: Failed to open /run/systemd/transient/b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe.service: No such file or directory
Feb 1 03:58:08 localhost podman[106555]: 2026-02-01 08:58:08.424432064 +0000 UTC m=+0.063341595 container cleanup b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:58:08 localhost podman[106555]: iscsid
Feb 1 03:58:08 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Feb 1 03:58:08 localhost systemd[1]: Stopped iscsid container.
Feb 1 03:58:09 localhost python3.9[106658]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1589 DF PROTO=TCP SPT=50126 DPT=9102 SEQ=2636278809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE57DB80000000001030307)
Feb 1 03:58:09 localhost systemd[1]: var-lib-containers-storage-overlay-e75586bb4ab2bd9f0af5c6046e55c6950ec71393a8ae3185df7c4d9365a6d82a-merged.mount: Deactivated successfully.
Feb 1 03:58:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b06b2f5f3e7d1e2648b903bb3e319d6443238b032748f29f9df679dbeda70cfe-userdata-shm.mount: Deactivated successfully.
Feb 1 03:58:10 localhost systemd[1]: Reloading.
Feb 1 03:58:10 localhost systemd-rc-local-generator[106684]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:58:10 localhost systemd-sysv-generator[106690]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:58:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:58:10 localhost systemd[1]: Stopping logrotate_crond container...
Feb 1 03:58:10 localhost systemd[1]: libpod-93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.scope: Deactivated successfully.
Feb 1 03:58:10 localhost systemd[1]: libpod-93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.scope: Consumed 1.010s CPU time.
Feb 1 03:58:10 localhost podman[106699]: 2026-02-01 08:58:10.724319838 +0000 UTC m=+0.076760697 container died 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 1 03:58:10 localhost systemd[1]: tmp-crun.jBkHni.mount: Deactivated successfully.
Feb 1 03:58:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.timer: Deactivated successfully.
Feb 1 03:58:10 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.
Feb 1 03:58:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Failed to open /run/systemd/transient/93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: No such file or directory
Feb 1 03:58:10 localhost podman[106699]: 2026-02-01 08:58:10.791620413 +0000 UTC m=+0.144061192 container cleanup 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 1 03:58:10 localhost podman[106699]: logrotate_crond
Feb 1 03:58:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.timer: Failed to open /run/systemd/transient/93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.timer: No such file or directory
Feb 1 03:58:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Failed to open /run/systemd/transient/93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: No such file or directory
Feb 1 03:58:10 localhost podman[106712]: 2026-02-01 08:58:10.822890083 +0000 UTC m=+0.089298492 container cleanup 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5)
Feb 1 03:58:10 localhost systemd[1]: libpod-conmon-93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.scope: Deactivated successfully.
Feb 1 03:58:10 localhost podman[106740]: error opening file `/run/crun/93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf/status`: No such file or directory
Feb 1 03:58:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.timer: Failed to open /run/systemd/transient/93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.timer: No such file or directory
Feb 1 03:58:10 localhost systemd[1]: 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: Failed to open /run/systemd/transient/93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf.service: No such file or directory
Feb 1 03:58:10 localhost podman[106729]: 2026-02-01 08:58:10.926975288 +0000 UTC m=+0.070623529 container cleanup 93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, container_name=logrotate_crond)
Feb 1 03:58:10 localhost podman[106729]: logrotate_crond
Feb 1 03:58:10 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Feb 1 03:58:10 localhost systemd[1]: Stopped logrotate_crond container.
Feb 1 03:58:11 localhost python3.9[106833]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:11 localhost systemd[1]: var-lib-containers-storage-overlay-d93858986242c6937d720b37f9e41284a4bd20ca36f5e363c143d874dccc14fb-merged.mount: Deactivated successfully.
Feb 1 03:58:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93b98f49cf006ead9573134e31df269d9748eb622f1e19790d799145bc1f7fdf-userdata-shm.mount: Deactivated successfully.
Feb 1 03:58:11 localhost systemd[1]: Reloading.
Feb 1 03:58:11 localhost systemd-rc-local-generator[106854]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:58:11 localhost systemd-sysv-generator[106862]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:58:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:58:12 localhost systemd[1]: Stopping metrics_qdr container...
Feb 1 03:58:12 localhost kernel: qdrouterd[53994]: segfault at 0 ip 00007fdcd456f7cb sp 00007ffeba1f59a0 error 4 in libc.so.6[7fdcd450c000+175000]
Feb 1 03:58:12 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Feb 1 03:58:12 localhost systemd[1]: Created slice Slice /system/systemd-coredump.
Feb 1 03:58:12 localhost systemd[1]: Started Process Core Dump (PID 106887/UID 0).
Feb 1 03:58:12 localhost systemd-coredump[106888]: Resource limits disable core dumping for process 53994 (qdrouterd).
Feb 1 03:58:12 localhost systemd-coredump[106888]: Process 53994 (qdrouterd) of user 42465 dumped core.
Feb 1 03:58:12 localhost systemd[1]: systemd-coredump@0-106887-0.service: Deactivated successfully.
Feb 1 03:58:12 localhost podman[106874]: 2026-02-01 08:58:12.362464759 +0000 UTC m=+0.238159231 container died 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:58:12 localhost systemd[1]: libpod-5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.scope: Deactivated successfully.
Feb 1 03:58:12 localhost systemd[1]: libpod-5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.scope: Consumed 28.597s CPU time.
Feb 1 03:58:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.timer: Deactivated successfully.
Feb 1 03:58:12 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.
Feb 1 03:58:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Failed to open /run/systemd/transient/5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: No such file or directory
Feb 1 03:58:12 localhost systemd[1]: tmp-crun.sMH6HL.mount: Deactivated successfully.
Feb 1 03:58:12 localhost podman[106874]: 2026-02-01 08:58:12.418708746 +0000 UTC m=+0.294403198 container cleanup 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:58:12 localhost podman[106874]: metrics_qdr
Feb 1 03:58:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.timer: Failed to open /run/systemd/transient/5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.timer: No such file or directory
Feb 1 03:58:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Failed to open /run/systemd/transient/5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: No such file or directory
Feb 1 03:58:12 localhost podman[106892]: 2026-02-01 08:58:12.441488465 +0000 UTC m=+0.066141312 container cleanup 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.13, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 1 03:58:12 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Feb 1 03:58:12 localhost systemd[1]: libpod-conmon-5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.scope: Deactivated successfully.
Feb 1 03:58:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.timer: Failed to open /run/systemd/transient/5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.timer: No such file or directory
Feb 1 03:58:12 localhost systemd[1]: 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: Failed to open /run/systemd/transient/5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7.service: No such file or directory
Feb 1 03:58:12 localhost podman[106907]: 2026-02-01 08:58:12.529969511 +0000 UTC m=+0.059612252 container cleanup 5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '60dc9caaeb1b9ec4a6ba094f0fd24dbd'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 03:58:12 localhost podman[106907]: metrics_qdr
Feb 1 03:58:12 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Feb 1 03:58:12 localhost systemd[1]: Stopped metrics_qdr container.
Feb 1 03:58:12 localhost systemd[1]: var-lib-containers-storage-overlay-38608c39b920110534238513fb52c11220d545a8d3383f5c9cf411254a119b53-merged.mount: Deactivated successfully.
Feb 1 03:58:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5cdaa48ba3e73b6ac0af9c3671bede60feaa42c18702808ecfd496a8083b1fc7-userdata-shm.mount: Deactivated successfully.
Feb 1 03:58:13 localhost python3.9[107010]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:a1:06:ee MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=60294 SEQ=3728137527 ACK=0 WINDOW=0 RES=0x00 RST URGP=0
Feb 1 03:58:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:58:14 localhost podman[107104]: 2026-02-01 08:58:14.104068136 +0000 UTC m=+0.090883711 container health_status 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com)
Feb 1 03:58:14 localhost podman[107104]: 2026-02-01 08:58:14.125851934 +0000 UTC m=+0.112667549 container exec_died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, architecture=x86_64)
Feb 1 03:58:14 localhost podman[107104]: unhealthy
Feb 1 03:58:14 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:58:14 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'.
Feb 1 03:58:14 localhost python3.9[107103]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:15 localhost python3.9[107218]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31263 DF PROTO=TCP SPT=42194 DPT=9101 SEQ=960335357 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE595B80000000001030307)
Feb 1 03:58:15 localhost python3.9[107311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 03:58:16 localhost systemd[1]: Reloading.
Feb 1 03:58:16 localhost systemd-rc-local-generator[107339]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 03:58:16 localhost systemd-sysv-generator[107342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 03:58:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 03:58:17 localhost systemd[1]: Stopping nova_compute container...
Feb 1 03:58:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16349 DF PROTO=TCP SPT=41260 DPT=9100 SEQ=769599633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5B1B80000000001030307)
Feb 1 03:58:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33209 DF PROTO=TCP SPT=55834 DPT=9102 SEQ=1754421042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5B3B90000000001030307)
Feb 1 03:58:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12110 DF PROTO=TCP SPT=54112 DPT=9882 SEQ=2417246438 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5BB790000000001030307)
Feb 1 03:58:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12111 DF PROTO=TCP SPT=54112 DPT=9882 SEQ=2417246438 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5C3790000000001030307)
Feb 1 03:58:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19453 DF PROTO=TCP SPT=54702 DPT=9101 SEQ=2362081606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5CF460000000001030307)
Feb 1 03:58:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.
Feb 1 03:58:32 localhost podman[107441]: 2026-02-01 08:58:32.713929059 +0000 UTC m=+0.075195919 container health_status d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 03:58:33 localhost podman[107441]: 2026-02-01 08:58:33.106556011 +0000 UTC m=+0.467822931 container exec_died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 1 03:58:33 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Deactivated successfully.
Feb 1 03:58:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19455 DF PROTO=TCP SPT=54702 DPT=9101 SEQ=2362081606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5DB380000000001030307)
Feb 1 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:58:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:58:34 localhost systemd[1]: tmp-crun.KY7er9.mount: Deactivated successfully.
Feb 1 03:58:34 localhost podman[107464]: 2026-02-01 08:58:34.735918701 +0000 UTC m=+0.092713136 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z)
Feb 1 03:58:34 localhost podman[107465]: 2026-02-01 08:58:34.784191644 +0000 UTC m=+0.139095991 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 1 03:58:34 localhost podman[107465]: 2026-02-01 08:58:34.80039309 +0000 UTC m=+0.155297447 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 1 03:58:34 localhost podman[107464]: 2026-02-01 08:58:34.801025691 +0000 UTC m=+0.157820136 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 03:58:34 localhost podman[107464]: unhealthy
Feb 1 03:58:34 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:58:34 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:58:34 localhost podman[107465]: unhealthy
Feb 1 03:58:34 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:58:34 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:58:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19456 DF PROTO=TCP SPT=54702 DPT=9101 SEQ=2362081606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5EAF90000000001030307)
Feb 1 03:58:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33210 DF PROTO=TCP SPT=55834 DPT=9102 SEQ=1754421042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE5F3B80000000001030307)
Feb 1 03:58:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15354 DF PROTO=TCP SPT=36480 DPT=9100 SEQ=3458426271 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE606F80000000001030307)
Feb 1 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.
Feb 1 03:58:44 localhost podman[107504]: Error: container 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 is not running
Feb 1 03:58:44 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Main process exited, code=exited, status=125/n/a
Feb 1 03:58:44 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed with result 'exit-code'.
Feb 1 03:58:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19457 DF PROTO=TCP SPT=54702 DPT=9101 SEQ=2362081606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE60BB80000000001030307) Feb 1 03:58:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15355 DF PROTO=TCP SPT=36480 DPT=9100 SEQ=3458426271 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE627B90000000001030307) Feb 1 03:58:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13740 DF PROTO=TCP SPT=56724 DPT=9102 SEQ=3525202717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE629B80000000001030307) Feb 1 03:58:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50919 DF PROTO=TCP SPT=50918 DPT=9882 SEQ=3265411094 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE630780000000001030307) Feb 1 03:58:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50920 DF PROTO=TCP SPT=50918 DPT=9882 SEQ=3265411094 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE638790000000001030307) Feb 1 03:58:59 localhost podman[107351]: time="2026-02-01T08:58:59Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Feb 1 03:58:59 localhost systemd[1]: tmp-crun.h58cEK.mount: Deactivated successfully. Feb 1 03:58:59 localhost systemd[1]: session-c11.scope: Deactivated successfully. Feb 1 03:58:59 localhost systemd[1]: libpod-5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.scope: Deactivated successfully. Feb 1 03:58:59 localhost systemd[1]: libpod-5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.scope: Consumed 37.553s CPU time. 
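The nova_compute stop above follows podman's standard escalation: the container's StopSignal (SIGTERM) is sent, podman waits out the stop timeout (42 seconds here, per the warning), then resorts to SIGKILL. An illustrative equivalent, assuming the same 42-second grace period:

import subprocess

# Mirrors the escalation logged above: podman sends the StopSignal
# (SIGTERM), waits the given timeout, then falls back to SIGKILL.
# `podman stop --time N` performs that escalation internally.
def stop_container(name: str, grace_seconds: int = 42) -> None:
    subprocess.run(
        ["podman", "stop", "--time", str(grace_seconds), name],
        check=True,
    )

stop_container("nova_compute")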
Feb 1 03:58:59 localhost podman[107351]: 2026-02-01 08:58:59.331528799 +0000 UTC m=+42.103830907 container died 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510) Feb 1 03:58:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.timer: Deactivated successfully. 
Feb 1 03:58:59 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032. Feb 1 03:58:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed to open /run/systemd/transient/5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: No such file or directory Feb 1 03:58:59 localhost systemd[1]: var-lib-containers-storage-overlay-44357a368e781a4d23f5b66e408c1324409c46bcb32da06b3ba5ae9fe4a403b2-merged.mount: Deactivated successfully. Feb 1 03:58:59 localhost podman[107351]: 2026-02-01 08:58:59.453954297 +0000 UTC m=+42.226256375 container cleanup 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, release=1766032510, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 03:58:59 localhost podman[107351]: nova_compute Feb 1 03:58:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.timer: Failed to open /run/systemd/transient/5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.timer: No such file or directory Feb 1 03:58:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed to open /run/systemd/transient/5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: No such file or directory Feb 1 03:58:59 localhost podman[107516]: 2026-02-01 08:58:59.463196751 +0000 UTC m=+0.128965090 container cleanup 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step5, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1) Feb 1 03:58:59 localhost systemd[1]: libpod-conmon-5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.scope: Deactivated successfully. Feb 1 03:58:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.timer: Failed to open /run/systemd/transient/5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.timer: No such file or directory Feb 1 03:58:59 localhost systemd[1]: 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: Failed to open /run/systemd/transient/5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032.service: No such file or directory Feb 1 03:58:59 localhost podman[107532]: 2026-02-01 08:58:59.556881547 +0000 UTC m=+0.062463409 container cleanup 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 1 03:58:59 localhost podman[107532]: nova_compute Feb 1 03:58:59 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Feb 1 03:58:59 localhost systemd[1]: Stopped nova_compute container. Feb 1 03:58:59 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.124s CPU time, no IO. Feb 1 03:59:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11524 DF PROTO=TCP SPT=40094 DPT=9101 SEQ=3486358524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6445C0000000001030307) Feb 1 03:59:00 localhost python3.9[107634]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:59:00 localhost systemd[1]: Reloading. Feb 1 03:59:00 localhost systemd-sysv-generator[107660]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:59:00 localhost systemd-rc-local-generator[107657]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:59:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:59:00 localhost systemd[1]: Stopping nova_migration_target container... Feb 1 03:59:00 localhost systemd[1]: libpod-d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.scope: Deactivated successfully. Feb 1 03:59:00 localhost systemd[1]: libpod-d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.scope: Consumed 33.253s CPU time. 
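The ansible-ansible.builtin.systemd_service entry above (enabled=False, state=stopped) is what drives each "Stopping ... container" sequence in this section. A rough command-line equivalent of that module call, using the unit name from the log entry — illustrative only, not what the module literally executes:

import subprocess

# Approximates ansible.builtin.systemd_service with
# enabled=False, state=stopped for the unit logged above.
unit = "tripleo_nova_migration_target.service"
subprocess.run(["systemctl", "disable", unit], check=True)
subprocess.run(["systemctl", "stop", unit], check=True)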
Feb 1 03:59:00 localhost podman[107674]: 2026-02-01 08:59:00.870830737 +0000 UTC m=+0.070937159 container died d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 1 03:59:00 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.timer: Deactivated successfully. Feb 1 03:59:00 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f. Feb 1 03:59:00 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Failed to open /run/systemd/transient/d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: No such file or directory Feb 1 03:59:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f-userdata-shm.mount: Deactivated successfully. 
Feb 1 03:59:00 localhost systemd[1]: var-lib-containers-storage-overlay-a3725e54853595614926889eb99d8b5ab03502eb966cd4ee026013d34265250f-merged.mount: Deactivated successfully. Feb 1 03:59:00 localhost podman[107674]: 2026-02-01 08:59:00.919638815 +0000 UTC m=+0.119745187 container cleanup d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git) Feb 1 03:59:00 localhost podman[107674]: nova_migration_target Feb 1 03:59:00 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.timer: Failed to open /run/systemd/transient/d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.timer: No such file or directory Feb 1 03:59:00 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Failed to open /run/systemd/transient/d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: No such file or directory Feb 1 03:59:00 localhost podman[107685]: 2026-02-01 08:59:00.946229371 +0000 UTC m=+0.061373154 container cleanup 
d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, container_name=nova_migration_target, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container) Feb 1 03:59:00 localhost systemd[1]: libpod-conmon-d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.scope: Deactivated successfully. 
Feb 1 03:59:01 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.timer: Failed to open /run/systemd/transient/d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.timer: No such file or directory Feb 1 03:59:01 localhost systemd[1]: d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: Failed to open /run/systemd/transient/d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f.service: No such file or directory Feb 1 03:59:01 localhost podman[107700]: nova_migration_target Feb 1 03:59:01 localhost podman[107700]: 2026-02-01 08:59:01.04069336 +0000 UTC m=+0.058280640 container cleanup d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_migration_target) Feb 1 03:59:01 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Feb 1 03:59:01 localhost systemd[1]: Stopped nova_migration_target container. 
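The repeated "Failed to open /run/systemd/transient/<id>.service: No such file or directory" messages around each container stop appear to be a benign teardown race: the per-container healthcheck service and timer are transient systemd units, and by the time the cleanup handlers reference them systemd has already removed their fragments under /run/systemd/transient. A small check (illustrative; the path is taken from the messages above) makes the race visible:

from pathlib import Path

# After the stop sequence above, both fragments should already be gone,
# which is exactly why the logged open() attempts fail with ENOENT.
def transient_unit_exists(unit_name: str) -> bool:
    return (Path("/run/systemd/transient") / unit_name).is_file()

cid = "d444d543fd03387ec989b7eff672c1ba981ade2ded3abbfc55e8385c5544cc1f"
for suffix in (".service", ".timer"):
    print(cid + suffix, transient_unit_exists(cid + suffix))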
Feb 1 03:59:01 localhost python3.9[107803]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 03:59:01 localhost systemd[1]: Reloading. Feb 1 03:59:01 localhost systemd-sysv-generator[107834]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 03:59:01 localhost systemd-rc-local-generator[107831]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 03:59:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 03:59:02 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Feb 1 03:59:02 localhost systemd[1]: libpod-ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a.scope: Deactivated successfully. Feb 1 03:59:02 localhost podman[107844]: 2026-02-01 08:59:02.310082443 +0000 UTC m=+0.089086255 container died ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:59:02 localhost podman[107844]: 2026-02-01 08:59:02.359148079 +0000 UTC m=+0.138151861 container cleanup ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}) Feb 1 03:59:02 localhost podman[107844]: nova_virtlogd_wrapper Feb 1 03:59:02 localhost podman[107857]: 2026-02-01 08:59:02.40120566 +0000 UTC m=+0.079163620 container cleanup ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtlogd_wrapper, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, io.openshift.expose-services=, release=1766032510) Feb 1 03:59:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11526 DF PROTO=TCP SPT=40094 DPT=9101 SEQ=3486358524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE650780000000001030307) Feb 1 03:59:03 localhost systemd[1]: tmp-crun.tYaPNn.mount: Deactivated successfully. Feb 1 03:59:03 localhost systemd[1]: var-lib-containers-storage-overlay-4f14b52d41436b01b5b14834ae0d813a4e37a5cf231afa99e1b2758c7e4772f1-merged.mount: Deactivated successfully. Feb 1 03:59:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a-userdata-shm.mount: Deactivated successfully. Feb 1 03:59:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 03:59:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 03:59:04 localhost systemd[1]: tmp-crun.MRvJr8.mount: Deactivated successfully. 
Feb 1 03:59:04 localhost podman[107874]: 2026-02-01 08:59:04.995395027 +0000 UTC m=+0.105261283 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 1 03:59:05 localhost podman[107874]: 2026-02-01 08:59:05.017027711 +0000 UTC m=+0.126893967 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510) Feb 1 03:59:05 localhost podman[107874]: unhealthy Feb 1 03:59:05 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:59:05 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
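Each health_status=unhealthy event above is immediately followed by an exec_died event for the same container, which matches how the probe runs: the configured test ('/openstack/healthcheck' in the config_data above) is executed inside the container as an exec session. Running the same probe by hand — an illustrative equivalent, not the exact transient-service invocation:

import subprocess

# Manually invoke the healthcheck test from the ovn_metadata_agent
# config_data above; a nonzero exit corresponds to the logged
# "unhealthy" status.
r = subprocess.run(
    ["podman", "exec", "ovn_metadata_agent", "/openstack/healthcheck"]
)
print("healthy" if r.returncode == 0 else f"unhealthy (rc={r.returncode})")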
Feb 1 03:59:05 localhost podman[107875]: 2026-02-01 08:59:05.086820363 +0000 UTC m=+0.191920892 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 03:59:05 localhost podman[107875]: 2026-02-01 08:59:05.124107677 +0000 UTC m=+0.229208196 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc.) Feb 1 03:59:05 localhost podman[107875]: unhealthy Feb 1 03:59:05 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 03:59:05 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 03:59:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11527 DF PROTO=TCP SPT=40094 DPT=9101 SEQ=3486358524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE660380000000001030307) Feb 1 03:59:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50922 DF PROTO=TCP SPT=50918 DPT=9882 SEQ=3265411094 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE667B80000000001030307) Feb 1 03:59:09 localhost systemd[1]: Stopping User Manager for UID 0... Feb 1 03:59:09 localhost systemd[83465]: Activating special unit Exit the Session... Feb 1 03:59:09 localhost systemd[83465]: Removed slice User Background Tasks Slice. Feb 1 03:59:09 localhost systemd[83465]: Stopped target Main User Target. Feb 1 03:59:09 localhost systemd[83465]: Stopped target Basic System. Feb 1 03:59:09 localhost systemd[83465]: Stopped target Paths. Feb 1 03:59:09 localhost systemd[83465]: Stopped target Sockets. Feb 1 03:59:09 localhost systemd[83465]: Stopped target Timers. Feb 1 03:59:09 localhost systemd[83465]: Stopped Daily Cleanup of User's Temporary Directories. Feb 1 03:59:09 localhost systemd[83465]: Closed D-Bus User Message Bus Socket. Feb 1 03:59:09 localhost systemd[83465]: Stopped Create User's Volatile Files and Directories. Feb 1 03:59:09 localhost systemd[83465]: Removed slice User Application Slice. Feb 1 03:59:09 localhost systemd[83465]: Reached target Shutdown. Feb 1 03:59:09 localhost systemd[83465]: Finished Exit the Session. Feb 1 03:59:09 localhost systemd[83465]: Reached target Exit the Session. Feb 1 03:59:09 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 1 03:59:09 localhost systemd[1]: Stopped User Manager for UID 0. Feb 1 03:59:09 localhost systemd[1]: user@0.service: Consumed 4.721s CPU time, no IO. Feb 1 03:59:09 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 1 03:59:09 localhost systemd[1]: run-user-0.mount: Deactivated successfully. 
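The recurring kernel "DROPPING:" entries throughout this section are firewall LOG-target output (the "DROPPING: " prefix is the rule's log prefix); here they show 192.168.122.10 repeatedly retrying TCP SYNs to ports 9100-9102 and 9882 on 192.168.122.106 and being dropped on br-ex. A small parser for the key=value fields, written against the exact format logged above:

import re

# Extract key=value fields from a kernel firewall LOG entry.
# Flag-style tokens with no value (DF, SYN, "OUT=") are skipped.
FIELD = re.compile(r"\b([A-Z]+)=(\S+)")

def parse_drop(line: str) -> dict:
    return dict(FIELD.findall(line))

sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
          "SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TTL=62 "
          "PROTO=TCP SPT=57440 DPT=9100")
fields = parse_drop(sample)
print(fields["SRC"], "->", f'{fields["DST"]}:{fields["DPT"]}', fields["PROTO"])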
Feb 1 03:59:09 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 1 03:59:09 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 1 03:59:09 localhost systemd[1]: Removed slice User Slice of UID 0.
Feb 1 03:59:09 localhost systemd[1]: user-0.slice: Consumed 5.741s CPU time.
Feb 1 03:59:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2161 DF PROTO=TCP SPT=57440 DPT=9100 SEQ=4069947270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE67C380000000001030307)
Feb 1 03:59:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11528 DF PROTO=TCP SPT=40094 DPT=9101 SEQ=3486358524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE67FB80000000001030307)
Feb 1 03:59:18 localhost sshd[107915]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:59:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2162 DF PROTO=TCP SPT=57440 DPT=9100 SEQ=4069947270 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE69BB80000000001030307)
Feb 1 03:59:22 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 03:59:22 localhost recover_tripleo_nova_virtqemud[107948]: 61284
Feb 1 03:59:22 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 03:59:22 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 03:59:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51791 DF PROTO=TCP SPT=47272 DPT=9102 SEQ=4133761156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE69DB80000000001030307)
Feb 1 03:59:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18257 DF PROTO=TCP SPT=54046 DPT=9882 SEQ=420790926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6A5B80000000001030307)
Feb 1 03:59:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18258 DF PROTO=TCP SPT=54046 DPT=9882 SEQ=420790926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6ADB80000000001030307)
Feb 1 03:59:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64948 DF PROTO=TCP SPT=57146 DPT=9101 SEQ=2746914536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6B98A0000000001030307)
Feb 1 03:59:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64950 DF PROTO=TCP SPT=57146 DPT=9101 SEQ=2746914536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6C5790000000001030307)
Feb 1 03:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 03:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 03:59:35 localhost podman[107997]: 2026-02-01 08:59:35.480026462 +0000 UTC m=+0.082001598 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64)
Feb 1 03:59:35 localhost podman[107996]: 2026-02-01 08:59:35.536154584 +0000 UTC m=+0.138210342 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 1 03:59:35 localhost podman[107997]: 2026-02-01 08:59:35.552123015 +0000 UTC m=+0.154098151 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vcs-type=git, version=17.1.13, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 1 03:59:35 localhost podman[107997]: unhealthy
Feb 1 03:59:35 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:59:35 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 03:59:35 localhost podman[107996]: 2026-02-01 08:59:35.580567178 +0000 UTC m=+0.182622966 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 1 03:59:35 localhost podman[107996]: unhealthy
Feb 1 03:59:35 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 03:59:35 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 03:59:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64951 DF PROTO=TCP SPT=57146 DPT=9101 SEQ=2746914536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6D5380000000001030307)
Feb 1 03:59:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18260 DF PROTO=TCP SPT=54046 DPT=9882 SEQ=420790926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6DDB90000000001030307)
Feb 1 03:59:40 localhost sshd[108034]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 03:59:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65173 DF PROTO=TCP SPT=51320 DPT=9100 SEQ=1682436276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6F1390000000001030307)
Feb 1 03:59:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64952 DF PROTO=TCP SPT=57146 DPT=9101 SEQ=2746914536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE6F5B80000000001030307)
Feb 1 03:59:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65174 DF PROTO=TCP SPT=51320 DPT=9100 SEQ=1682436276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE711B90000000001030307)
Feb 1 03:59:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44071 DF PROTO=TCP SPT=46358 DPT=9102 SEQ=2394670659 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE713B80000000001030307)
Feb 1 03:59:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13194 DF PROTO=TCP SPT=43034 DPT=9882 SEQ=429964610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE71AF80000000001030307)
Feb 1 03:59:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13195 DF PROTO=TCP SPT=43034 DPT=9882 SEQ=429964610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE722F80000000001030307)
Feb 1 04:00:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52467 DF PROTO=TCP SPT=35510 DPT=9101 SEQ=3743522462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE72EBA0000000001030307)
Feb 1 04:00:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52469 DF PROTO=TCP SPT=35510 DPT=9101 SEQ=3743522462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE73AB90000000001030307)
Feb 1 04:00:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 04:00:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.
Feb 1 04:00:05 localhost systemd[1]: tmp-crun.zFkB2j.mount: Deactivated successfully.
Feb 1 04:00:05 localhost podman[108036]: 2026-02-01 09:00:05.749336763 +0000 UTC m=+0.103296582 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 1 04:00:05 localhost podman[108037]: 2026-02-01 09:00:05.787907887 +0000 UTC m=+0.134758027 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 1 04:00:05 localhost podman[108036]: 2026-02-01 09:00:05.79453853 +0000 UTC m=+0.148498319 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:00:05 localhost podman[108036]: unhealthy
Feb 1 04:00:05 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 04:00:05 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'.
Feb 1 04:00:05 localhost podman[108037]: 2026-02-01 09:00:05.833562768 +0000 UTC m=+0.180412918 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510)
Feb 1 04:00:05 localhost podman[108037]: unhealthy
Feb 1 04:00:05 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 04:00:05 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'.
Feb 1 04:00:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52470 DF PROTO=TCP SPT=35510 DPT=9101 SEQ=3743522462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE74A780000000001030307)
Feb 1 04:00:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65175 DF PROTO=TCP SPT=51320 DPT=9100 SEQ=1682436276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE751B80000000001030307)
Feb 1 04:00:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19757 DF PROTO=TCP SPT=37414 DPT=9100 SEQ=913834871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE766780000000001030307)
Feb 1 04:00:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52471 DF PROTO=TCP SPT=35510 DPT=9101 SEQ=3743522462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE76BB80000000001030307)
Feb 1 04:00:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19758 DF PROTO=TCP SPT=37414 DPT=9100 SEQ=913834871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE787B80000000001030307)
Feb 1 04:00:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55781 DF PROTO=TCP SPT=38124 DPT=9102 SEQ=2334530589 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE787B80000000001030307)
Feb 1 04:00:24 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 1 04:00:24 localhost recover_tripleo_nova_virtqemud[108093]: 61284
Feb 1 04:00:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 1 04:00:24 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 1 04:00:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37786 DF PROTO=TCP SPT=49534 DPT=9882 SEQ=3262502607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE790380000000001030307)
Feb 1 04:00:26 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Feb 1 04:00:26 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 60509 (conmon) with signal SIGKILL.
Feb 1 04:00:26 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Feb 1 04:00:26 localhost systemd[1]: libpod-conmon-ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a.scope: Deactivated successfully.
Feb 1 04:00:26 localhost podman[108168]: error opening file `/run/crun/ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a/status`: No such file or directory
Feb 1 04:00:26 localhost podman[108155]: 2026-02-01 09:00:26.472920657 +0000 UTC m=+0.066132801 container cleanup ad7ecee7b1ae221515771edebf213b3820a6ee71686d78874db342d71301699a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, architecture=x86_64)
Feb 1 04:00:26 localhost podman[108155]: nova_virtlogd_wrapper
Feb 1 04:00:26 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Feb 1 04:00:26 localhost systemd[1]: Stopped nova_virtlogd_wrapper container.
Feb 1 04:00:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37787 DF PROTO=TCP SPT=49534 DPT=9882 SEQ=3262502607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE798380000000001030307)
Feb 1 04:00:27 localhost python3.9[108261]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:00:27 localhost systemd[1]: Reloading.
Feb 1 04:00:27 localhost systemd-rc-local-generator[108285]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:00:27 localhost systemd-sysv-generator[108288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:00:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:00:27 localhost systemd[1]: Stopping nova_virtnodedevd container...
Feb 1 04:00:27 localhost systemd[1]: libpod-8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e.scope: Deactivated successfully.
Feb 1 04:00:27 localhost systemd[1]: libpod-8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e.scope: Consumed 1.501s CPU time.
Feb 1 04:00:27 localhost podman[108302]: 2026-02-01 09:00:27.728227918 +0000 UTC m=+0.065867814 container died 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, container_name=nova_virtnodedevd, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 1 04:00:27 localhost podman[108302]: 2026-02-01 09:00:27.768826913 +0000 UTC m=+0.106466789 container cleanup 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtnodedevd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 1 04:00:27 localhost podman[108302]: nova_virtnodedevd
Feb 1 04:00:27 localhost podman[108316]: 2026-02-01 09:00:27.805663844 +0000 UTC m=+0.060497628 container cleanup 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, container_name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container)
Feb 1 04:00:27 localhost systemd[1]: libpod-conmon-8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e.scope: Deactivated successfully.
Feb 1 04:00:27 localhost podman[108346]: error opening file `/run/crun/8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e/status`: No such file or directory
Feb 1 04:00:27 localhost podman[108334]: 2026-02-01 09:00:27.881326326 +0000 UTC m=+0.043815445 container cleanup 8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 04:00:27 localhost podman[108334]: nova_virtnodedevd
Feb 1 04:00:27 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Feb 1 04:00:27 localhost systemd[1]: Stopped nova_virtnodedevd container.
Feb 1 04:00:28 localhost python3.9[108439]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:00:28 localhost systemd[1]: var-lib-containers-storage-overlay-4a7031e45817e57ce4e097b7c048a8b0b70a6545a5c4fd7f4d0095ffa431700d-merged.mount: Deactivated successfully.
Feb 1 04:00:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b67b4fb9f17e6ef820cb91bb026d1f221f3a32e0b77124b44d6744fccb85e1e-userdata-shm.mount: Deactivated successfully.
Feb 1 04:00:28 localhost systemd[1]: Reloading.
Feb 1 04:00:28 localhost systemd-rc-local-generator[108467]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:00:28 localhost systemd-sysv-generator[108473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:00:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:00:29 localhost systemd[1]: Stopping nova_virtproxyd container...
Feb 1 04:00:29 localhost systemd[1]: libpod-aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0.scope: Deactivated successfully.
Feb 1 04:00:29 localhost podman[108480]: 2026-02-01 09:00:29.18460552 +0000 UTC m=+0.091306774 container died aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, container_name=nova_virtproxyd)
Feb 1 04:00:29 localhost podman[108480]: 2026-02-01 09:00:29.220028897 +0000 UTC m=+0.126730141 container cleanup aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtproxyd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 1 04:00:29 localhost podman[108480]: nova_virtproxyd
Feb 1 04:00:29 localhost podman[108495]: 2026-02-01 09:00:29.257120165 +0000 UTC m=+0.061738616 container cleanup aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, container_name=nova_virtproxyd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:00:29 localhost systemd[1]: libpod-conmon-aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0.scope: Deactivated successfully.
Feb 1 04:00:29 localhost podman[108525]: error opening file `/run/crun/aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0/status`: No such file or directory
Feb 1 04:00:29 localhost podman[108513]: 2026-02-01 09:00:29.358191407 +0000 UTC m=+0.068312477 container cleanup aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, architecture=x86_64, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux',
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 04:00:29 localhost podman[108513]: nova_virtproxyd Feb 1 04:00:29 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Feb 1 04:00:29 localhost systemd[1]: Stopped nova_virtproxyd container. Feb 1 04:00:29 localhost systemd[1]: var-lib-containers-storage-overlay-a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb-merged.mount: Deactivated successfully. Feb 1 04:00:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aed29c613f18ca2e23c54c95855014e274000119505ee27192b22c8acb33a8a0-userdata-shm.mount: Deactivated successfully. Feb 1 04:00:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38664 DF PROTO=TCP SPT=41044 DPT=9101 SEQ=2303781836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7A3EA0000000001030307) Feb 1 04:00:30 localhost python3.9[108618]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:00:31 localhost systemd[1]: Reloading. Feb 1 04:00:31 localhost systemd-sysv-generator[108650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:00:31 localhost systemd-rc-local-generator[108645]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:00:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:00:31 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Feb 1 04:00:31 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Feb 1 04:00:31 localhost systemd[1]: Stopping nova_virtqemud container... 
Feb 1 04:00:32 localhost systemd[1]: libpod-7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd.scope: Deactivated successfully. Feb 1 04:00:32 localhost systemd[1]: libpod-7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd.scope: Consumed 2.842s CPU time. Feb 1 04:00:32 localhost podman[108659]: 2026-02-01 09:00:32.004228215 +0000 UTC m=+0.056043851 container died 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13) Feb 1 04:00:32 localhost systemd[1]: tmp-crun.GrHze1.mount: Deactivated successfully. Feb 1 04:00:32 localhost podman[108659]: 2026-02-01 09:00:32.036726093 +0000 UTC m=+0.088541759 container cleanup 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, summary=Red 
Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt) Feb 1 04:00:32 localhost podman[108659]: nova_virtqemud Feb 1 04:00:32 localhost podman[108674]: 2026-02-01 09:00:32.063723421 +0000 UTC m=+0.048105577 container cleanup 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtqemud, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64) Feb 1 
04:00:32 localhost systemd[1]: var-lib-containers-storage-overlay-292f5cf66e53af225cae7d20bbb4bd0aa8c2510f6727c4157b6c26f59d4ccd92-merged.mount: Deactivated successfully. Feb 1 04:00:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd-userdata-shm.mount: Deactivated successfully. Feb 1 04:00:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46694 DF PROTO=TCP SPT=51656 DPT=9105 SEQ=530789754 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7AFB80000000001030307) Feb 1 04:00:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 04:00:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 04:00:36 localhost podman[108692]: 2026-02-01 09:00:36.235781238 +0000 UTC m=+0.091621423 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, release=1766032510, config_id=tripleo_step4, container_name=ovn_controller, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc.) 
Feb 1 04:00:36 localhost podman[108691]: 2026-02-01 09:00:36.271909497 +0000 UTC m=+0.131105035 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 1 04:00:36 localhost podman[108692]: 2026-02-01 09:00:36.2844 +0000 UTC m=+0.140240145 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git) Feb 1 04:00:36 localhost podman[108692]: unhealthy Feb 1 04:00:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:00:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
Feb 1 04:00:36 localhost podman[108691]: 2026-02-01 09:00:36.314465044 +0000 UTC m=+0.173660562 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:00:36 localhost podman[108691]: unhealthy Feb 1 04:00:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:00:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 04:00:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38667 DF PROTO=TCP SPT=41044 DPT=9101 SEQ=2303781836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7BFB90000000001030307) Feb 1 04:00:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37789 DF PROTO=TCP SPT=49534 DPT=9882 SEQ=3262502607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7C7B80000000001030307) Feb 1 04:00:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2882 DF PROTO=TCP SPT=43902 DPT=9100 SEQ=3110840456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7DBB90000000001030307) Feb 1 04:00:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38668 DF PROTO=TCP SPT=41044 DPT=9101 SEQ=2303781836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7DFB90000000001030307) Feb 1 04:00:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:00:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:00:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:00:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:00:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2883 DF PROTO=TCP SPT=43902 DPT=9100 SEQ=3110840456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7FBB80000000001030307) Feb 1 04:00:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34526 DF PROTO=TCP SPT=58032 DPT=9102 SEQ=1460968884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE7FDB80000000001030307) Feb 1 
04:00:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44410 DF PROTO=TCP SPT=36304 DPT=9882 SEQ=1462530256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE805380000000001030307) Feb 1 04:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44411 DF PROTO=TCP SPT=36304 DPT=9882 SEQ=1462530256 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE80D380000000001030307) Feb 1 04:01:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19659 DF PROTO=TCP SPT=33678 DPT=9101 SEQ=2197233045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8191A0000000001030307) Feb 1 04:01:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19661 DF PROTO=TCP SPT=33678 DPT=9101 SEQ=2197233045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE825380000000001030307) Feb 1 04:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. Feb 1 04:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 04:01:06 localhost podman[108741]: 2026-02-01 09:01:06.473244509 +0000 UTC m=+0.075952603 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, 
container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:36:40Z) Feb 1 04:01:06 localhost podman[108741]: 2026-02-01 09:01:06.487015521 +0000 UTC m=+0.089723655 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, distribution-scope=public, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5) Feb 1 04:01:06 localhost podman[108741]: unhealthy Feb 1 04:01:06 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:01:06 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. 
Feb 1 04:01:06 localhost podman[108740]: 2026-02-01 09:01:06.528266888 +0000 UTC m=+0.136458270 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git) Feb 1 04:01:06 localhost podman[108740]: 2026-02-01 09:01:06.565728037 +0000 UTC m=+0.173919409 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:01:06 localhost podman[108740]: unhealthy Feb 1 04:01:06 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:01:06 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. 
Feb 1 04:01:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19662 DF PROTO=TCP SPT=33678 DPT=9101 SEQ=2197233045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE834F80000000001030307) Feb 1 04:01:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34527 DF PROTO=TCP SPT=58032 DPT=9102 SEQ=1460968884 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE83DB80000000001030307) Feb 1 04:01:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48918 DF PROTO=TCP SPT=47736 DPT=9100 SEQ=2516915119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE850F80000000001030307) Feb 1 04:01:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19663 DF PROTO=TCP SPT=33678 DPT=9101 SEQ=2197233045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE855B80000000001030307) Feb 1 04:01:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48919 DF PROTO=TCP SPT=47736 DPT=9100 SEQ=2516915119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE871B80000000001030307) Feb 1 04:01:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55749 DF PROTO=TCP SPT=42758 DPT=9102 SEQ=4068038442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE873B90000000001030307) Feb 1 04:01:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65124 DF PROTO=TCP SPT=49506 DPT=9882 SEQ=4225103039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE87A780000000001030307) Feb 1 04:01:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65125 DF PROTO=TCP SPT=49506 DPT=9882 SEQ=4225103039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE882780000000001030307) Feb 1 04:01:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29254 DF PROTO=TCP SPT=42290 DPT=9101 SEQ=1282570729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE88E490000000001030307) Feb 1 04:01:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29256 DF PROTO=TCP SPT=42290 DPT=9101 SEQ=1282570729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE89A380000000001030307) Feb 1 04:01:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840. 
Feb 1 04:01:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 04:01:36 localhost podman[108856]: 2026-02-01 09:01:36.721271902 +0000 UTC m=+0.073907210 container health_status f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller) Feb 1 04:01:36 localhost podman[108855]: 2026-02-01 09:01:36.702499946 +0000 UTC m=+0.057565928 container health_status f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 1 04:01:36 localhost podman[108856]: 2026-02-01 09:01:36.757771072 +0000 UTC m=+0.110406370 container exec_died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 1 04:01:36 localhost podman[108856]: unhealthy Feb 1 04:01:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:01:36 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed with result 'exit-code'. Feb 1 04:01:36 localhost podman[108855]: 2026-02-01 09:01:36.783365548 +0000 UTC m=+0.138431530 container exec_died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team) Feb 1 04:01:36 localhost podman[108855]: unhealthy Feb 1 04:01:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:01:36 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed with result 'exit-code'. Feb 1 04:01:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29257 DF PROTO=TCP SPT=42290 DPT=9101 SEQ=1282570729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8A9F80000000001030307) Feb 1 04:01:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65127 DF PROTO=TCP SPT=49506 DPT=9882 SEQ=4225103039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8B1B80000000001030307) Feb 1 04:01:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49807 DF PROTO=TCP SPT=46168 DPT=9100 SEQ=1043996219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8C5F80000000001030307) Feb 1 04:01:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29258 DF PROTO=TCP SPT=42290 DPT=9101 SEQ=1282570729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8C9B80000000001030307) Feb 1 04:01:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49808 DF PROTO=TCP SPT=46168 DPT=9100 SEQ=1043996219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8E5B90000000001030307) Feb 1 04:01:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20269 DF PROTO=TCP SPT=39728 DPT=9102 SEQ=3043040390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8E7B80000000001030307) Feb 1 04:01:54 localhost sshd[108896]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:01:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9860 DF PROTO=TCP SPT=49296 DPT=9882 SEQ=2004361796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8EFB90000000001030307) Feb 1 04:01:56 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. 
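The tripleo_nova_virtqemud stop sequence here is systemd's standard escalation: the unit did not finish stopping within its stop timeout, so systemd moves from 'stop-sigterm' to SIGKILL against the remaining conmon process in the records that follow, and the unit ends up failed with result 'timeout' even though the container is gone. If this recurs, the effective timeout can be read back from systemd; a sketch, where TimeoutStopUSec is systemd's internal name for the TimeoutStopSec= setting:

    import subprocess

    # Read the effective stop timeout of the unit that was killed above.
    out = subprocess.run(
        ["systemctl", "show", "-p", "TimeoutStopUSec",
         "tripleo_nova_virtqemud.service"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "TimeoutStopUSec=1min 30s"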
Feb 1 04:01:56 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 61280 (conmon) with signal SIGKILL. Feb 1 04:01:56 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Feb 1 04:01:56 localhost systemd[1]: libpod-conmon-7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd.scope: Deactivated successfully. Feb 1 04:01:56 localhost podman[108909]: error opening file `/run/crun/7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd/status`: No such file or directory Feb 1 04:01:56 localhost podman[108898]: 2026-02-01 09:01:56.214646262 +0000 UTC m=+0.077438467 container cleanup 7c7e08a4cb85c60fa0204fa785153b1b301e4e79b4349e5cba6808544b1889bd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtqemud, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}) Feb 1 04:01:56 localhost podman[108898]: nova_virtqemud Feb 1 04:01:56 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Feb 1 04:01:56 localhost systemd[1]: Stopped nova_virtqemud container. Feb 1 04:01:56 localhost sshd[109003]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:01:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9861 DF PROTO=TCP SPT=49296 DPT=9882 SEQ=2004361796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE8F7B80000000001030307) Feb 1 04:01:57 localhost python3.9[109002]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:01:57 localhost systemd[1]: Reloading. Feb 1 04:01:57 localhost systemd-rc-local-generator[109027]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:01:57 localhost systemd-sysv-generator[109030]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:01:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:01:58 localhost python3.9[109134]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:01:58 localhost systemd[1]: Reloading. Feb 1 04:01:58 localhost systemd-rc-local-generator[109160]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:01:58 localhost systemd-sysv-generator[109164]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:01:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:01:58 localhost systemd[1]: Stopping nova_virtsecretd container... Feb 1 04:01:58 localhost systemd[1]: libpod-8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7.scope: Deactivated successfully. 
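Each ansible-ansible.builtin.systemd_service invocation in this section (enabled=False, state=stopped) disables and stops one tripleo_* unit, and each one triggers a 'Reloading.' daemon-reload; the rc.local, SysV-generator, and MemoryLimit= warnings that follow every reload are re-emitted by the unit generators each time and are unrelated to the units being stopped. Roughly the same effect from the command line, as a sketch (unit name copied from the log):

    import subprocess

    # Equivalent of the ansible systemd_service call with
    # enabled=False, state=stopped: disable the unit and stop it now.
    subprocess.run(
        ["systemctl", "disable", "--now", "tripleo_nova_virtsecretd.service"],
        check=True,
    )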
Feb 1 04:01:58 localhost podman[109175]: 2026-02-01 09:01:58.598520953 +0000 UTC m=+0.077118698 container died 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, tcib_managed=true, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2026-01-12T23:31:49Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, version=17.1.13, container_name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 1 04:01:58 localhost podman[109175]: 2026-02-01 09:01:58.63652739 +0000 UTC m=+0.115125125 container cleanup 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5) Feb 1 04:01:58 localhost podman[109175]: nova_virtsecretd Feb 1 04:01:58 localhost podman[109188]: 2026-02-01 09:01:58.668538752 +0000 UTC m=+0.060072755 container cleanup 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=nova_virtsecretd, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}) Feb 1 04:01:58 localhost systemd[1]: libpod-conmon-8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7.scope: Deactivated successfully. 
Feb 1 04:01:58 localhost podman[109218]: error opening file `/run/crun/8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7/status`: No such file or directory Feb 1 04:01:58 localhost podman[109205]: 2026-02-01 09:01:58.766528209 +0000 UTC m=+0.065621624 container cleanup 8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_virtsecretd, release=1766032510, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 1 04:01:58 localhost 
podman[109205]: nova_virtsecretd Feb 1 04:01:58 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Feb 1 04:01:58 localhost systemd[1]: Stopped nova_virtsecretd container. Feb 1 04:01:58 localhost systemd[1]: tripleo_nova_virtsecretd.service: Consumed 1.256s CPU time, no IO. Feb 1 04:01:59 localhost python3.9[109312]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:01:59 localhost systemd[1]: tmp-crun.fWFI2F.mount: Deactivated successfully. Feb 1 04:01:59 localhost systemd[1]: var-lib-containers-storage-overlay-d138b60dde3361f4738c664e6c7084a0b0d6dcc402986826212a9583a9dc448e-merged.mount: Deactivated successfully. Feb 1 04:01:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7-userdata-shm.mount: Deactivated successfully. Feb 1 04:02:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56703 DF PROTO=TCP SPT=37098 DPT=9101 SEQ=3947447482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9037A0000000001030307) Feb 1 04:02:00 localhost systemd[1]: Reloading. Feb 1 04:02:00 localhost systemd-rc-local-generator[109336]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:02:00 localhost systemd-sysv-generator[109342]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:02:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:02:00 localhost systemd[1]: Stopping nova_virtstoraged container... Feb 1 04:02:00 localhost systemd[1]: tmp-crun.Ww1RUC.mount: Deactivated successfully. Feb 1 04:02:01 localhost systemd[1]: libpod-4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3.scope: Deactivated successfully. 
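The podman 'error opening file /run/crun/<id>/status' messages in these stop sequences appear to be a benign race: several podman processes run cleanup for the same container, and by the time the later one asks crun for status, the runtime's state directory has already been removed. The container's authoritative state can still be queried afterwards; a sketch, with the container ID copied from the records above:

    import subprocess

    cid = "8f3090553ffea08b0b7f7cc3e241f015acc18cd1db085327e676d2cad7277cd7"
    state = subprocess.run(
        ["podman", "inspect", "--format", "{{.State.Status}}", cid],
        capture_output=True, text=True,
    ).stdout.strip()
    print(state or "container no longer present")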
Feb 1 04:02:01 localhost podman[109352]: 2026-02-01 09:02:01.005358298 +0000 UTC m=+0.062694995 container died 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_virtstoraged, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step3) Feb 1 04:02:01 localhost podman[109352]: 2026-02-01 09:02:01.041140446 +0000 UTC m=+0.098477123 container cleanup 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtstoraged, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}) Feb 1 04:02:01 localhost podman[109352]: nova_virtstoraged Feb 1 04:02:01 localhost podman[109366]: 2026-02-01 09:02:01.081665631 +0000 UTC m=+0.067850915 container cleanup 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, tcib_managed=true, 
version=17.1.13, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, container_name=nova_virtstoraged, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510) Feb 1 04:02:01 localhost systemd[1]: libpod-conmon-4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3.scope: Deactivated successfully. 
Feb 1 04:02:01 localhost podman[109395]: error opening file `/run/crun/4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3/status`: No such file or directory Feb 1 04:02:01 localhost podman[109383]: 2026-02-01 09:02:01.153978639 +0000 UTC m=+0.044009631 container cleanup 4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '9ec539c069b98a16ced7663e9b12641d'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2026-01-12T23:31:49Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510) Feb 1 04:02:01 
localhost podman[109383]: nova_virtstoraged Feb 1 04:02:01 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. Feb 1 04:02:01 localhost systemd[1]: Stopped nova_virtstoraged container. Feb 1 04:02:01 localhost python3.9[109488]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:02:01 localhost systemd[1]: var-lib-containers-storage-overlay-6311ed359cf9dfbe11bbece1ce079633bae3ab687b3ceb7b053c82fb81d4d150-merged.mount: Deactivated successfully. Feb 1 04:02:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4427444ac658601ae3163db39dadeff2e8ae4ea24c61f01fc2bad2d4ac63e2c3-userdata-shm.mount: Deactivated successfully. Feb 1 04:02:02 localhost systemd[1]: Reloading. Feb 1 04:02:02 localhost systemd-rc-local-generator[109515]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:02:02 localhost systemd-sysv-generator[109518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:02:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:02:02 localhost systemd[1]: Stopping ovn_controller container... Feb 1 04:02:02 localhost systemd[1]: libpod-f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.scope: Deactivated successfully. Feb 1 04:02:02 localhost systemd[1]: libpod-f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.scope: Consumed 2.598s CPU time. 
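The var-lib-containers-storage-overlay-*.mount and *-userdata-shm.mount 'Deactivated successfully' lines are systemd tracking the overlayfs and shm mounts that podman unmounts as each container is torn down. Any that remain can be listed straight from the mount table; a sketch:

    # List live container-storage overlay mount points; the .mount messages
    # above correspond to entries like these disappearing.
    with open("/proc/self/mountinfo") as f:
        mounts = [line.split()[4] for line in f
                  if "containers/storage/overlay" in line]
    print(mounts)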
Feb 1 04:02:02 localhost podman[109529]: 2026-02-01 09:02:02.385970164 +0000 UTC m=+0.080279695 container died f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, url=https://www.redhat.com, distribution-scope=public) Feb 1 04:02:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.timer: Deactivated successfully. Feb 1 04:02:02 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446. Feb 1 04:02:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed to open /run/systemd/transient/f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: No such file or directory Feb 1 04:02:02 localhost systemd[1]: tmp-crun.7CvJ30.mount: Deactivated successfully. 
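Podman drives container healthchecks from transient systemd timer/service units named after the container ID; stopping ovn_controller tears that timer down, so the repeated 'Failed to open /run/systemd/transient/<id>.timer/.service: No such file or directory' messages are systemd noticing that the transient unit files are already gone — expected cleanup noise here rather than a new failure. A quick check, with paths taken from the messages above:

    from pathlib import Path

    cid = "f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446"
    for suffix in (".timer", ".service"):
        p = Path("/run/systemd/transient", cid + suffix)
        print(p, "exists" if p.exists() else "gone")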
Feb 1 04:02:02 localhost podman[109529]: 2026-02-01 09:02:02.444741088 +0000 UTC m=+0.139050609 container cleanup f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, release=1766032510, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 1 04:02:02 localhost podman[109529]: ovn_controller
Feb 1 04:02:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.timer: Failed to open /run/systemd/transient/f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.timer: No such file or directory
Feb 1 04:02:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed to open /run/systemd/transient/f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: No such file or directory
Feb 1 04:02:02 localhost podman[109541]: 2026-02-01 09:02:02.466553808 +0000 UTC m=+0.068859545 container cleanup f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 1 04:02:02 localhost systemd[1]: libpod-conmon-f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.scope: Deactivated successfully.
Feb 1 04:02:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.timer: Failed to open /run/systemd/transient/f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.timer: No such file or directory
Feb 1 04:02:02 localhost systemd[1]: f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: Failed to open /run/systemd/transient/f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446.service: No such file or directory
Feb 1 04:02:02 localhost podman[109555]: 2026-02-01 09:02:02.550418172 +0000 UTC m=+0.053193854 container cleanup f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 1 04:02:02 localhost podman[109555]: ovn_controller
Feb 1 04:02:02 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Feb 1 04:02:02 localhost systemd[1]: Stopped ovn_controller container.
Feb 1 04:02:02 localhost systemd[1]: var-lib-containers-storage-overlay-76a18a8a420711bccaa970c77feff6f4a517526e7e59a2e2c9c151c919c25156-merged.mount: Deactivated successfully.
Feb 1 04:02:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446-userdata-shm.mount: Deactivated successfully.
Feb 1 04:02:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56705 DF PROTO=TCP SPT=37098 DPT=9101 SEQ=3947447482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE90F780000000001030307)
Feb 1 04:02:03 localhost python3.9[109658]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:02:03 localhost systemd[1]: Reloading.
Feb 1 04:02:03 localhost systemd-rc-local-generator[109686]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:02:03 localhost systemd-sysv-generator[109689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:02:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:02:03 localhost systemd[1]: Stopping ovn_metadata_agent container...
Feb 1 04:02:04 localhost systemd[1]: libpod-f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.scope: Deactivated successfully.
Feb 1 04:02:04 localhost systemd[1]: libpod-f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.scope: Consumed 11.833s CPU time.
Feb 1 04:02:04 localhost podman[109699]: 2026-02-01 09:02:04.747200681 +0000 UTC m=+1.062686550 container stop f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 1 04:02:04 localhost podman[109699]: 2026-02-01 09:02:04.780202894 +0000 UTC m=+1.095688783 container died f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 1 04:02:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.timer: Deactivated successfully.
Feb 1 04:02:04 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.
Feb 1 04:02:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed to open /run/systemd/transient/f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: No such file or directory
Feb 1 04:02:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840-userdata-shm.mount: Deactivated successfully.
Feb 1 04:02:04 localhost systemd[1]: var-lib-containers-storage-overlay-055744583dc2f819841e567440534be33df8561cb3313cfcbdf707f0c0d0f4f2-merged.mount: Deactivated successfully.
Feb 1 04:02:04 localhost podman[109699]: 2026-02-01 09:02:04.845182688 +0000 UTC m=+1.160668527 container cleanup f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, tcib_managed=true, release=1766032510, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 1 04:02:04 localhost podman[109699]: ovn_metadata_agent
Feb 1 04:02:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.timer: Failed to open /run/systemd/transient/f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.timer: No such file or directory
Feb 1 04:02:04 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed to open /run/systemd/transient/f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: No such file or directory
Feb 1 04:02:04 localhost podman[109713]: 2026-02-01 09:02:04.867942827 +0000 UTC m=+0.104107457 container cleanup f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13)
Feb 1 04:02:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56706 DF PROTO=TCP SPT=37098 DPT=9101 SEQ=3947447482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE91F380000000001030307)
Feb 1 04:02:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20270 DF PROTO=TCP SPT=39728 DPT=9102 SEQ=3043040390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE927B80000000001030307)
Feb 1 04:02:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42859 DF PROTO=TCP SPT=36796 DPT=9100 SEQ=1186366697 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE93B380000000001030307)
Feb 1 04:02:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56707 DF PROTO=TCP SPT=37098 DPT=9101 SEQ=3947447482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE93FB90000000001030307)
Feb 1 04:02:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42860 DF PROTO=TCP SPT=36796 DPT=9100 SEQ=1186366697 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE95BB80000000001030307)
Feb 1 04:02:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3753 DF PROTO=TCP SPT=59750 DPT=9102 SEQ=3190982627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE95DB80000000001030307)
Feb 1 04:02:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12816 DF PROTO=TCP SPT=36772 DPT=9882 SEQ=3834126557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE964F80000000001030307)
Feb 1 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12817 DF PROTO=TCP SPT=36772 DPT=9882 SEQ=3834126557 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE96CF80000000001030307)
Feb 1 04:02:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18969 DF PROTO=TCP SPT=37340 DPT=9101 SEQ=948677429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE97A700000000001030307)
Feb 1 04:02:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64344 DF PROTO=TCP SPT=58412 DPT=9105 SEQ=3765459792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE985B80000000001030307)
Feb 1 04:02:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35181 DF PROTO=TCP SPT=42830 DPT=9100 SEQ=1080974322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE994A70000000001030307)
Feb 1 04:02:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42861 DF PROTO=TCP SPT=36796 DPT=9100 SEQ=1186366697 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE99BB80000000001030307)
Feb 1 04:02:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35184 DF PROTO=TCP SPT=42830 DPT=9100 SEQ=1080974322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9B0780000000001030307)
Feb 1 04:02:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18973 DF PROTO=TCP SPT=37340 DPT=9101 SEQ=948677429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9B5B80000000001030307)
Feb 1 04:02:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46198 DF PROTO=TCP SPT=49814 DPT=9102 SEQ=352106608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9D1B80000000001030307)
Feb 1 04:02:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35185 DF PROTO=TCP SPT=42830 DPT=9100 SEQ=1080974322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9D1B90000000001030307)
Feb 1 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16943 DF PROTO=TCP SPT=34258 DPT=9882 SEQ=3255767736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9D9F80000000001030307)
Feb 1 04:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16944 DF PROTO=TCP SPT=34258 DPT=9882 SEQ=3255767736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9E1F80000000001030307)
Feb 1 04:03:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52720 DF PROTO=TCP SPT=38424 DPT=9101 SEQ=4146700762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9EDDA0000000001030307)
Feb 1 04:03:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37869 DF PROTO=TCP SPT=39200 DPT=9105 SEQ=1222885966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CE9F9B80000000001030307)
Feb 1 04:03:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52723 DF PROTO=TCP SPT=38424 DPT=9101 SEQ=4146700762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA09B80000000001030307)
Feb 1 04:03:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16946 DF PROTO=TCP SPT=34258 DPT=9882 SEQ=3255767736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA11B80000000001030307)
Feb 1 04:03:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6032 DF PROTO=TCP SPT=39708 DPT=9100 SEQ=1208248993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA25C80000000001030307)
Feb 1 04:03:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52724 DF PROTO=TCP SPT=38424 DPT=9101 SEQ=4146700762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA29B80000000001030307)
Feb 1 04:03:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6033 DF PROTO=TCP SPT=39708 DPT=9100 SEQ=1208248993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA45B80000000001030307)
Feb 1 04:03:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56967 DF PROTO=TCP SPT=60344 DPT=9102 SEQ=3314833238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA47B80000000001030307)
Feb 1 04:03:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22991 DF PROTO=TCP SPT=39384 DPT=9882 SEQ=1880990385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA4F390000000001030307)
Feb 1 04:03:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22992 DF PROTO=TCP SPT=39384 DPT=9882 SEQ=1880990385 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA57390000000001030307)
Feb 1 04:03:28 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Feb 1 04:03:28 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 68822 (conmon) with signal SIGKILL.
Feb 1 04:03:28 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Feb 1 04:03:28 localhost systemd[1]: libpod-conmon-f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.scope: Deactivated successfully.
Feb 1 04:03:28 localhost podman[109820]: error opening file `/run/crun/f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840/status`: No such file or directory
Feb 1 04:03:28 localhost systemd[1]: tmp-crun.jXX5pp.mount: Deactivated successfully.
Feb 1 04:03:28 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.timer: Failed to open /run/systemd/transient/f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.timer: No such file or directory
Feb 1 04:03:28 localhost systemd[1]: f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: Failed to open /run/systemd/transient/f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840.service: No such file or directory
Feb 1 04:03:28 localhost podman[109807]: 2026-02-01 09:03:28.985833878 +0000 UTC m=+0.086335044 container cleanup f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 1 04:03:28 localhost podman[109807]: ovn_metadata_agent
Feb 1 04:03:28 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Feb 1 04:03:28 localhost systemd[1]: Stopped ovn_metadata_agent container.
Feb 1 04:03:29 localhost python3.9[109914]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:03:29 localhost systemd[1]: Reloading.
Feb 1 04:03:29 localhost systemd-rc-local-generator[109941]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:03:29 localhost systemd-sysv-generator[109946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:03:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:03:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56291 DF PROTO=TCP SPT=60538 DPT=9101 SEQ=3383009121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA630A0000000001030307)
Feb 1 04:03:31 localhost python3.9[110123]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:32 localhost python3.9[110248]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:33 localhost python3.9[110355]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56293 DF PROTO=TCP SPT=60538 DPT=9101 SEQ=3383009121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA6EF80000000001030307)
Feb 1 04:03:33 localhost python3.9[110447]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:34 localhost python3.9[110539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:34 localhost python3.9[110631]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:35 localhost python3.9[110723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:36 localhost python3.9[110815]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:36 localhost python3.9[110907]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56294 DF PROTO=TCP SPT=60538 DPT=9101 SEQ=3383009121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA7EB80000000001030307)
Feb 1 04:03:37 localhost python3.9[110999]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:37 localhost python3.9[111091]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:38 localhost python3.9[111183]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:39 localhost python3.9[111275]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56968 DF PROTO=TCP SPT=60344 DPT=9102 SEQ=3314833238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA87B80000000001030307)
Feb 1 04:03:39 localhost python3.9[111367]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:40 localhost python3.9[111459]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:40 localhost python3.9[111551]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:41 localhost python3.9[111643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:41 localhost python3.9[111735]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:42 localhost python3.9[111827]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:43 localhost python3.9[111919]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:43 localhost python3.9[112011]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54172 DF PROTO=TCP SPT=36148 DPT=9100 SEQ=2118010009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA9AB90000000001030307)
Feb 1 04:03:44 localhost python3.9[112103]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:45 localhost python3.9[112195]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56295 DF PROTO=TCP SPT=60538 DPT=9101 SEQ=3383009121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEA9FB80000000001030307)
Feb 1 04:03:45 localhost python3.9[112287]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:46 localhost python3.9[112379]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:46 localhost python3.9[112471]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:47 localhost python3.9[112563]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:48 localhost python3.9[112655]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:48 localhost python3.9[112747]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:49 localhost python3.9[112839]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:50 localhost python3.9[112931]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:50 localhost python3.9[113023]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:51 localhost python3.9[113115]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:52 localhost python3.9[113207]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:52 localhost python3.9[113299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54173 DF PROTO=TCP SPT=36148 DPT=9100 SEQ=2118010009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEABBB90000000001030307)
Feb 1 04:03:53 localhost python3.9[113391]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60509 DF PROTO=TCP SPT=49614 DPT=9102 SEQ=1091301500 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEABDB80000000001030307)
Feb 1 04:03:53 localhost python3.9[113483]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:54 localhost python3.9[113575]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49690 DF PROTO=TCP SPT=58832 DPT=9882 SEQ=1517298920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEAC4780000000001030307)
Feb 1 04:03:55 localhost python3.9[113667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:55 localhost python3.9[113759]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:56 localhost python3.9[113851]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:56 localhost python3.9[113943]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:03:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49691 DF PROTO=TCP SPT=58832 DPT=9882 SEQ=1517298920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEACC780000000001030307)
Feb 1 04:03:57 localhost python3.9[114035]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:03:58 localhost python3.9[114127]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 1 04:03:59 localhost python3.9[114219]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 1 04:03:59 localhost systemd[1]: Reloading.
Feb 1 04:03:59 localhost systemd-rc-local-generator[114242]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:03:59 localhost systemd-sysv-generator[114247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:03:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:04:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27213 DF PROTO=TCP SPT=37344 DPT=9101 SEQ=594120386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEAD83A0000000001030307)
Feb 1 04:04:00 localhost python3.9[114347]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:01 localhost python3.9[114440]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:02 localhost python3.9[114533]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27215 DF PROTO=TCP SPT=37344 DPT=9101 SEQ=594120386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEAE4380000000001030307)
Feb 1 04:04:03 localhost python3.9[114626]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:04 localhost python3.9[114719]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:04 localhost python3.9[114812]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:05 localhost python3.9[114905]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:05 localhost python3.9[114998]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:06 localhost python3.9[115091]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27216 DF PROTO=TCP SPT=37344 DPT=9101 SEQ=594120386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEAF3F80000000001030307)
Feb 1 04:04:07 localhost python3.9[115184]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:07 localhost python3.9[115277]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:08 localhost python3.9[115370]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:09 localhost python3.9[115463]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49693 DF PROTO=TCP SPT=58832 DPT=9882
SEQ=1517298920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEAFBB80000000001030307) Feb 1 04:04:09 localhost python3.9[115556]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:11 localhost python3.9[115650]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:12 localhost sshd[115711]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:04:12 localhost python3.9[115745]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:13 localhost python3.9[115838]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:13 localhost python3.9[115931]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46626 DF PROTO=TCP SPT=52704 DPT=9100 SEQ=3766554904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB0FF80000000001030307) Feb 1 04:04:14 localhost python3.9[116024]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:15 localhost python3.9[116117]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27217 DF PROTO=TCP SPT=37344 DPT=9101 SEQ=594120386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB13B90000000001030307) Feb 1 04:04:15 localhost python3.9[116210]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 
04:04:16 localhost systemd[1]: session-37.scope: Deactivated successfully. Feb 1 04:04:16 localhost systemd[1]: session-37.scope: Consumed 49.372s CPU time. Feb 1 04:04:16 localhost systemd-logind[759]: Session 37 logged out. Waiting for processes to exit. Feb 1 04:04:16 localhost systemd-logind[759]: Removed session 37. Feb 1 04:04:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46627 DF PROTO=TCP SPT=52704 DPT=9100 SEQ=3766554904 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB2FB80000000001030307) Feb 1 04:04:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48123 DF PROTO=TCP SPT=52916 DPT=9102 SEQ=3333849457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB31B80000000001030307) Feb 1 04:04:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29584 DF PROTO=TCP SPT=60014 DPT=9882 SEQ=72077344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB39B90000000001030307) Feb 1 04:04:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29585 DF PROTO=TCP SPT=60014 DPT=9882 SEQ=72077344 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB41B80000000001030307) Feb 1 04:04:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40948 DF PROTO=TCP SPT=58626 DPT=9101 SEQ=2404194865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB4D6B0000000001030307) Feb 1 04:04:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40950 DF PROTO=TCP SPT=58626 DPT=9101 SEQ=2404194865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB59780000000001030307) Feb 1 04:04:35 localhost sshd[116303]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:04:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40951 DF PROTO=TCP SPT=58626 DPT=9101 SEQ=2404194865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB69390000000001030307) Feb 1 04:04:37 localhost sshd[116305]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:04:37 localhost systemd-logind[759]: New session 38 of user zuul. Feb 1 04:04:37 localhost systemd[1]: Started Session 38 of User zuul. 
Feb 1 04:04:38 localhost python3.9[116398]: ansible-ansible.legacy.ping Invoked with data=pong Feb 1 04:04:39 localhost python3.9[116502]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:04:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48124 DF PROTO=TCP SPT=52916 DPT=9102 SEQ=3333849457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB71B80000000001030307) Feb 1 04:04:40 localhost python3.9[116594]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:41 localhost python3.9[116687]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:04:42 localhost python3.9[116779]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:04:42 localhost python3.9[116871]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:04:43 localhost python3.9[116944]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936682.4082391-172-19643620764292/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:04:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27735 DF PROTO=TCP SPT=38248 DPT=9100 SEQ=2724672282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB85380000000001030307) Feb 1 04:04:44 localhost python3.9[117036]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:04:45 localhost python3.9[117132]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:04:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40952 DF PROTO=TCP SPT=58626 DPT=9101 SEQ=2404194865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEB89B80000000001030307) Feb 1 04:04:46 localhost python3.9[117224]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:04:46 localhost python3.9[117314]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:04:47 localhost network[117331]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:04:47 localhost network[117332]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:04:47 localhost network[117333]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:04:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:04:52 localhost python3.9[117530]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:04:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27736 DF PROTO=TCP SPT=38248 DPT=9100 SEQ=2724672282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBA5B90000000001030307) Feb 1 04:04:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45312 DF PROTO=TCP SPT=51698 DPT=9102 SEQ=2584165214 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBA7B90000000001030307) Feb 1 04:04:53 localhost python3.9[117620]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:04:54 localhost python3.9[117716]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012# FIXME: perform dnf upgrade for other 
packages in EDPM ansible#012# here we only ensuring that decontainerized libvirt can start#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:04:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34803 DF PROTO=TCP SPT=40756 DPT=9882 SEQ=2781716672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBAEB80000000001030307) Feb 1 04:04:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34804 DF PROTO=TCP SPT=40756 DPT=9882 SEQ=2781716672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBB6B80000000001030307) Feb 1 04:05:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31907 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=2869498672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBC29A0000000001030307) Feb 1 04:05:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31909 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=2869498672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBCEB90000000001030307) Feb 1 04:05:03 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 1 04:05:03 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 1 04:05:03 localhost systemd[1]: Stopped OpenSSH server daemon. Feb 1 04:05:03 localhost systemd[1]: sshd.service: Consumed 2.569s CPU time. Feb 1 04:05:03 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 1 04:05:03 localhost systemd[1]: Stopping sshd-keygen.target... Feb 1 04:05:03 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:05:03 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:05:03 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:05:03 localhost systemd[1]: Reached target sshd-keygen.target. Feb 1 04:05:03 localhost systemd[1]: Starting OpenSSH server daemon... Feb 1 04:05:03 localhost sshd[117759]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:05:03 localhost systemd[1]: Started OpenSSH server daemon. Feb 1 04:05:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 04:05:04 localhost systemd[1]: Starting man-db-cache-update.service... Feb 1 04:05:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 04:05:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 04:05:04 localhost systemd[1]: Finished man-db-cache-update.service. 
Feb 1 04:05:04 localhost systemd[1]: run-r007f8c9f749b40e6821dc331c7f3d3a9.service: Deactivated successfully. Feb 1 04:05:04 localhost systemd[1]: run-r2f34015aad1f42c6a1efe54eefd37a53.service: Deactivated successfully. Feb 1 04:05:05 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 1 04:05:05 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 1 04:05:05 localhost systemd[1]: Stopped OpenSSH server daemon. Feb 1 04:05:05 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 1 04:05:05 localhost systemd[1]: Stopping sshd-keygen.target... Feb 1 04:05:05 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:05:05 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:05:05 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:05:05 localhost systemd[1]: Reached target sshd-keygen.target. Feb 1 04:05:05 localhost systemd[1]: Starting OpenSSH server daemon... Feb 1 04:05:05 localhost sshd[118161]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:05:05 localhost systemd[1]: Started OpenSSH server daemon. Feb 1 04:05:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31910 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=2869498672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBDE780000000001030307) Feb 1 04:05:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27737 DF PROTO=TCP SPT=38248 DPT=9100 SEQ=2724672282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBE5B90000000001030307) Feb 1 04:05:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20048 DF PROTO=TCP SPT=48926 DPT=9100 SEQ=1867079092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBFA780000000001030307) Feb 1 04:05:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31911 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=2869498672 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEBFFB80000000001030307) Feb 1 04:05:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20049 DF PROTO=TCP SPT=48926 DPT=9100 SEQ=1867079092 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC1BB80000000001030307) Feb 1 04:05:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43813 DF PROTO=TCP SPT=54728 DPT=9102 SEQ=1154169737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC1BBE0000000001030307) Feb 1 04:05:24 localhost 
kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45747 DF PROTO=TCP SPT=55890 DPT=9882 SEQ=1563103752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC23F80000000001030307) Feb 1 04:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45748 DF PROTO=TCP SPT=55890 DPT=9882 SEQ=1563103752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC2BF80000000001030307) Feb 1 04:05:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55465 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=4055472479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC37CA0000000001030307) Feb 1 04:05:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55467 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=4055472479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC43B90000000001030307) Feb 1 04:05:35 localhost systemd[1]: tmp-crun.Ekq90y.mount: Deactivated successfully. Feb 1 04:05:35 localhost podman[118399]: 2026-02-01 09:05:35.2742073 +0000 UTC m=+0.076463870 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, release=1764794109, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Feb 1 04:05:35 localhost podman[118399]: 2026-02-01 09:05:35.353673071 +0000 UTC m=+0.155929671 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:05:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55468 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=4055472479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC53780000000001030307) Feb 1 04:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43814 DF PROTO=TCP SPT=54728 DPT=9102 SEQ=1154169737 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC5BB80000000001030307) Feb 1 04:05:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47938 DF PROTO=TCP SPT=55908 DPT=9100 SEQ=6365235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC6F780000000001030307) Feb 1 04:05:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55469 DF PROTO=TCP SPT=38862 DPT=9101 SEQ=4055472479 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC73B80000000001030307) Feb 1 04:05:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47939 DF PROTO=TCP SPT=55908 DPT=9100 SEQ=6365235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC8FB80000000001030307) Feb 1 04:05:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16904 DF PROTO=TCP SPT=53888 DPT=9102 SEQ=2748047800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC91B80000000001030307) Feb 1 04:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31825 DF PROTO=TCP SPT=39526 DPT=9882 SEQ=947318048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEC99380000000001030307) Feb 1 04:05:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31826 DF PROTO=TCP SPT=39526 DPT=9882 SEQ=947318048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECA1380000000001030307) Feb 1 04:06:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64663 DF PROTO=TCP SPT=41712 DPT=9101 SEQ=1170987183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECACFA0000000001030307) Feb 1 04:06:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64665 DF PROTO=TCP SPT=41712 DPT=9101 SEQ=1170987183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECB8F80000000001030307) Feb 1 04:06:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64666 DF PROTO=TCP SPT=41712 DPT=9101 SEQ=1170987183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECC8B90000000001030307) Feb 1 04:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31828 DF PROTO=TCP SPT=39526 DPT=9882 SEQ=947318048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECD1B80000000001030307) Feb 1 04:06:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3419 DF PROTO=TCP SPT=60788 DPT=9100 SEQ=2323616261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECE4B80000000001030307) Feb 1 04:06:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64667 DF PROTO=TCP SPT=41712 DPT=9101 SEQ=1170987183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CECE9B80000000001030307) Feb 1 04:06:18 localhost kernel: SELinux: Converting 2755 SID table entries... 
Feb 1 04:06:18 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:06:18 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:06:18 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:06:18 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:06:18 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:06:18 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:06:18 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:06:19 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=17 res=1 Feb 1 04:06:20 localhost python3.9[119038]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:06:20 localhost python3.9[119130]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:06:21 localhost python3.9[119203]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936780.289093-421-268822422191623/.source.fact _original_basename=.fmghfp4q follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:06:22 localhost python3.9[119293]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:06:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3420 DF PROTO=TCP SPT=60788 DPT=9100 SEQ=2323616261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED05B80000000001030307) Feb 1 04:06:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9320 DF PROTO=TCP SPT=47850 DPT=9102 SEQ=4141590421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED07B90000000001030307) Feb 1 04:06:23 localhost python3.9[119391]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:06:24 localhost python3.9[119445]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ 
install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:06:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24339 DF PROTO=TCP SPT=36138 DPT=9882 SEQ=3184006232 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED0E780000000001030307) Feb 1 04:06:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24340 DF PROTO=TCP SPT=36138 DPT=9882 SEQ=3184006232 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED16780000000001030307) Feb 1 04:06:28 localhost systemd[1]: Reloading. Feb 1 04:06:28 localhost systemd-sysv-generator[119486]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:06:28 localhost systemd-rc-local-generator[119480]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:06:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:06:28 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 04:06:30 localhost python3.9[119584]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:06:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61967 DF PROTO=TCP SPT=44388 DPT=9101 SEQ=1350669348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED222C0000000001030307) Feb 1 04:06:30 localhost sshd[119732]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:06:31 localhost python3.9[119825]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Feb 1 04:06:32 localhost python3.9[119917]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Feb 1 04:06:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61969 DF PROTO=TCP SPT=44388 DPT=9101 SEQ=1350669348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED2E390000000001030307) Feb 1 04:06:34 localhost python3.9[120010]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:06:34 localhost python3.9[120102]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Feb 1 04:06:36 localhost python3.9[120194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:06:37 localhost python3.9[120286]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:06:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61970 DF PROTO=TCP SPT=44388 DPT=9101 SEQ=1350669348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED3DF90000000001030307) Feb 1 04:06:37 localhost python3.9[120389]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769936796.6977513-745-256507034560944/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:06:38 localhost python3.9[120527]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3421 DF PROTO=TCP SPT=60788 DPT=9100 SEQ=2323616261 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED45B80000000001030307) Feb 1 04:06:40 localhost python3.9[120621]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Feb 1 04:06:41 localhost python3.9[120714]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Feb 1 04:06:42 localhost python3.9[120807]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 1 04:06:42 localhost python3.9[120905]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Feb 1 04:06:43 localhost python3.9[120997]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:06:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1303 DF PROTO=TCP SPT=48784 DPT=9100 SEQ=642066716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED59F90000000001030307) Feb 1 04:06:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61971 DF PROTO=TCP SPT=44388 DPT=9101 SEQ=1350669348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED5DB90000000001030307) Feb 1 04:06:47 localhost python3.9[121091]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:06:52 localhost python3.9[121183]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:06:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1304 DF PROTO=TCP SPT=48784 DPT=9100 SEQ=642066716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED79B80000000001030307) Feb 1 04:06:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60793 DF PROTO=TCP SPT=36140 DPT=9102 SEQ=2466556096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED7BB80000000001030307) Feb 1 04:06:54 localhost python3.9[121256]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769936811.7019932-1018-215712111740943/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 1 04:06:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53426 DF PROTO=TCP SPT=37432 DPT=9882 
SEQ=3178602989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED83780000000001030307) Feb 1 04:06:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53427 DF PROTO=TCP SPT=37432 DPT=9882 SEQ=3178602989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED8B790000000001030307) Feb 1 04:06:58 localhost python3.9[121348]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:06:58 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 1 04:06:58 localhost systemd[1]: Stopped Load Kernel Modules. Feb 1 04:06:58 localhost systemd[1]: Stopping Load Kernel Modules... Feb 1 04:06:58 localhost systemd[1]: Starting Load Kernel Modules... Feb 1 04:06:58 localhost systemd-modules-load[121352]: Module 'msr' is built in Feb 1 04:06:58 localhost systemd[1]: Finished Load Kernel Modules. Feb 1 04:06:59 localhost python3.9[121444]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:06:59 localhost python3.9[121517]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769936818.5631897-1087-225651924376795/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 1 04:07:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11931 DF PROTO=TCP SPT=58204 DPT=9101 SEQ=16231282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CED975A0000000001030307) Feb 1 04:07:00 localhost python3.9[121609]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:07:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11933 DF PROTO=TCP SPT=58204 DPT=9101 SEQ=16231282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDA3780000000001030307) Feb 1 04:07:05 localhost python3.9[121701]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:07:06 localhost python3.9[121793]: ansible-ansible.builtin.slurp Invoked with 
src=/etc/tuned/active_profile Feb 1 04:07:06 localhost python3.9[121883]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:07:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11934 DF PROTO=TCP SPT=58204 DPT=9101 SEQ=16231282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDB3380000000001030307) Feb 1 04:07:07 localhost python3.9[121975]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:07:08 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Feb 1 04:07:08 localhost systemd[1]: tuned.service: Deactivated successfully. Feb 1 04:07:08 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Feb 1 04:07:08 localhost systemd[1]: tuned.service: Consumed 2.015s CPU time, no IO. Feb 1 04:07:08 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 1 04:07:09 localhost systemd[1]: Started Dynamic System Tuning Daemon. Feb 1 04:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60794 DF PROTO=TCP SPT=36140 DPT=9102 SEQ=2466556096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDBBB80000000001030307) Feb 1 04:07:10 localhost python3.9[122077]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Feb 1 04:07:13 localhost python3.9[122169]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:07:13 localhost systemd[1]: Reloading. Feb 1 04:07:13 localhost systemd-rc-local-generator[122196]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:07:13 localhost systemd-sysv-generator[122200]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:07:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:07:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26821 DF PROTO=TCP SPT=55980 DPT=9100 SEQ=914901476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDCF390000000001030307) Feb 1 04:07:14 localhost sshd[122296]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:07:14 localhost python3.9[122299]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:07:14 localhost systemd[1]: Reloading. Feb 1 04:07:14 localhost systemd-rc-local-generator[122324]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 1 04:07:14 localhost systemd-sysv-generator[122329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:07:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:07:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11935 DF PROTO=TCP SPT=58204 DPT=9101 SEQ=16231282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDD3B90000000001030307) Feb 1 04:07:16 localhost python3.9[122430]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:07:16 localhost python3.9[122523]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:07:16 localhost kernel: Adding 1048572k swap on /swap. Priority:-2 extents:1 across:1048572k FS Feb 1 04:07:17 localhost python3.9[122616]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:07:19 localhost python3.9[122715]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:07:20 localhost python3.9[122808]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:07:20 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 1 04:07:20 localhost systemd[1]: Stopped Apply Kernel Variables. Feb 1 04:07:20 localhost systemd[1]: Stopping Apply Kernel Variables... Feb 1 04:07:20 localhost systemd[1]: Starting Apply Kernel Variables... Feb 1 04:07:20 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 1 04:07:20 localhost systemd[1]: Finished Apply Kernel Variables. Feb 1 04:07:20 localhost systemd[1]: session-38.scope: Deactivated successfully. Feb 1 04:07:20 localhost systemd[1]: session-38.scope: Consumed 2min 662ms CPU time. Feb 1 04:07:20 localhost systemd-logind[759]: Session 38 logged out. Waiting for processes to exit. Feb 1 04:07:20 localhost systemd-logind[759]: Removed session 38. 
Feb 1 04:07:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26822 DF PROTO=TCP SPT=55980 DPT=9100 SEQ=914901476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDEFB90000000001030307) Feb 1 04:07:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56749 DF PROTO=TCP SPT=40574 DPT=9102 SEQ=217585418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDF1B80000000001030307) Feb 1 04:07:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16189 DF PROTO=TCP SPT=54578 DPT=9882 SEQ=71714546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEDF8B80000000001030307) Feb 1 04:07:25 localhost sshd[122829]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:07:25 localhost systemd-logind[759]: New session 39 of user zuul. Feb 1 04:07:25 localhost systemd[1]: Started Session 39 of User zuul. Feb 1 04:07:26 localhost python3.9[122922]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:07:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16190 DF PROTO=TCP SPT=54578 DPT=9882 SEQ=71714546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE00B80000000001030307) Feb 1 04:07:27 localhost python3.9[123016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:07:29 localhost python3.9[123112]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:07:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28665 DF PROTO=TCP SPT=36410 DPT=9101 SEQ=719771960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE0C8A0000000001030307) Feb 1 04:07:30 localhost python3.9[123203]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:07:31 localhost python3.9[123299]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:07:32 localhost python3.9[123353]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None 
nobest=None releasever=None Feb 1 04:07:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28667 DF PROTO=TCP SPT=36410 DPT=9101 SEQ=719771960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE18780000000001030307) Feb 1 04:07:36 localhost python3.9[123447]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:07:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28668 DF PROTO=TCP SPT=36410 DPT=9101 SEQ=719771960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE28380000000001030307) Feb 1 04:07:37 localhost python3.9[123602]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:07:38 localhost python3.9[123694]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:07:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26823 DF PROTO=TCP SPT=55980 DPT=9100 SEQ=914901476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE2FB90000000001030307) Feb 1 04:07:39 localhost python3.9[123845]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:07:39 localhost python3.9[123907]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:07:40 localhost python3.9[124014]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:07:41 localhost python3.9[124087]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769936860.1318429-318-114337589130682/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 1 04:07:42 localhost python3.9[124179]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:07:42 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Feb 1 04:07:42 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 04:07:42 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:07:42 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:07:42 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:07:42 localhost python3.9[124272]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:07:43 localhost python3.9[124364]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:07:44 localhost python3.9[124456]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:07:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41093 DF PROTO=TCP SPT=36432 DPT=9100 SEQ=643976914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE44380000000001030307) Feb 1 04:07:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28669 DF PROTO=TCP SPT=36410 DPT=9101 SEQ=719771960 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE47B90000000001030307) Feb 1 04:07:45 localhost 
python3.9[124546]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:07:46 localhost python3.9[124640]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:07:50 localhost python3.9[124734]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:07:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41094 DF PROTO=TCP SPT=36432 DPT=9100 SEQ=643976914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE63B90000000001030307) Feb 1 04:07:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25012 DF PROTO=TCP SPT=41984 DPT=9102 SEQ=571484889 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE65B80000000001030307) Feb 1 04:07:54 localhost python3.9[124828]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:07:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43598 DF PROTO=TCP SPT=54140 DPT=9882 SEQ=757400563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE6DF80000000001030307) Feb 1 04:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43599 DF PROTO=TCP SPT=54140 DPT=9882 SEQ=757400563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A1CEE75F80000000001030307) Feb 1 04:07:58 localhost python3.9[124928]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:08:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49076 DF PROTO=TCP SPT=58042 DPT=9101 SEQ=1167215950 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE81BA0000000001030307) Feb 1 04:08:02 localhost python3.9[125022]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:08:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19441 DF PROTO=TCP SPT=45158 DPT=9105 SEQ=3758628857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE8DB80000000001030307) Feb 1 04:08:06 localhost python3.9[125116]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:08:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49079 DF PROTO=TCP SPT=58042 DPT=9101 SEQ=1167215950 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEE9D790000000001030307) Feb 1 04:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43601 DF PROTO=TCP SPT=54140 DPT=9882 SEQ=757400563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEA5B80000000001030307) Feb 1 04:08:11 localhost python3.9[125210]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False 
autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:08:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19905 DF PROTO=TCP SPT=47028 DPT=9100 SEQ=341365613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEB9780000000001030307) Feb 1 04:08:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49080 DF PROTO=TCP SPT=58042 DPT=9101 SEQ=1167215950 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEBDB80000000001030307) Feb 1 04:08:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19906 DF PROTO=TCP SPT=47028 DPT=9100 SEQ=341365613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEED9B80000000001030307) Feb 1 04:08:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19296 DF PROTO=TCP SPT=38398 DPT=9102 SEQ=2511193258 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEDBB90000000001030307) Feb 1 04:08:23 localhost python3.9[125376]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:08:24 localhost python3.9[125481]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:08:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14376 DF PROTO=TCP SPT=57600 DPT=9882 SEQ=3806161460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEE3380000000001030307) Feb 1 04:08:25 localhost python3.9[125554]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769936904.0325449-723-112515449416344/.source.json _original_basename=.ctc0vj9c follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:08:26 localhost python3.9[125646]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json 
name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 1 04:08:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14377 DF PROTO=TCP SPT=57600 DPT=9882 SEQ=3806161460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEEB380000000001030307) Feb 1 04:08:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45087 DF PROTO=TCP SPT=52226 DPT=9101 SEQ=3703711821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEEF6EB0000000001030307) Feb 1 04:08:32 localhost podman[125658]: 2026-02-01 09:08:26.207332719 +0000 UTC m=+0.045895671 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 1 04:08:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45089 DF PROTO=TCP SPT=52226 DPT=9101 SEQ=3703711821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF02F80000000001030307) Feb 1 04:08:33 localhost python3.9[125859]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 1 04:08:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45090 DF PROTO=TCP SPT=52226 DPT=9101 SEQ=3703711821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF12B80000000001030307) Feb 1 04:08:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19907 DF PROTO=TCP SPT=47028 DPT=9100 SEQ=341365613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF19B80000000001030307) Feb 1 04:08:40 localhost podman[125872]: 2026-02-01 09:08:33.3579347 +0000 UTC m=+0.044061694 image pull 
quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 1 04:08:42 localhost python3.9[126143]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 1 04:08:44 localhost podman[126157]: 2026-02-01 09:08:42.681085055 +0000 UTC m=+0.044602560 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Feb 1 04:08:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52880 DF PROTO=TCP SPT=42362 DPT=9100 SEQ=1706049567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF2EB80000000001030307) Feb 1 04:08:45 localhost python3.9[126322]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 1 04:08:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45091 DF PROTO=TCP SPT=52226 DPT=9101 SEQ=3703711821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF33B80000000001030307) Feb 1 04:08:47 localhost podman[126334]: 2026-02-01 09:08:45.544942507 +0000 UTC m=+0.046798979 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:08:48 localhost python3.9[126496]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None 
password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 1 04:08:49 localhost sshd[126522]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:08:51 localhost podman[126508]: 2026-02-01 09:08:48.56970651 +0000 UTC m=+0.033064577 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Feb 1 04:08:52 localhost python3.9[126690]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 1 04:08:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47502 DF PROTO=TCP SPT=52076 DPT=9102 SEQ=1679828955 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF4FB80000000001030307) Feb 1 04:08:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52881 DF PROTO=TCP SPT=42362 DPT=9100 SEQ=1706049567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF4FB80000000001030307) Feb 1 04:08:54 localhost podman[126703]: 2026-02-01 09:08:52.784016696 +0000 UTC m=+0.059908852 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Feb 1 04:08:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34934 DF PROTO=TCP SPT=48700 DPT=9882 SEQ=7629566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF58390000000001030307) Feb 1 04:08:55 localhost systemd-logind[759]: Session 39 logged out. Waiting for processes to exit. Feb 1 04:08:55 localhost systemd[1]: session-39.scope: Deactivated successfully. Feb 1 04:08:55 localhost systemd[1]: session-39.scope: Consumed 1min 29.434s CPU time. Feb 1 04:08:55 localhost systemd-logind[759]: Removed session 39. 
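Annotation: the podman_image tasks above pull the openstack-* images by tag and node-exporter by sha256 digest, all through /root/.config/containers/auth.json. That file's recorded checksum (bf21a9e8...) is the SHA-1 of the literal two-byte string "{}", i.e. it was written with no credentials at all. A CLI-level sketch of the same pulls; --authfile and the inspect template are standard podman usage, and the final digest check is illustration only, not part of the recorded run:

    #!/usr/bin/env python3
    """CLI-level sketch of the image pulls above. Image references and the
    auth-file path come from the log; the digest check is illustrative."""
    import hashlib
    import subprocess

    AUTH = "/root/.config/containers/auth.json"

    # The copied auth.json hashed to bf21a9e8..., the SHA-1 of "{}":
    # the auth file carries no credentials (anonymous pulls).
    assert (hashlib.sha1(b"{}").hexdigest()
            == "bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f")

    # Tag-based pull, as for the openstack-* images above.
    subprocess.run(["podman", "pull", "--authfile", AUTH,
        "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"],
        check=True)

    # Digest-pinned pull, as for node-exporter: the content hash, not a
    # mutable tag, decides what is fetched.
    ref = ("quay.io/prometheus/node-exporter@sha256:"
           "39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c")
    subprocess.run(["podman", "pull", "--authfile", AUTH, ref], check=True)

    # Optional check that the stored image matches the pinned digest.
    out = subprocess.run(["podman", "image", "inspect", "--format",
                          "{{.Digest}}", ref],
                         check=True, capture_output=True, text=True)
    print("stored digest:", out.stdout.strip())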
Feb 1 04:08:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34935 DF PROTO=TCP SPT=48700 DPT=9882 SEQ=7629566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF60390000000001030307) Feb 1 04:09:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16862 DF PROTO=TCP SPT=52834 DPT=9101 SEQ=3665724868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF6C1B0000000001030307) Feb 1 04:09:01 localhost sshd[126814]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:09:01 localhost systemd-logind[759]: New session 40 of user zuul. Feb 1 04:09:01 localhost systemd[1]: Started Session 40 of User zuul. Feb 1 04:09:02 localhost python3.9[126907]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:09:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16864 DF PROTO=TCP SPT=52834 DPT=9101 SEQ=3665724868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF78380000000001030307) Feb 1 04:09:04 localhost python3.9[127004]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None Feb 1 04:09:05 localhost python3.9[127097]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:09:06 localhost python3.9[127151]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:09:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16865 DF PROTO=TCP SPT=52834 DPT=9101 SEQ=3665724868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF87F80000000001030307) Feb 1 04:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47503 DF PROTO=TCP SPT=52076 DPT=9102 SEQ=1679828955 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEF8FB80000000001030307) Feb 1 04:09:11 localhost python3.9[127501]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None 
conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:09:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23242 DF PROTO=TCP SPT=50148 DPT=9100 SEQ=3359003279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFA3F80000000001030307) Feb 1 04:09:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16866 DF PROTO=TCP SPT=52834 DPT=9101 SEQ=3665724868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFA7B80000000001030307) Feb 1 04:09:16 localhost python3.9[127595]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 1 04:09:17 localhost python3.9[127689]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:09:19 localhost python3.9[127781]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None Feb 1 04:09:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23243 DF PROTO=TCP SPT=50148 DPT=9100 SEQ=3359003279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFC3B80000000001030307) Feb 1 04:09:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61753 DF PROTO=TCP SPT=59156 DPT=9102 SEQ=950269591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFC5B80000000001030307) Feb 1 04:09:23 localhost kernel: SELinux: Converting 2757 SID table entries... 
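Annotation: the community.general.sefcontext task above persists a file-context rule mapping /var/lib/edpm-config(/.*)? to container_file_t at level s0, and the "SELinux: Converting 2757 SID table entries" burst that follows is the policy store being rebuilt and reloaded. A sketch of the usual CLI equivalent, assuming the standard semanage/restorecon pairing (the module applies the s0 level itself, so it is omitted here):

    #!/usr/bin/env python3
    """CLI equivalent of the sefcontext task above: register a persistent
    file-context mapping, then relabel what already exists under the path."""
    import subprocess

    TARGET = r"/var/lib/edpm-config(/.*)?"   # regex target from the log

    # Add the mapping to the local policy store (survives relabels).
    subprocess.run(["semanage", "fcontext", "-a",
                    "-t", "container_file_t", TARGET], check=True)

    # Relabel existing files so containers can read/write them now.
    subprocess.run(["restorecon", "-Rv", "/var/lib/edpm-config"], check=True)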
Feb 1 04:09:23 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:09:23 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:09:23 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:09:23 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:09:23 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:09:23 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:09:23 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:09:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15058 DF PROTO=TCP SPT=53598 DPT=9882 SEQ=1394151374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFCD780000000001030307) Feb 1 04:09:25 localhost python3.9[127877]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:09:25 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=18 res=1 Feb 1 04:09:26 localhost python3.9[127975]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:09:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15059 DF PROTO=TCP SPT=53598 DPT=9882 SEQ=1394151374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFD5780000000001030307) Feb 1 04:09:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20897 DF PROTO=TCP SPT=48450 DPT=9101 SEQ=423996870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFE14A0000000001030307) Feb 1 04:09:30 localhost python3.9[128069]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:09:32 localhost python3.9[128314]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None Feb 1 04:09:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20899 DF PROTO=TCP SPT=48450 DPT=9101 SEQ=423996870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFED390000000001030307) Feb 1 04:09:33 localhost python3.9[128404]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:09:34 localhost python3.9[128498]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:09:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20900 DF PROTO=TCP SPT=48450 DPT=9101 SEQ=423996870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CEFFCF80000000001030307) Feb 1 04:09:38 localhost python3.9[128592]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61754 DF PROTO=TCP SPT=59156 DPT=9102 SEQ=950269591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF005B80000000001030307) Feb 1 04:09:42 localhost python3.9[128686]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 1 04:09:43 localhost systemd[1]: Reloading. Feb 1 04:09:43 localhost systemd-sysv-generator[128777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:09:43 localhost systemd-rc-local-generator[128772]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:09:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
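Annotation: note the pattern in this section's dnf entries: every package batch appears twice, first with download_only=True (the runs between 04:07:46 and 04:09:06) and then with state=present as here, so the network-bound fetch is separated from the actual install transaction, and an rpm -V pass afterwards verifies the installed files. The CLI shape of that pattern, with an abridged batch list taken from the log:

    #!/usr/bin/env python3
    """The download-then-install pattern visible in the dnf entries above:
    each package set is fetched with download-only first, then installed."""
    import subprocess

    batch = ["driverctl", "lvm2", "crudini", "jq", "nftables",
             "NetworkManager", "openstack-selinux", "grubby", "sos"]  # abridged

    # Phase 1: populate the package cache only (download_only=True runs).
    subprocess.run(["dnf", "-y", "install", "--downloadonly", *batch],
                   check=True)

    # Phase 2: the state=present run; with the cache warm this is mostly
    # an offline rpm transaction.
    subprocess.run(["dnf", "-y", "install", *batch], check=True)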
Feb 1 04:09:44 localhost python3.9[128879]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:09:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58701 DF PROTO=TCP SPT=46376 DPT=9100 SEQ=170807514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF018F80000000001030307) Feb 1 04:09:45 localhost python3.9[128971]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20901 DF PROTO=TCP SPT=48450 DPT=9101 SEQ=423996870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF01DB80000000001030307) Feb 1 04:09:45 localhost python3.9[129065]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:46 localhost python3.9[129157]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:47 localhost python3.9[129264]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:09:48 localhost python3.9[129337]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936986.9402773-558-133285986951801/.source _original_basename=.tkvfwq66 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:48 localhost python3.9[129429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:49 localhost python3.9[129521]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={} Feb 1 04:09:50 localhost python3.9[129613]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:51 localhost python3.9[129705]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:09:51 localhost python3.9[129778]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936990.8563442-684-32995670341914/.source.yaml _original_basename=.5yrlking follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:09:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58702 DF PROTO=TCP SPT=46376 DPT=9100 SEQ=170807514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF039B80000000001030307) Feb 1 04:09:52 localhost python3.9[129870]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml Feb 1 04:09:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35637 DF PROTO=TCP SPT=56310 DPT=9102 SEQ=569192604 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF03BB80000000001030307) Feb 1 04:09:54 localhost ansible-async_wrapper.py[129975]: Invoked with j370543724420 300 /home/zuul/.ansible/tmp/ansible-tmp-1769936993.2495334-756-234084283831303/AnsiballZ_edpm_os_net_config.py _ Feb 1 04:09:54 localhost ansible-async_wrapper.py[129978]: Starting module and watcher Feb 1 04:09:54 localhost ansible-async_wrapper.py[129978]: Start watching 129979 (300) Feb 1 04:09:54 localhost ansible-async_wrapper.py[129979]: Start module (129979) Feb 1 04:09:54 localhost ansible-async_wrapper.py[129975]: Return async_wrapper task started. 
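Annotation: the ansible-async_wrapper entries above show the edpm_os_net_config module running as an Ansible async job: the wrapper forks the module, a watcher enforces the 300-second timeout, and the controller polls via async_status (mode=status, then mode=cleanup, both just below). Inside, the module drives os-net-config; the returncode file written afterwards hashes to the SHA-1 of the one-byte string "0", i.e. the recorded return code was 0. A sketch of the underlying call; the flag spellings follow os-net-config's CLI and should be read as assumptions, since the log records module parameters, not the command line:

    #!/usr/bin/env python3
    """Sketch of the os-net-config invocation driven by the async job above."""
    import subprocess

    cmd = ["os-net-config",
           "-c", "/etc/os-net-config/config.yaml",  # config_file= in the log
           "--debug",                               # debug=True
           "--detailed-exit-codes"]                 # detailed_exit_codes=True

    rc = subprocess.run(cmd).returncode
    # With detailed exit codes, 0 conventionally means "success, nothing to
    # change" and 2 "success, changes applied"; anything else is an error.
    # The returncode file written below hashes to SHA-1("0"), i.e. rc == 0.
    print("os-net-config rc:", rc)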
Feb 1 04:09:54 localhost python3.9[129980]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False Feb 1 04:09:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34114 DF PROTO=TCP SPT=36646 DPT=9882 SEQ=3749027993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF042B80000000001030307) Feb 1 04:09:55 localhost ansible-async_wrapper.py[129979]: Module complete (129979) Feb 1 04:09:56 localhost sshd[129994]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:09:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34115 DF PROTO=TCP SPT=36646 DPT=9882 SEQ=3749027993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF04AB80000000001030307) Feb 1 04:09:57 localhost python3.9[130086]: ansible-ansible.legacy.async_status Invoked with jid=j370543724420.129975 mode=status _async_dir=/root/.ansible_async Feb 1 04:09:58 localhost python3.9[130145]: ansible-ansible.legacy.async_status Invoked with jid=j370543724420.129975 mode=cleanup _async_dir=/root/.ansible_async Feb 1 04:09:59 localhost python3.9[130237]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:09:59 localhost ansible-async_wrapper.py[129978]: Done in kid B. Feb 1 04:09:59 localhost python3.9[130310]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936998.6060412-822-222494223559746/.source.returncode _original_basename=.7yaoj2qs follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63258 DF PROTO=TCP SPT=41722 DPT=9101 SEQ=650714647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0567A0000000001030307) Feb 1 04:10:00 localhost python3.9[130402]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:10:00 localhost python3.9[130475]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936999.8646004-870-172460139542988/.source.cfg _original_basename=.lwteuf6d follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:01 localhost python3.9[130567]: ansible-ansible.builtin.systemd Invoked with 
name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:10:01 localhost systemd[1]: Reloading Network Manager... Feb 1 04:10:01 localhost NetworkManager[5964]: [1769937001.8088] audit: op="reload" arg="0" pid=130571 uid=0 result="success" Feb 1 04:10:01 localhost NetworkManager[5964]: [1769937001.8096] config: signal: SIGHUP (no changes from disk) Feb 1 04:10:01 localhost systemd[1]: Reloaded Network Manager. Feb 1 04:10:02 localhost systemd-logind[759]: Session 40 logged out. Waiting for processes to exit. Feb 1 04:10:02 localhost systemd[1]: session-40.scope: Deactivated successfully. Feb 1 04:10:02 localhost systemd[1]: session-40.scope: Consumed 36.296s CPU time. Feb 1 04:10:02 localhost systemd-logind[759]: Removed session 40. Feb 1 04:10:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63260 DF PROTO=TCP SPT=41722 DPT=9101 SEQ=650714647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF062780000000001030307) Feb 1 04:10:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63261 DF PROTO=TCP SPT=41722 DPT=9101 SEQ=650714647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF072380000000001030307) Feb 1 04:10:07 localhost sshd[130586]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:10:07 localhost systemd-logind[759]: New session 41 of user zuul. Feb 1 04:10:07 localhost systemd[1]: Started Session 41 of User zuul. Feb 1 04:10:08 localhost python3.9[130679]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:10:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58703 DF PROTO=TCP SPT=46376 DPT=9100 SEQ=170807514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF079B80000000001030307) Feb 1 04:10:09 localhost python3.9[130773]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:10:11 localhost python3.9[130926]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:10:11 localhost systemd[1]: session-41.scope: Deactivated successfully. Feb 1 04:10:11 localhost systemd[1]: session-41.scope: Consumed 2.089s CPU time. Feb 1 04:10:11 localhost systemd-logind[759]: Session 41 logged out. Waiting for processes to exit. Feb 1 04:10:11 localhost systemd-logind[759]: Removed session 41. 
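Annotation: the reload above follows the three ini_file edits at 04:09:45-04:09:46, which left /etc/NetworkManager/NetworkManager.conf with no-auto-default=*, dns=none and rc-manager=unmanaged in [main]: NetworkManager stops auto-creating DHCP profiles for newly seen devices and leaves /etc/resolv.conf alone, so os-net-config and the legacy ifcfg scripts own the interfaces. A minimal sketch of the net result (values from the log; the output path here is an example, not the live file):

    #!/usr/bin/env python3
    """Net effect of the three ini_file tasks on NetworkManager.conf,
    written INI-style with no spaces around '=' (no_extra_spaces=True)."""
    import configparser

    conf = configparser.ConfigParser()
    conf["main"] = {
        "no-auto-default": "*",    # never auto-create profiles for new NICs
        "dns": "none",             # NM does not touch DNS configuration
        "rc-manager": "unmanaged", # leave /etc/resolv.conf entirely alone
    }

    with open("NetworkManager.conf.example", "w") as f:  # example path only
        conf.write(f, space_around_delimiters=False)

    # NetworkManager re-reads its configuration on SIGHUP / systemctl reload,
    # which is the "Reloading Network Manager..." sequence in the log.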
Feb 1 04:10:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5380 DF PROTO=TCP SPT=50580 DPT=9100 SEQ=3186269180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF08E380000000001030307) Feb 1 04:10:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63262 DF PROTO=TCP SPT=41722 DPT=9101 SEQ=650714647 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF091B80000000001030307) Feb 1 04:10:17 localhost sshd[130942]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:10:17 localhost systemd-logind[759]: New session 42 of user zuul. Feb 1 04:10:17 localhost systemd[1]: Started Session 42 of User zuul. Feb 1 04:10:18 localhost python3.9[131035]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:10:19 localhost python3.9[131129]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:10:20 localhost python3.9[131225]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:10:21 localhost python3.9[131279]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:10:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5381 DF PROTO=TCP SPT=50580 DPT=9100 SEQ=3186269180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0ADB90000000001030307) Feb 1 04:10:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61824 DF PROTO=TCP SPT=35044 DPT=9102 SEQ=64779434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0AFB80000000001030307) Feb 1 04:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62307 DF PROTO=TCP SPT=58628 DPT=9882 SEQ=4085182640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0B7F90000000001030307) Feb 1 04:10:25 localhost python3.9[131373]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:10:26 localhost python3.9[131528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62308 DF PROTO=TCP SPT=58628 DPT=9882 SEQ=4085182640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0BFF80000000001030307) Feb 1 04:10:27 localhost python3.9[131620]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:10:28 localhost python3.9[131724]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:10:28 localhost python3.9[131772]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:29 localhost python3.9[131864]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:10:29 localhost python3.9[131912]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:10:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20755 DF PROTO=TCP SPT=35090 DPT=9101 SEQ=1539754508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0CBA90000000001030307) Feb 1 04:10:30 localhost python3.9[132004]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:10:31 localhost python3.9[132096]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False 
ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:10:31 localhost python3.9[132188]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:10:32 localhost python3.9[132280]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 1 04:10:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46799 DF PROTO=TCP SPT=42412 DPT=9105 SEQ=1286112133 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0D7B80000000001030307) Feb 1 04:10:33 localhost python3.9[132372]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:10:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20758 DF PROTO=TCP SPT=35090 DPT=9101 SEQ=1539754508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0E7780000000001030307) Feb 1 04:10:37 localhost python3.9[132466]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:10:38 localhost python3.9[132560]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61825 DF PROTO=TCP SPT=35044 DPT=9102 SEQ=64779434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF0EFB80000000001030307) Feb 1 04:10:39 localhost python3.9[132652]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:10:40 localhost python3.9[132744]: ansible-ansible.legacy.command Invoked with 
_raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:10:41 localhost python3.9[132837]: ansible-service_facts Invoked Feb 1 04:10:41 localhost network[132854]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:10:41 localhost network[132855]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:10:41 localhost network[132856]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:10:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:10:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64916 DF PROTO=TCP SPT=36240 DPT=9100 SEQ=58349046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF103780000000001030307) Feb 1 04:10:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20759 DF PROTO=TCP SPT=35090 DPT=9101 SEQ=1539754508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF107B80000000001030307) Feb 1 04:10:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:10:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:10:47 localhost python3.9[133237]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:10:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:10:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per 
sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:10:52 localhost python3.9[133396]: ansible-package_facts Invoked with manager=['auto'] strategy=first Feb 1 04:10:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64917 DF PROTO=TCP SPT=36240 DPT=9100 SEQ=58349046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF123B80000000001030307) Feb 1 04:10:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2267 DF PROTO=TCP SPT=57732 DPT=9102 SEQ=322635350 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF125B90000000001030307) Feb 1 04:10:54 localhost python3.9[133488]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:10:54 localhost python3.9[133563]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937053.6270294-651-266386345541443/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51371 DF PROTO=TCP SPT=41788 DPT=9882 SEQ=2413805837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF12CF90000000001030307) Feb 1 04:10:55 localhost python3.9[133657]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:10:56 localhost python3.9[133732]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937055.2826388-696-84094015174581/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51372 DF PROTO=TCP SPT=41788 DPT=9882 SEQ=2413805837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF134F80000000001030307) Feb 1 04:10:58 localhost python3.9[133826]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:10:59 localhost 
python3.9[133920]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:11:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46232 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=3087516610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF140DA0000000001030307) Feb 1 04:11:01 localhost python3.9[133974]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:11:02 localhost python3.9[134068]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:11:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46234 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=3087516610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF14CF80000000001030307) Feb 1 04:11:03 localhost python3.9[134122]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:11:03 localhost chronyd[25994]: chronyd exiting Feb 1 04:11:03 localhost systemd[1]: Stopping NTP client/server... Feb 1 04:11:03 localhost systemd[1]: chronyd.service: Deactivated successfully. Feb 1 04:11:03 localhost systemd[1]: Stopped NTP client/server. Feb 1 04:11:03 localhost systemd[1]: Starting NTP client/server... Feb 1 04:11:03 localhost chronyd[134130]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 1 04:11:03 localhost chronyd[134130]: Frequency -30.378 +/- 0.143 ppm read from /var/lib/chrony/drift Feb 1 04:11:03 localhost chronyd[134130]: Loaded seccomp filter (level 2) Feb 1 04:11:03 localhost systemd[1]: Started NTP client/server. Feb 1 04:11:04 localhost systemd[1]: session-42.scope: Deactivated successfully. Feb 1 04:11:04 localhost systemd[1]: session-42.scope: Consumed 28.337s CPU time. Feb 1 04:11:04 localhost systemd-logind[759]: Session 42 logged out. Waiting for processes to exit. Feb 1 04:11:04 localhost systemd-logind[759]: Removed session 42. Feb 1 04:11:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46235 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=3087516610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF15CB80000000001030307) Feb 1 04:11:07 localhost sshd[134146]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:11:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64918 DF PROTO=TCP SPT=36240 DPT=9100 SEQ=58349046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF163B80000000001030307) Feb 1 04:11:09 localhost sshd[134148]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:11:09 localhost systemd-logind[759]: New session 43 of user zuul. Feb 1 04:11:09 localhost systemd[1]: Started Session 43 of User zuul. 
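Annotation: at 04:11:03 chronyd is restarted after /etc/chrony.conf and /etc/sysconfig/chronyd are rewritten and PEERNTP=no is pinned in /etc/sysconfig/network. A hedged verification sketch for this step, using the standard chronyc client shipped with the chrony package seen starting above; these commands are not in the log:

    import subprocess

    def chrony_status() -> None:
        # After the restart above, confirm the unit is active and ask
        # chronyc for synchronization state.
        for cmd in (["systemctl", "is-active", "chronyd"],
                    ["chronyc", "tracking"],
                    ["chronyc", "sources"]):
            result = subprocess.run(cmd, capture_output=True, text=True)
            print("$", " ".join(cmd))
            print(result.stdout.strip() or result.stderr.strip())

    if __name__ == "__main__":
        chrony_status()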
Feb 1 04:11:11 localhost python3.9[134241]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:11:12 localhost auditd[725]: Audit daemon rotating log files Feb 1 04:11:12 localhost python3.9[134337]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:13 localhost python3.9[134442]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:13 localhost python3.9[134490]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.8cnw1srb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60424 DF PROTO=TCP SPT=41868 DPT=9100 SEQ=2641528776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF178B90000000001030307) Feb 1 04:11:15 localhost python3.9[134582]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46236 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=3087516610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF17DB80000000001030307) Feb 1 04:11:15 localhost python3.9[134657]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937074.5557373-138-30136060408748/.source _original_basename=.pg0bj6nm follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:16 localhost python3.9[134749]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:11:17 localhost python3.9[134841]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:17 localhost python3.9[134914]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937076.6562629-210-1491211118634/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 1 04:11:18 localhost python3.9[135006]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:18 localhost python3.9[135079]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937077.8101635-210-275370795799894/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 1 04:11:19 localhost python3.9[135171]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:20 localhost python3.9[135263]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:20 localhost python3.9[135336]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937079.9601839-321-33262707259834/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:21 localhost python3.9[135428]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:22 localhost python3.9[135501]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937081.313655-366-137400928378380/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60425 DF PROTO=TCP SPT=41868 DPT=9100 SEQ=2641528776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF199B80000000001030307) Feb 1 04:11:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59208 DF PROTO=TCP SPT=59686 DPT=9102 SEQ=624667641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF199B90000000001030307) Feb 1 04:11:23 localhost python3.9[135593]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:11:23 localhost systemd[1]: Reloading. Feb 1 04:11:23 localhost systemd-rc-local-generator[135620]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:11:23 localhost systemd-sysv-generator[135624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:11:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:11:23 localhost systemd[1]: Reloading. Feb 1 04:11:23 localhost systemd-rc-local-generator[135656]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:11:23 localhost systemd-sysv-generator[135660]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:11:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:11:24 localhost systemd[1]: Starting EDPM Container Shutdown... Feb 1 04:11:24 localhost systemd[1]: Finished EDPM Container Shutdown. 
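Annotation: the edpm-container-shutdown sequence above installs a unit file plus a preset under /etc/systemd/system-preset/, then enables and starts the unit (the two "Reloading." passes are the daemon_reload followed by the enable). A rough Python reconstruction of that sequence; the preset body is an assumption, since the log records only the file's checksum, but "enable <unit>" is the standard systemd.preset directive:

    import subprocess
    from pathlib import Path

    PRESET = Path("/etc/systemd/system-preset/91-edpm-container-shutdown.preset")

    def install_preset_and_start() -> None:
        # Assumed preset content: opt the unit into "enable" presets.
        PRESET.write_text("enable edpm-container-shutdown.service\n")
        subprocess.run(["systemctl", "daemon-reload"], check=True)
        subprocess.run(["systemctl", "enable", "--now",
                        "edpm-container-shutdown.service"], check=True)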
Feb 1 04:11:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24375 DF PROTO=TCP SPT=58360 DPT=9882 SEQ=1109423541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1A2380000000001030307) Feb 1 04:11:26 localhost python3.9[135763]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:26 localhost python3.9[135836]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937085.4963076-435-212584477537467/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24376 DF PROTO=TCP SPT=58360 DPT=9882 SEQ=1109423541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1AA380000000001030307) Feb 1 04:11:27 localhost python3.9[135928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:27 localhost python3.9[136001]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937086.855538-480-90190561173423/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:28 localhost python3.9[136093]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:11:28 localhost systemd[1]: Reloading. Feb 1 04:11:28 localhost systemd-rc-local-generator[136118]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:11:28 localhost systemd-sysv-generator[136124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:11:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:11:29 localhost systemd[1]: Starting Create netns directory... Feb 1 04:11:29 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 1 04:11:29 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. 
Feb 1 04:11:29 localhost systemd[1]: Finished Create netns directory. Feb 1 04:11:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58389 DF PROTO=TCP SPT=41830 DPT=9101 SEQ=286495528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1B60A0000000001030307) Feb 1 04:11:30 localhost python3.9[136225]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:11:30 localhost network[136242]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:11:30 localhost network[136243]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:11:30 localhost network[136244]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:11:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:11:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7686 DF PROTO=TCP SPT=46354 DPT=9105 SEQ=1248725545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1C1B90000000001030307) Feb 1 04:11:36 localhost python3.9[136445]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:36 localhost python3.9[136520]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937095.6745431-603-105016692793360/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58392 DF PROTO=TCP SPT=41830 DPT=9101 SEQ=286495528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1D1B90000000001030307) Feb 1 04:11:37 localhost python3.9[136613]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:11:37 localhost systemd[1]: Reloading OpenSSH server daemon... Feb 1 04:11:37 localhost systemd[1]: Reloaded OpenSSH server daemon. 
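Annotation: the sshd_config copy above passes validate=/usr/sbin/sshd -T -f %s, so the candidate file is parse-checked before it replaces the live config, and only then is sshd reloaded. A self-contained sketch of that validate-then-swap pattern (function name and error handling are ours, not Ansible's):

    import os
    import subprocess
    import tempfile

    def replace_sshd_config(new_text: str,
                            dest: str = "/etc/ssh/sshd_config") -> None:
        # Write the candidate next to the destination so os.replace()
        # stays on one filesystem and the swap is atomic.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest))
        try:
            with os.fdopen(fd, "w") as handle:
                handle.write(new_text)
            # Parse-check the candidate, mirroring validate=/usr/sbin/sshd -T -f %s
            subprocess.run(["/usr/sbin/sshd", "-T", "-f", tmp],
                           check=True, capture_output=True)
            os.replace(tmp, dest)  # only a validated file lands
        finally:
            if os.path.exists(tmp):
                os.unlink(tmp)  # clean up if validation failed
        subprocess.run(["systemctl", "reload", "sshd"], check=True)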
Feb 1 04:11:37 localhost sshd[118161]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:11:38 localhost python3.9[136709]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59209 DF PROTO=TCP SPT=59686 DPT=9102 SEQ=624667641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1D9B80000000001030307) Feb 1 04:11:39 localhost python3.9[136801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:39 localhost python3.9[136874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937098.767614-696-51673997834903/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:40 localhost python3.9[136966]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Feb 1 04:11:41 localhost systemd[1]: Starting Time & Date Service... Feb 1 04:11:41 localhost systemd[1]: Started Time & Date Service. 
Feb 1 04:11:41 localhost python3.9[137062]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:42 localhost python3.9[137154]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:43 localhost python3.9[137227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937102.1517456-801-168492059946644/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:44 localhost python3.9[137319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60027 DF PROTO=TCP SPT=60514 DPT=9100 SEQ=2392602543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1EDB80000000001030307) Feb 1 04:11:44 localhost python3.9[137392]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937103.5606797-846-29552462748011/.source.yaml _original_basename=.npa4lih6 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:45 localhost python3.9[137484]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58393 DF PROTO=TCP SPT=41830 DPT=9101 SEQ=286495528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF1F1B80000000001030307) Feb 1 04:11:45 localhost python3.9[137559]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937104.8033028-891-197038240117248/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None 
seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:46 localhost python3.9[137651]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:11:47 localhost python3.9[137744]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:11:48 localhost python3[137837]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 1 04:11:48 localhost python3.9[137929]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:49 localhost python3.9[138002]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937108.4401076-1008-197887671576862/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:50 localhost python3.9[138094]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:50 localhost python3.9[138167]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937109.651158-1053-156908992856775/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:51 localhost python3.9[138259]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:51 localhost python3.9[138362]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937110.9776566-1098-235462751933346/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60028 DF PROTO=TCP SPT=60514 DPT=9100 SEQ=2392602543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF20DB80000000001030307) Feb 1 04:11:52 localhost 
python3.9[138487]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44764 DF PROTO=TCP SPT=53172 DPT=9102 SEQ=3262378388 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF20FB80000000001030307) Feb 1 04:11:53 localhost python3.9[138616]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937112.2692332-1143-183871329586816/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:53 localhost podman[138632]: Feb 1 04:11:53 localhost podman[138632]: 2026-02-01 09:11:53.38232784 +0000 UTC m=+0.080353781 container create 3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_darwin, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, name=rhceph, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux ) Feb 1 04:11:53 localhost systemd[1]: Started libpod-conmon-3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6.scope. Feb 1 04:11:53 localhost systemd[1]: Started libcrun container. 
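Annotation: between the file copies above, the play loads the ruleset with "nft -f /etc/nftables/iptables.nft" and snapshots it with "nft -j list ruleset". The -j flag emits JSON with a top-level "nftables" array of single-key objects ("table", "chain", "rule", ...); a short sketch for summarizing the chains from that snapshot (assumed consumer code, not from the log):

    import json
    import subprocess

    def summarize_ruleset() -> None:
        # Re-run the same listing command recorded above and walk the
        # JSON array, printing one line per chain.
        out = subprocess.run(["nft", "-j", "list", "ruleset"],
                             capture_output=True, text=True, check=True)
        for item in json.loads(out.stdout)["nftables"]:
            chain = item.get("chain")
            if chain:
                print(chain.get("table"), chain.get("name"),
                      "hook:", chain.get("hook", "-"))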
Feb 1 04:11:53 localhost podman[138632]: 2026-02-01 09:11:53.349493701 +0000 UTC m=+0.047519682 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:11:53 localhost podman[138632]: 2026-02-01 09:11:53.46494959 +0000 UTC m=+0.162975531 container init 3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_darwin, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=1764794109, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, distribution-scope=public) Feb 1 04:11:53 localhost podman[138632]: 2026-02-01 09:11:53.476588448 +0000 UTC m=+0.174622779 container start 3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_darwin, vcs-type=git, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:11:53 localhost podman[138632]: 2026-02-01 09:11:53.476929498 +0000 UTC m=+0.174955499 container attach 3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_darwin, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
io.openshift.tags=rhceph ceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1764794109, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True) Feb 1 04:11:53 localhost systemd[1]: libpod-3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6.scope: Deactivated successfully. Feb 1 04:11:53 localhost affectionate_darwin[138661]: 167 167 Feb 1 04:11:53 localhost podman[138632]: 2026-02-01 09:11:53.484632305 +0000 UTC m=+0.182658246 container died 3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_darwin, version=7, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:11:53 localhost podman[138666]: 2026-02-01 09:11:53.578550841 +0000 UTC m=+0.080197795 container remove 3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_darwin, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, GIT_BRANCH=main, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, distribution-scope=public, architecture=x86_64) Feb 1 04:11:53 localhost systemd[1]: libpod-conmon-3e52bbf5620bc80ed08c8d9ab5de296027330dbee2046735329bee257af411c6.scope: Deactivated successfully. Feb 1 04:11:53 localhost podman[138685]: Feb 1 04:11:53 localhost podman[138685]: 2026-02-01 09:11:53.757504431 +0000 UTC m=+0.066281488 container create adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_kirch, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.buildah.version=1.41.4, version=7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph) Feb 1 04:11:53 localhost systemd[1]: Started libpod-conmon-adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e.scope. Feb 1 04:11:53 localhost systemd[1]: Started libcrun container. 
Feb 1 04:11:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14c4b124d77d3e805431229b536f54ca94d73e8139aa24f4de2e8a69fb06ae47/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 1 04:11:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14c4b124d77d3e805431229b536f54ca94d73e8139aa24f4de2e8a69fb06ae47/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 04:11:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14c4b124d77d3e805431229b536f54ca94d73e8139aa24f4de2e8a69fb06ae47/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 04:11:53 localhost podman[138685]: 2026-02-01 09:11:53.721798663 +0000 UTC m=+0.030575760 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:11:53 localhost podman[138685]: 2026-02-01 09:11:53.82190749 +0000 UTC m=+0.130684517 container init adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_kirch, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4) Feb 1 04:11:53 localhost podman[138685]: 2026-02-01 09:11:53.831140984 +0000 UTC m=+0.139918031 container start adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_kirch, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, 
io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7) Feb 1 04:11:53 localhost podman[138685]: 2026-02-01 09:11:53.831420932 +0000 UTC m=+0.140197959 container attach adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_kirch, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1764794109, distribution-scope=public, architecture=x86_64, RELEASE=main, vcs-type=git) Feb 1 04:11:54 localhost python3.9[138788]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:11:54 localhost systemd[1]: var-lib-containers-storage-overlay-226f9fe4330a90aada9ab313f87b879c72623a93653df557dd955943be1be78e-merged.mount: Deactivated successfully. 
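
The three xfs remount warnings at 04:11:53 flag that the overlay filesystem stores 32-bit timestamps, usable only until 2038 (0x7fffffff), i.e. it was created without XFS big timestamps. A quick check, assuming a recent xfsprogs whose xfs_info output includes a bigtime flag (verify on your version):

    # Check whether an XFS filesystem was made with big timestamps
    # (bigtime=1 extends the on-disk timestamp range past 2038).
    # Requires xfsprogs; the path must be a mounted XFS filesystem.
    import subprocess

    def has_bigtime(mountpoint: str) -> bool:
        out = subprocess.run(["xfs_info", mountpoint],
                             capture_output=True, text=True, check=True).stdout
        return "bigtime=1" in out

    print(has_bigtime("/var/lib/containers"))
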
Feb 1 04:11:54 localhost distracted_kirch[138733]: [ Feb 1 04:11:54 localhost distracted_kirch[138733]: { Feb 1 04:11:54 localhost distracted_kirch[138733]: "available": false, Feb 1 04:11:54 localhost distracted_kirch[138733]: "ceph_device": false, Feb 1 04:11:54 localhost distracted_kirch[138733]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 1 04:11:54 localhost distracted_kirch[138733]: "lsm_data": {}, Feb 1 04:11:54 localhost distracted_kirch[138733]: "lvs": [], Feb 1 04:11:54 localhost distracted_kirch[138733]: "path": "/dev/sr0", Feb 1 04:11:54 localhost distracted_kirch[138733]: "rejected_reasons": [ Feb 1 04:11:54 localhost distracted_kirch[138733]: "Has a FileSystem", Feb 1 04:11:54 localhost distracted_kirch[138733]: "Insufficient space (<5GB)" Feb 1 04:11:54 localhost distracted_kirch[138733]: ], Feb 1 04:11:54 localhost distracted_kirch[138733]: "sys_api": { Feb 1 04:11:54 localhost distracted_kirch[138733]: "actuators": null, Feb 1 04:11:54 localhost distracted_kirch[138733]: "device_nodes": "sr0", Feb 1 04:11:54 localhost distracted_kirch[138733]: "human_readable_size": "482.00 KB", Feb 1 04:11:54 localhost distracted_kirch[138733]: "id_bus": "ata", Feb 1 04:11:54 localhost distracted_kirch[138733]: "model": "QEMU DVD-ROM", Feb 1 04:11:54 localhost distracted_kirch[138733]: "nr_requests": "2", Feb 1 04:11:54 localhost distracted_kirch[138733]: "partitions": {}, Feb 1 04:11:54 localhost distracted_kirch[138733]: "path": "/dev/sr0", Feb 1 04:11:54 localhost distracted_kirch[138733]: "removable": "1", Feb 1 04:11:54 localhost distracted_kirch[138733]: "rev": "2.5+", Feb 1 04:11:54 localhost distracted_kirch[138733]: "ro": "0", Feb 1 04:11:54 localhost distracted_kirch[138733]: "rotational": "1", Feb 1 04:11:54 localhost distracted_kirch[138733]: "sas_address": "", Feb 1 04:11:54 localhost distracted_kirch[138733]: "sas_device_handle": "", Feb 1 04:11:54 localhost distracted_kirch[138733]: "scheduler_mode": "mq-deadline", Feb 1 04:11:54 localhost distracted_kirch[138733]: "sectors": 0, Feb 1 04:11:54 localhost distracted_kirch[138733]: "sectorsize": "2048", Feb 1 04:11:54 localhost distracted_kirch[138733]: "size": 493568.0, Feb 1 04:11:54 localhost distracted_kirch[138733]: "support_discard": "0", Feb 1 04:11:54 localhost distracted_kirch[138733]: "type": "disk", Feb 1 04:11:54 localhost distracted_kirch[138733]: "vendor": "QEMU" Feb 1 04:11:54 localhost distracted_kirch[138733]: } Feb 1 04:11:54 localhost distracted_kirch[138733]: } Feb 1 04:11:54 localhost distracted_kirch[138733]: ] Feb 1 04:11:54 localhost systemd[1]: libpod-adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e.scope: Deactivated successfully. 
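
The JSON array the distracted_kirch container prints is a ceph-volume inventory report (the one-shot rhceph container runs ceph-volume and exits): /dev/sr0, the QEMU DVD-ROM, comes back available=false with two rejected_reasons, "Has a FileSystem" and "Insufficient space (<5GB)", so no OSD can be placed on it. A sketch of the selection step a deployer applies to such a report, with field names taken from the log itself and the inline report trimmed to the relevant keys:

    import json

    def usable_devices(inventory_json: str) -> list[str]:
        """Return paths of devices ceph-volume reports as available."""
        return [dev["path"] for dev in json.loads(inventory_json)
                if dev.get("available")]

    report = '''[{"available": false, "ceph_device": false,
                  "path": "/dev/sr0",
                  "rejected_reasons": ["Has a FileSystem",
                                       "Insufficient space (<5GB)"]}]'''
    print(usable_devices(report))   # -> [] : nothing here to turn into an OSD
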
Feb 1 04:11:54 localhost podman[140181]: 2026-02-01 09:11:54.837917207 +0000 UTC m=+0.065003429 container died adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_kirch, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109) Feb 1 04:11:54 localhost systemd[1]: tmp-crun.MOnbmM.mount: Deactivated successfully. Feb 1 04:11:54 localhost systemd[1]: var-lib-containers-storage-overlay-14c4b124d77d3e805431229b536f54ca94d73e8139aa24f4de2e8a69fb06ae47-merged.mount: Deactivated successfully. Feb 1 04:11:54 localhost podman[140181]: 2026-02-01 09:11:54.878912946 +0000 UTC m=+0.105999148 container remove adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_kirch, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4) Feb 1 04:11:54 localhost python3.9[139987]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937113.744282-1188-182347789226811/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 1 04:11:54 localhost systemd[1]: libpod-conmon-adab5fb3b82ec5802d85797abe502f91deeaa69f49143e6794ebb904720e899e.scope: Deactivated successfully. Feb 1 04:11:55 localhost python3.9[140286]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24379 DF PROTO=TCP SPT=58360 DPT=9882 SEQ=1109423541 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF219B80000000001030307) Feb 1 04:11:56 localhost python3.9[140393]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:11:57 localhost python3.9[140488]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51376 DF PROTO=TCP SPT=41788 DPT=9882 SEQ=2413805837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF223B80000000001030307) Feb 1 04:11:58 localhost python3.9[140581]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:58 localhost python3.9[140673]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:11:59 localhost python3.9[140765]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Feb 1 
04:12:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30466 DF PROTO=TCP SPT=50086 DPT=9101 SEQ=1209751873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF22B3A0000000001030307) Feb 1 04:12:00 localhost python3.9[140858]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Feb 1 04:12:00 localhost systemd[1]: session-43.scope: Deactivated successfully. Feb 1 04:12:00 localhost systemd[1]: session-43.scope: Consumed 28.864s CPU time. Feb 1 04:12:00 localhost systemd-logind[759]: Session 43 logged out. Waiting for processes to exit. Feb 1 04:12:00 localhost systemd-logind[759]: Removed session 43. Feb 1 04:12:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46238 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=3087516610 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF23BB90000000001030307) Feb 1 04:12:06 localhost sshd[140875]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:12:06 localhost systemd-logind[759]: New session 44 of user zuul. Feb 1 04:12:06 localhost systemd[1]: Started Session 44 of User zuul. Feb 1 04:12:07 localhost python3.9[140970]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None Feb 1 04:12:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42312 DF PROTO=TCP SPT=50892 DPT=9100 SEQ=423500956 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF247370000000001030307) Feb 1 04:12:08 localhost python3.9[141062]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:12:10 localhost python3.9[141156]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Feb 1 04:12:11 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
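
The two ansible.posix.mount tasks above mount hugetlbfs twice, once per page size (pagesize=1G on /dev/hugepages1G, pagesize=2M on /dev/hugepages2M), the usual layout when guests need both sizes. A sketch that confirms both mounts from /proc/mounts; note the kernel typically normalizes the 1G size to pagesize=1024M in the options string, an assumption worth verifying on your kernel:

    # List hugetlbfs mounts and their options (which carry pagesize=...).
    def hugetlbfs_mounts() -> dict:
        mounts = {}
        with open("/proc/mounts") as f:
            for line in f:
                src, target, fstype, opts = line.split()[:4]
                if fstype == "hugetlbfs":
                    mounts[target] = opts
        return mounts

    for target, opts in hugetlbfs_mounts().items():
        print(target, opts)
    # Expected here: /dev/hugepages1G ...pagesize=1024M
    #                /dev/hugepages2M ...pagesize=2M
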
Feb 1 04:12:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60427 DF PROTO=TCP SPT=41868 DPT=9100 SEQ=2641528776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF257B80000000001030307) Feb 1 04:12:11 localhost python3.9[141250]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.qich70c6 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:12:12 localhost python3.9[141325]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.qich70c6 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937131.1970017-189-85148328673375/.source.qich70c6 _original_basename=.j6jsphiz follow=False checksum=b6259656501c187ae53f530254d9fd01725b4ecf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:14 localhost python3.9[141417]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:12:16 localhost python3.9[141509]: ansible-ansible.builtin.blockinfile Invoked with block=np0005604212.localdomain,192.168.122.106,np0005604212* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCx/MKX//74FswFkw1c1lfM5mahSRoD4B8bhCZSm2/IQ//syuq+Qpi1sEoMv/N1mOrU8atXNtYkVNozl/ypDe2YJkUS8OTt37bT9A7XnBlfFSc5OwXS7VGHpVWbiMbImJibSV7HjoQP0yA8SvCJCcrI3Eh14+cna8tT1rJ9lOFRHvxLfG52XnzFiNUVDU+TG3uRtWEjY5epI8j/U73tEqdP4OAk7ZQ9riN1nllCCIs9FOErOEw14VW+151TbOCzcm9kvzeQMit9jPXTGqmTPKoidZFLhJwEAXq4M9+DFfKQWkVSqfcU3cvPz6S03lUcpPWiJxgGZiIPXxCdRjvI3bKCm898lFYwZq8EfdAwUFMyhmz4GHSyhMwqZWE46cikXf/skoSrEF8ji3NjmyQL7T304iKenZca6rHDI56veO0+PTzZj/pBiaWBWXlqF0WQLAn804z3yapsLNuR8R4EaREmk1Tc2ESg1//73pCUypwEMQWESHsAJ/LCHhyqNHY6Bjc=#012np0005604212.localdomain,192.168.122.106,np0005604212* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE8ydwus/1P6AnrixkRz4PJNoZXio9ATjx1wpGE9aUxy#012np0005604212.localdomain,192.168.122.106,np0005604212* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHuZj1kjh43u8MkLoV7TID8opUYqcB9nbV+TEcV1Khgm9NhSBcQeUlB5GJecVMFtUp1FQn9l3Oxy0aNJL0spiWE=#012np0005604210.localdomain,192.168.122.104,np0005604210* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDeVlqpmEgZX6yoZkE7SzVbEM6MqJe/9qDZPPgFZPb/N85k+uB3cINsoq0pMJYeKjcKY8H56WyuNkVVwVHaouZnJCN4p1rCJmATIDieU8QMDwGucQpbrNRrQWheWQDkmHNIPOxnUDCRgEzDfYiaE4prLHMPKtf8XJAKUKVd6lpZrVSCovGz0UC3U1Le/0N1PJOi4kYEuipVrcfoYHC63A32I+w+7tybU8Rpknhc/UHhdn39PBGuAhbkSf2JEJbLLzLaPkZXT6HOPiBUT9jWKnymCGEcfPjIWOkeelx3fkPoXZCtnYHlSoQSkCVsUmXgHNj7X3+6sJi9+iV/+8jRWQyk6aCC+HjXDhSwxbBUaM9AOimJ9EK7vo8/IK9pQ3gNsEct6rHuvGytACNMWpaT5sRRaVEnS8uz/PL8urB6+59GYGunjAaw8lCQcxw+VNVJaLtj+BpVJZA2EA6XE4fwq7v0s9u0ApIMSyV3DcYzIcDFlT11I5g3RM8vZNipXfnub3U=#012np0005604210.localdomain,192.168.122.104,np0005604210* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOlN+6Wna5zexGzaC7+fSuZYqptFJJzfc4fNurRaPmwC#012np0005604210.localdomain,192.168.122.104,np0005604210* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMf1D0EfcBESlFDd0NV4yvsDLeyI7zSTGShGHjV17TDeMwOZQ9X97P3K+p+QICvUvg8AXGXxFhArHCUmm+iJ0Q8=#012np0005604209.localdomain,192.168.122.103,np0005604209* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDAdXF2/8XBq3bWgr/9swIkzjlkm7PzpC1vdYXglaExGeIUwK5n05/HLobUMrYjOh6yE81+tctBT51wuPLw9qOGf4X3lRx3x0AHUqWSs00OL5nZsMRAd6PknZVyeCWf9jv13mVWIExCYbP8e4VK4M3w1m2xSLFd1aHtGkEUYJKCmacxrxFu2opq+kNCclpMC0BlFeSeX/NZeGwcfVCEyP46JVB9pNDo6D4s98FzzQNtG4DTv8NqE0S8Fj44dajq/80IKXeVEbhVmBikwFGMMEHhsRass2m0Q0rBw1Cv2jqW9hrTO1AWHY2aNDDqr6cKttP27XKfc/unDFFDb0mcc/HRa8JAUYEvuO0FIV6n28+Q5hWoYHAZfMU15U/bQPN1UxbF/MmSIZWvwY+vzCJ+icSJ9qfhDfbd1DttRuV0F3Jdi0jq01TyyPdOz8qT7kKSftD3Awn6BNLlseR8MaOTS+YF4fOnSP/xzj0B+nx/nr5Mrq8+QzKb2YyqdMfWWMGdCw8=#012np0005604209.localdomain,192.168.122.103,np0005604209* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEyfGu/WqJIvC6oouYQjgcrJPk9Bg07JDIkt1JPKTeA0#012np0005604209.localdomain,192.168.122.103,np0005604209* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH4jrW0M0jOqWvBkwMTs5aJ7MoUwB68xLOHVc4M2y1jfTW9cs2+E3JaFwH6xJLpPXRNwbxblwTFdTeLzxwq3Nwk=#012np0005604211.localdomain,192.168.122.105,np0005604211* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCQ5JUOdiESLpaYomijw3u9LxHN4VxpmenW9EczyVvVdofuEESAIR1Q8BIVkW7gxgVyrzHxOpbaoAS+aZaKazruu7/chC8MkDw1lvfeyQwMZax6UziUan2wIFVTaCc7kITOHrdWkJm+OIvCs/ImtkSgsTmvTiQedvs86ME3gHNyA+7taoDXnH6UCB6d5ex6PzwXsKI03iUVWFfsGP3ZU7r52IBwgrLG+VplbaPBRNNP/RvKULVsokG3UCMd3pjHv3VYBdXPYTFOPf666ZEuxEz+Frz43oXzEhr4W61RN70cAFJDDFoOmBDxXzZqrmF7r1vSV3ojl+aHaVLCGL4Wnjrp9wl5Zq8XCGN/7ttzaZKrjj/flccfBEiYL9odgqp92EjmxsRqG4bFq/nEzS/DTJ88QQVpGQNC2T6bElJVdBIrpZAyv7n5HlwNQwfsltQtzbqe1E32azZb1wq13ajV9Ii7QrVd81nGYFM79NqiVVbXs5NypsJOMQ6ZoqyHK5+yyHk=#012np0005604211.localdomain,192.168.122.105,np0005604211* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILEIcduNL0DMEDOErXXJ0uk29DlGSUk7f/QOEFebs4e8#012np0005604211.localdomain,192.168.122.105,np0005604211* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCGGOVXInAZnlCFh3rgVH6nrUWtitrkOeovDtC1WeeR/gHrJ+susCZPN3v3pAe5flAEf/hpjySdS/u1PmS0N8Ho=#012np0005604215.localdomain,192.168.122.108,np0005604215* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/jKlZ/vxfazmNjpekfENGpQi8TTD6ErYy0BH9P8CRIiiKVdA/53XGSAQlY17b4tT5hzyHsUuXDmbv5R98FSy/Fi8F4KrjgogVPhd/zYoMrffr9ydwv+ih2mIyCPjZC+N92i92gM2OBHBXj5vqyh5yl1t4H1LhFab7P/m42K75mcTytGvGTLKXZbcs/1Ot/APGrs5wqg/c9XFQtgBEn6ttSKQ9caqbgUw88VGRkzaHvzheQvtIjZL0AwigTS24tqFx+bF+liSnSaYk1R8TKe1yMNODv5OCUmFYvPqls4Y3AQkpuroQQXHcQCe0QPuz9nGgPebNOxyTHsK66oDWIUskoYIbrZZhjDxlpdzJ+POEU/jXtGox0/0wlpRK7jNN6r4Fzx6uIzxB5SWn/UJ4BYS853pUsC32TeD0pZXfUAzOGUOzQfvYkUCElyRi8zDN4ubwEWnxvCEPaAFihafbviqQwLNFFmth36owDHV2zU/Q/BtW8vrwfx0cPr2A4WvQvp8=#012np0005604215.localdomain,192.168.122.108,np0005604215* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAqxjQs+R8e7wYqi9vXJigqVZC1H7cyvu0Lob0wgHHpY#012np0005604215.localdomain,192.168.122.108,np0005604215* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBpZtz+gA3A28TfIAE9+rHy4sghRBF4nh1U9zBwiez8FWMv0OjVQriiYnYh6sbsEW0tK+yZBRm7xEpd3W14ioec=#012np0005604213.localdomain,192.168.122.107,np0005604213* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDhh44DuXnO4hBZJvT1vLnO8ZhT8GKLkBI0M+Q/lXSbHymnCyNerLMqVRhTb5ZUw07lkP6FtBJS95SUtdJuAbUi4jphShtJfBdicoa+uGqI1icHUQCbtCAACtas0lGeGi5q/q1LfzeuKh+LTRj60W+r2OZoChKxeSWYBQ8gIScKe1HgVCJVEESXwNv4CBs6ffOWVYHE+3JDUA3AN3nX931xw4oLMBkwi0q4sNh9Sb0oS79OX+dKdlGfnPLLWKF9QrLrHYdHVkKtPre9d1BdNkl38gRE45uwrAAxXBfeZjbzzfbUlWb54SZwL8P2ej29L5VAbE/97j1HD6+kUZ5wFb6v9oJyFwq8udFDqO1SUMkW4t1VmwD5G4rIU2+u0yHd4H7//fgbf8WAhPv1Qx5tXEqB6LIHqYCz7RekNQO5Xv8ge/gVMzzlxB0DJP6a4DJ8E0/Djnyzw81L2fmyeriPLqt/n/wHscNr1RRI4T1X2iINRwk5QfrxwTEHhJ00FY1kB90=#012np0005604213.localdomain,192.168.122.107,np0005604213* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAIHpQ8q5SipY+Tg88mzREiMhmtuvQNv/rHiJfQhVqjy49#012np0005604213.localdomain,192.168.122.107,np0005604213* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM6lbWtwCks630IMm3N6slgTXAS2/BDd/gLT/86gsZQSUwulBMm6OKfJ9eje+B7RGiNR4je3u2+SDaZwwywpAos=#012 create=True mode=0644 path=/tmp/ansible.qich70c6 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:17 localhost python3.9[141601]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.qich70c6' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:12:18 localhost python3.9[141695]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.qich70c6 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:19 localhost systemd[1]: session-44.scope: Deactivated successfully. Feb 1 04:12:19 localhost systemd[1]: session-44.scope: Consumed 4.251s CPU time. Feb 1 04:12:19 localhost systemd-logind[759]: Session 44 logged out. Waiting for processes to exit. Feb 1 04:12:19 localhost systemd-logind[759]: Removed session 44. Feb 1 04:12:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10763 DF PROTO=TCP SPT=53560 DPT=9882 SEQ=1773662922 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF288950000000001030307) Feb 1 04:12:25 localhost sshd[141710]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:12:25 localhost systemd-logind[759]: New session 45 of user zuul. Feb 1 04:12:25 localhost systemd[1]: Started Session 45 of User zuul. 
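
Session 44 rebuilds /etc/ssh/ssh_known_hosts indirectly: a temp file (/tmp/ansible.qich70c6) is created, blockinfile writes all six nodes' RSA/Ed25519/ECDSA host keys between "# BEGIN ANSIBLE MANAGED BLOCK" / "# END ANSIBLE MANAGED BLOCK" markers, the temp file is cat'ed over the real known_hosts, and then removed. The marker pair is what makes the update idempotent across runs; a reduced sketch of that block replacement:

    # Idempotently replace the text between blockinfile-style markers.
    BEGIN = "# BEGIN ANSIBLE MANAGED BLOCK"
    END = "# END ANSIBLE MANAGED BLOCK"

    def set_managed_block(text: str, block: str) -> str:
        lines = text.splitlines()
        if BEGIN in lines and END in lines:
            head = lines[: lines.index(BEGIN)]      # keep what precedes
            tail = lines[lines.index(END) + 1 :]    # and what follows
        else:                                       # first run: append
            head, tail = lines, []
        return "\n".join(head + [BEGIN, block, END] + tail) + "\n"

    # Key shortened for the example.
    print(set_managed_block("", "np0005604212.localdomain ssh-ed25519 AAAA..."))
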
Feb 1 04:12:26 localhost python3.9[141803]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:12:27 localhost python3.9[141899]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 1 04:12:28 localhost python3.9[141993]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:12:29 localhost python3.9[142086]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:12:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59203 DF PROTO=TCP SPT=57582 DPT=9101 SEQ=2667256661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2A0690000000001030307) Feb 1 04:12:30 localhost python3.9[142179]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:12:31 localhost python3.9[142273]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:12:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41993 DF PROTO=TCP SPT=34674 DPT=9105 SEQ=3941399722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2A6100000000001030307) Feb 1 04:12:31 localhost python3.9[142368]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:32 localhost systemd[1]: session-45.scope: Deactivated successfully. Feb 1 04:12:32 localhost systemd[1]: session-45.scope: Consumed 3.752s CPU time. Feb 1 04:12:32 localhost systemd-logind[759]: Session 45 logged out. Waiting for processes to exit. Feb 1 04:12:32 localhost systemd-logind[759]: Removed session 45. 
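
Two-phase nftables handling is visible here: at 04:11:56 the assembled ruleset was only syntax-checked (cat ... | nft -c -f -) and an /etc/nftables/edpm-rules.nft.changed sentinel touched; in session 45 the chains are loaded, the sentinel's presence is tested, the flushes/rules/update-jumps files are piped through nft -f - to apply, and the sentinel is deleted. The recurring kernel DROPPING: lines are the resulting log-and-drop rule firing on inbound SYNs to ports this ruleset leaves closed (9100/9101/9102/9105 and 9882 here). A sketch of the same check-then-apply step, with paths from the log; requires root and nft:

    import pathlib, subprocess

    RULES = ["/etc/nftables/edpm-flushes.nft",
             "/etc/nftables/edpm-rules.nft",
             "/etc/nftables/edpm-update-jumps.nft"]
    SENTINEL = pathlib.Path("/etc/nftables/edpm-rules.nft.changed")

    def apply_if_changed() -> None:
        if not SENTINEL.exists():
            return                                  # nothing new to load
        ruleset = "".join(pathlib.Path(p).read_text() for p in RULES)
        subprocess.run(["nft", "-c", "-f", "-"], input=ruleset,
                       text=True, check=True)       # dry-run first
        subprocess.run(["nft", "-f", "-"], input=ruleset,
                       text=True, check=True)       # then apply
        SENTINEL.unlink()                           # clear only on success;
                                                    # a failure leaves the
                                                    # sentinel for a retry
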
Feb 1 04:12:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41994 DF PROTO=TCP SPT=34674 DPT=9105 SEQ=3941399722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2AA380000000001030307) Feb 1 04:12:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41995 DF PROTO=TCP SPT=34674 DPT=9105 SEQ=3941399722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2B2380000000001030307) Feb 1 04:12:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45327 DF PROTO=TCP SPT=43572 DPT=9100 SEQ=3961205776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2BC670000000001030307) Feb 1 04:12:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29824 DF PROTO=TCP SPT=36334 DPT=9102 SEQ=1294197014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2BE020000000001030307) Feb 1 04:12:37 localhost sshd[142383]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:12:37 localhost systemd-logind[759]: New session 46 of user zuul. Feb 1 04:12:37 localhost systemd[1]: Started Session 46 of User zuul. Feb 1 04:12:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45328 DF PROTO=TCP SPT=43572 DPT=9100 SEQ=3961205776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2C0790000000001030307) Feb 1 04:12:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29825 DF PROTO=TCP SPT=36334 DPT=9102 SEQ=1294197014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2C1F80000000001030307) Feb 1 04:12:38 localhost python3.9[142476]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:12:40 localhost python3.9[142572]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:12:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45329 DF PROTO=TCP SPT=43572 DPT=9100 SEQ=3961205776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2C8780000000001030307) Feb 1 04:12:41 localhost python3.9[142626]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 1 04:12:44 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45330 DF PROTO=TCP SPT=43572 DPT=9100 SEQ=3961205776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2D8380000000001030307) Feb 1 04:12:45 localhost python3.9[142718]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:12:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41997 DF PROTO=TCP SPT=34674 DPT=9105 SEQ=3941399722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2E1B80000000001030307) Feb 1 04:12:46 localhost python3.9[142811]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:47 localhost python3.9[142903]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:48 localhost python3.9[142995]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:12:49 localhost python3.9[143085]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 1 04:12:49 localhost python3.9[143175]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:12:50 localhost python3.9[143267]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:12:51 localhost systemd[1]: 
session-46.scope: Deactivated successfully. Feb 1 04:12:51 localhost systemd[1]: session-46.scope: Consumed 8.980s CPU time. Feb 1 04:12:51 localhost systemd-logind[759]: Session 46 logged out. Waiting for processes to exit. Feb 1 04:12:51 localhost systemd-logind[759]: Removed session 46. Feb 1 04:12:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45331 DF PROTO=TCP SPT=43572 DPT=9100 SEQ=3961205776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2F7B90000000001030307) Feb 1 04:12:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29828 DF PROTO=TCP SPT=36334 DPT=9102 SEQ=1294197014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF2F9B80000000001030307) Feb 1 04:12:53 localhost sshd[143284]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:12:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52299 DF PROTO=TCP SPT=51754 DPT=9882 SEQ=21547531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF301B80000000001030307) Feb 1 04:12:55 localhost sshd[143286]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:12:55 localhost systemd-logind[759]: New session 47 of user zuul. Feb 1 04:12:55 localhost systemd[1]: Started Session 47 of User zuul. Feb 1 04:12:56 localhost python3.9[143442]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:12:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52300 DF PROTO=TCP SPT=51754 DPT=9882 SEQ=21547531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF309B90000000001030307) Feb 1 04:12:59 localhost python3.9[143553]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7871 DF PROTO=TCP SPT=46754 DPT=9101 SEQ=3537477505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3159A0000000001030307) Feb 1 04:13:00 localhost python3.9[143645]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:00 localhost python3.9[143718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937179.3981357-178-257915741331582/.source.pem _original_basename=tls-ca-bundle.pem follow=False 
checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:01 localhost python3.9[143810]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:02 localhost python3.9[143902]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:02 localhost python3.9[143975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937181.5932424-249-85746574092442/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7873 DF PROTO=TCP SPT=46754 DPT=9101 SEQ=3537477505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF321B80000000001030307) Feb 1 04:13:03 localhost python3.9[144067]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:03 localhost python3.9[144159]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:04 localhost python3.9[144232]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937183.3521478-319-113370914325321/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:05 localhost python3.9[144324]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:05 localhost python3.9[144416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:06 localhost python3.9[144489]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937185.343249-392-160854896830101/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:07 localhost python3.9[144581]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7874 DF PROTO=TCP SPT=46754 DPT=9101 SEQ=3537477505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF331790000000001030307) Feb 1 04:13:07 localhost python3.9[144673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:08 localhost python3.9[144746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937187.3650365-464-106681581948446/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:09 localhost python3.9[144838]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29829 DF PROTO=TCP SPT=36334 DPT=9102 SEQ=1294197014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF339B80000000001030307) Feb 1 04:13:09 localhost python3.9[144930]: 
ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:10 localhost python3.9[145003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937189.1921127-535-19584634781487/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:10 localhost python3.9[145095]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:11 localhost python3.9[145187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:12 localhost python3.9[145260]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937190.9846036-605-269809395186565/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:12 localhost python3.9[145352]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:13 localhost sshd[145445]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:13:13 localhost chronyd[134130]: Selected source 23.133.168.246 (pool.ntp.org) Feb 1 04:13:13 localhost python3.9[145444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:13 localhost sshd[145459]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:13:14 localhost python3.9[145519]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937192.9329648-674-5264303027568/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62599 DF PROTO=TCP SPT=46144 DPT=9100 SEQ=1413544078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF34D780000000001030307) Feb 1 04:13:14 localhost systemd[1]: session-47.scope: Deactivated successfully. Feb 1 04:13:14 localhost systemd[1]: session-47.scope: Consumed 11.243s CPU time. Feb 1 04:13:14 localhost systemd-logind[759]: Session 47 logged out. Waiting for processes to exit. Feb 1 04:13:14 localhost systemd-logind[759]: Removed session 47. Feb 1 04:13:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7875 DF PROTO=TCP SPT=46754 DPT=9101 SEQ=3537477505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF351B80000000001030307) Feb 1 04:13:20 localhost sshd[145534]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:13:20 localhost systemd-logind[759]: New session 48 of user zuul. Feb 1 04:13:20 localhost systemd[1]: Started Session 48 of User zuul. Feb 1 04:13:21 localhost python3.9[145629]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:22 localhost python3.9[145721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62600 DF PROTO=TCP SPT=46144 DPT=9100 SEQ=1413544078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF36DB90000000001030307) Feb 1 04:13:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59889 DF PROTO=TCP SPT=44278 DPT=9102 SEQ=1578547669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF36FB80000000001030307) Feb 1 04:13:23 localhost python3.9[145794]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937201.8216746-57-23284232238713/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=814f759dcc97f4b50c85badaa6f3819c2533c70a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:23 localhost python3.9[145886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True 
get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:24 localhost python3.9[145959]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937203.3046126-57-101923005234772/.source.conf _original_basename=ceph.conf follow=False checksum=6c8f40813464a566eca7252d9e693fc8375e148c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:24 localhost sshd[145970]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:13:24 localhost systemd[1]: session-48.scope: Deactivated successfully. Feb 1 04:13:24 localhost systemd[1]: session-48.scope: Consumed 2.300s CPU time. Feb 1 04:13:24 localhost systemd-logind[759]: Session 48 logged out. Waiting for processes to exit. Feb 1 04:13:24 localhost systemd-logind[759]: Removed session 48. Feb 1 04:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49752 DF PROTO=TCP SPT=56680 DPT=9882 SEQ=1763380457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF376F80000000001030307) Feb 1 04:13:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49753 DF PROTO=TCP SPT=56680 DPT=9882 SEQ=1763380457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF37EF80000000001030307) Feb 1 04:13:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47757 DF PROTO=TCP SPT=57042 DPT=9101 SEQ=211093726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF38ACA0000000001030307) Feb 1 04:13:30 localhost sshd[145976]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:13:30 localhost systemd-logind[759]: New session 49 of user zuul. Feb 1 04:13:30 localhost systemd[1]: Started Session 49 of User zuul. 
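
Session 47 fanned one identical tls-ca-bundle.pem into per-service cacert directories (telemetry, neutron-sriov, neutron-dhcp, nova, libvirt, ovn, bootstrap, neutron-metadata), and session 48 drops the ceph client config: ceph.conf is world-readable at 0644, but ceph.client.openstack.keyring is 0600, since the keyring carries the cephx secret. A sketch asserting that permission split, with paths from the log:

    import os, stat

    def check_ceph_config(confdir: str = "/var/lib/openstack/config/ceph") -> None:
        # ceph.conf may be shared; the keyring must stay owner-only.
        for name, expected in (("ceph.conf", 0o644),
                               ("ceph.client.openstack.keyring", 0o600)):
            mode = stat.S_IMODE(os.stat(os.path.join(confdir, name)).st_mode)
            assert mode == expected, f"{name}: {oct(mode)} != {oct(expected)}"

    check_ceph_config()
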
Feb 1 04:13:31 localhost python3.9[146069]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:13:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47759 DF PROTO=TCP SPT=57042 DPT=9101 SEQ=211093726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF396B90000000001030307) Feb 1 04:13:33 localhost python3.9[146165]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:34 localhost python3.9[146257]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:13:35 localhost python3.9[146347]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:13:36 localhost python3.9[146439]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Feb 1 04:13:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47760 DF PROTO=TCP SPT=57042 DPT=9101 SEQ=211093726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3A6780000000001030307) Feb 1 04:13:37 localhost python3.9[146531]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:13:38 localhost python3.9[146585]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:13:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62601 DF PROTO=TCP SPT=46144 DPT=9100 SEQ=1413544078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3ADB90000000001030307) Feb 1 04:13:42 localhost python3.9[146679]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 1 04:13:44 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63772 DF PROTO=TCP SPT=34652 DPT=9100 SEQ=1787363634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3C2790000000001030307) Feb 1 04:13:44 localhost python3[146774]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present Feb 1 04:13:45 localhost python3.9[146866]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47761 DF PROTO=TCP SPT=57042 DPT=9101 SEQ=211093726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3C7B80000000001030307) Feb 1 04:13:46 localhost python3.9[146958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:46 localhost python3.9[147006]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:47 localhost python3.9[147098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:47 localhost python3.9[147146]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.8itkyj4x recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:48 localhost python3.9[147238]: ansible-ansible.legacy.stat Invoked with 
path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:49 localhost python3.9[147286]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:50 localhost python3.9[147378]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:13:50 localhost python3[147472]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 1 04:13:51 localhost python3.9[147564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:52 localhost python3.9[147639]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937231.1084518-426-221722668926701/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38074 DF PROTO=TCP SPT=59746 DPT=9102 SEQ=3754063478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3E3B90000000001030307) Feb 1 04:13:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63773 DF PROTO=TCP SPT=34652 DPT=9100 SEQ=1787363634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3E3B90000000001030307) Feb 1 04:13:53 localhost python3.9[147731]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:53 localhost python3.9[147806]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937232.5175848-471-53477272349319/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:54 localhost python3.9[147898]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Feb 1 04:13:54 localhost python3.9[147973]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937233.6916566-516-273449404193980/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6494 DF PROTO=TCP SPT=48460 DPT=9882 SEQ=1896283987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3EC380000000001030307) Feb 1 04:13:55 localhost python3.9[148065]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:56 localhost python3.9[148140]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937234.99688-561-90204252289899/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:56 localhost python3.9[148232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:13:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6495 DF PROTO=TCP SPT=48460 DPT=9882 SEQ=1896283987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3F4380000000001030307) Feb 1 04:13:57 localhost python3.9[148307]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937236.3603811-606-80082706354679/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:58 localhost python3.9[148429]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:13:58 localhost python3.9[148572]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft 
/etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:13:59 localhost python3.9[148700]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56920 DF PROTO=TCP SPT=35500 DPT=9101 SEQ=3852463002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF3FFFA0000000001030307) Feb 1 04:14:00 localhost python3.9[148807]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:14:01 localhost python3.9[148900]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:14:02 localhost python3.9[148994]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:14:02 localhost python3.9[149089]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11031 DF PROTO=TCP SPT=58370 DPT=9105 SEQ=2380777742 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF40BB80000000001030307) Feb 1 04:14:03 localhost python3.9[149179]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:14:05 localhost python3.9[149272]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . 
external_ids:hostname=np0005604212.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:99:13:90:9c" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:14:05 localhost ovs-vsctl[149273]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005604212.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:99:13:90:9c external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch Feb 1 04:14:05 localhost python3.9[149365]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:14:06 localhost python3.9[149458]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:14:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56923 DF PROTO=TCP SPT=35500 DPT=9101 SEQ=3852463002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF41BB80000000001030307) Feb 1 04:14:07 localhost python3.9[149552]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:14:08 localhost python3.9[149644]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:08 localhost python3.9[149692]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None 
serole=None selevel=None attributes=None Feb 1 04:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38075 DF PROTO=TCP SPT=59746 DPT=9102 SEQ=3754063478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF423B80000000001030307) Feb 1 04:14:09 localhost python3.9[149784]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:09 localhost python3.9[149832]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:14:10 localhost python3.9[149924]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:11 localhost python3.9[150016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:12 localhost python3.9[150064]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:12 localhost python3.9[150156]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:13 localhost python3.9[150204]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:14 localhost python3.9[150296]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None 
masked=None Feb 1 04:14:14 localhost systemd[1]: Reloading. Feb 1 04:14:14 localhost systemd-sysv-generator[150325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:14:14 localhost systemd-rc-local-generator[150319]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:14:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:14:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61932 DF PROTO=TCP SPT=50056 DPT=9100 SEQ=2096641030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF437B80000000001030307) Feb 1 04:14:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56924 DF PROTO=TCP SPT=35500 DPT=9101 SEQ=3852463002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF43BB80000000001030307) Feb 1 04:14:16 localhost python3.9[150427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:16 localhost python3.9[150475]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:17 localhost python3.9[150567]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:17 localhost python3.9[150615]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:18 localhost python3.9[150707]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:14:18 localhost systemd[1]: Reloading. Feb 1 04:14:18 localhost systemd-rc-local-generator[150730]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:14:18 localhost systemd-sysv-generator[150735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:14:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:14:19 localhost systemd[1]: Starting Create netns directory... Feb 1 04:14:19 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 1 04:14:19 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 1 04:14:19 localhost systemd[1]: Finished Create netns directory. Feb 1 04:14:20 localhost python3.9[150841]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:14:21 localhost python3.9[150933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:22 localhost python3.9[151006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937260.9852045-1338-224916482620563/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 1 04:14:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61933 DF PROTO=TCP SPT=50056 DPT=9100 SEQ=2096641030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF457B80000000001030307) Feb 1 04:14:22 localhost python3.9[151098]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43333 DF PROTO=TCP SPT=56238 DPT=9102 SEQ=587303590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF459B80000000001030307) Feb 1 04:14:23 localhost python3.9[151190]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:14:24 localhost 
python3.9[151282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:14:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17114 DF PROTO=TCP SPT=57040 DPT=9882 SEQ=2519415293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF461780000000001030307) Feb 1 04:14:25 localhost python3.9[151357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937263.9036222-1437-233927085693832/.source.json _original_basename=.8wpu8gt9 follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:25 localhost python3.9[151447]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17115 DF PROTO=TCP SPT=57040 DPT=9882 SEQ=2519415293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF469780000000001030307) Feb 1 04:14:28 localhost python3.9[151700]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Feb 1 04:14:29 localhost python3.9[151792]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 1 04:14:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6413 DF PROTO=TCP SPT=39552 DPT=9101 SEQ=474302743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF475290000000001030307) Feb 1 04:14:30 localhost python3[151884]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False Feb 1 04:14:30 localhost python3[151884]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e",#012 "Digest": "sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e"#012 ],#012 "Parent": "",#012 "Comment": "",#012 
"Created": "2026-01-30T06:38:56.623500445Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 346422728,#012 "VirtualSize": 346422728,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:033e0289d512b27a678c3feb7195acb9c5f2fbb27c9b2d8c8b5b5f6156f0d11f",#012 "sha256:f848a534c5dfe59c31c3da34c3d2466bdea7e8da7def4225acdd3ffef1544d2f"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL 
tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:55.650316471Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util- Feb 1 04:14:30 localhost podman[151934]: 2026-02-01 09:14:30.793468655 +0000 UTC m=+0.097633964 container remove f8afd85cf176f9ae7ba7db3ab056114f358228d2cc35995c26ccea14c5af3446 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510) Feb 1 04:14:30 localhost python3[151884]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Feb 1 04:14:30 localhost podman[151948]: Feb 1 04:14:30 localhost podman[151948]: 2026-02-01 09:14:30.916158086 +0000 UTC m=+0.100054166 container create b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:14:30 localhost podman[151948]: 2026-02-01 09:14:30.876477987 +0000 UTC m=+0.060374127 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 1 04:14:30 localhost python3[151884]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 1 04:14:32 localhost python3.9[152077]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:14:32 localhost python3.9[152171]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6415 DF PROTO=TCP SPT=39552 DPT=9101 SEQ=474302743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF481390000000001030307) Feb 1 04:14:33 localhost python3.9[152217]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:14:33 localhost python3.9[152308]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937273.3294358-1671-243597383764306/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:14:34 localhost python3.9[152354]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:14:34 localhost systemd[1]: Reloading. Feb 1 04:14:34 localhost systemd-rc-local-generator[152376]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:14:34 localhost systemd-sysv-generator[152382]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:14:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
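The restart that follows acts on the unit file ansible-copy just wrote to /etc/systemd/system/edpm_ovn_controller.service. A minimal sketch of the kind of wrapper unit involved, assuming it simply delegates to the edpm-start-podman-container helper installed earlier; only the unit name, the description ("ovn_controller container", visible in the Starting/Started lines below), the helper path, and the container name come from the log, the remaining options are illustrative:

# Hypothetical shape of the wrapper unit; paths and names from the log, options assumed.
cat > /etc/systemd/system/edpm_ovn_controller.service <<'EOF'
[Unit]
Description=ovn_controller container
After=openvswitch.service

[Service]
Type=oneshot
RemainAfterExit=yes
ExecStart=/var/local/libexec/edpm-start-podman-container ovn_controller
ExecStop=/usr/bin/podman stop ovn_controller

[Install]
WantedBy=multi-user.target
EOF
systemctl daemon-reload
systemctl enable --now edpm_ovn_controller.service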
Feb 1 04:14:35 localhost python3.9[152436]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:14:36 localhost systemd[1]: Reloading. Feb 1 04:14:36 localhost systemd-sysv-generator[152466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:14:36 localhost systemd-rc-local-generator[152461]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:14:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:14:36 localhost systemd[1]: Starting ovn_controller container... Feb 1 04:14:36 localhost systemd[1]: tmp-crun.tiPzBC.mount: Deactivated successfully. Feb 1 04:14:37 localhost systemd[1]: Started libcrun container. Feb 1 04:14:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af21a57f7a43de755d21523b190f05a380f9dd0b8c798ed342b45df3083cdba9/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Feb 1 04:14:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:14:37 localhost podman[152478]: 2026-02-01 09:14:37.066400463 +0000 UTC m=+0.163461700 container init b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2) Feb 1 04:14:37 localhost ovn_controller[152492]: + sudo -E kolla_set_configs Feb 1 04:14:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:14:37 localhost podman[152478]: 2026-02-01 09:14:37.110434409 +0000 UTC m=+0.207495586 container start b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:14:37 localhost edpm-start-podman-container[152478]: ovn_controller Feb 1 04:14:37 localhost systemd[1]: Created slice User Slice of UID 0. Feb 1 04:14:37 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 1 04:14:37 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 1 04:14:37 localhost systemd[1]: Starting User Manager for UID 0... Feb 1 04:14:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6416 DF PROTO=TCP SPT=39552 DPT=9101 SEQ=474302743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF490F80000000001030307) Feb 1 04:14:37 localhost edpm-start-podman-container[152477]: Creating additional drop-in dependency for "ovn_controller" (b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb) Feb 1 04:14:37 localhost systemd[1]: Reloading. 
Feb 1 04:14:37 localhost podman[152500]: 2026-02-01 09:14:37.262598986 +0000 UTC m=+0.146977905 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:14:37 localhost podman[152500]: 2026-02-01 09:14:37.274102145 +0000 UTC m=+0.158481084 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:14:37 localhost podman[152500]: unhealthy Feb 1 04:14:37 localhost systemd[152525]: Queued start job for default target Main User Target. Feb 1 04:14:37 localhost systemd[152525]: Created slice User Application Slice. Feb 1 04:14:37 localhost systemd[152525]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). 
Feb 1 04:14:37 localhost systemd[152525]: Started Daily Cleanup of User's Temporary Directories.
Feb 1 04:14:37 localhost systemd[152525]: Reached target Paths.
Feb 1 04:14:37 localhost systemd[152525]: Reached target Timers.
Feb 1 04:14:37 localhost systemd[152525]: Starting D-Bus User Message Bus Socket...
Feb 1 04:14:37 localhost systemd[152525]: Starting Create User's Volatile Files and Directories...
Feb 1 04:14:37 localhost systemd[152525]: Finished Create User's Volatile Files and Directories.
Feb 1 04:14:37 localhost systemd[152525]: Listening on D-Bus User Message Bus Socket.
Feb 1 04:14:37 localhost systemd[152525]: Reached target Sockets.
Feb 1 04:14:37 localhost systemd[152525]: Reached target Basic System.
Feb 1 04:14:37 localhost systemd[152525]: Reached target Main User Target.
Feb 1 04:14:37 localhost systemd[152525]: Startup finished in 117ms.
Feb 1 04:14:37 localhost systemd-sysv-generator[152586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:14:37 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 1 04:14:37 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 1 04:14:37 localhost systemd-rc-local-generator[152582]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:14:37 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 1 04:14:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:14:37 localhost systemd[1]: Started User Manager for UID 0.
Feb 1 04:14:37 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 04:14:37 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Failed with result 'exit-code'.
Feb 1 04:14:37 localhost systemd[1]: Started ovn_controller container.
Feb 1 04:14:37 localhost systemd[1]: Started Session c12 of User root.
Feb 1 04:14:37 localhost ovn_controller[152492]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 1 04:14:37 localhost ovn_controller[152492]: INFO:__main__:Validating config file
Feb 1 04:14:37 localhost ovn_controller[152492]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 1 04:14:37 localhost ovn_controller[152492]: INFO:__main__:Writing out command to execute
Feb 1 04:14:37 localhost systemd[1]: session-c12.scope: Deactivated successfully.
Feb 1 04:14:37 localhost ovn_controller[152492]: ++ cat /run_command
Feb 1 04:14:37 localhost ovn_controller[152492]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 1 04:14:37 localhost ovn_controller[152492]: + ARGS=
Feb 1 04:14:37 localhost ovn_controller[152492]: + sudo kolla_copy_cacerts
Feb 1 04:14:37 localhost systemd[1]: Started Session c13 of User root.
Feb 1 04:14:37 localhost systemd[1]: session-c13.scope: Deactivated successfully.
Feb 1 04:14:37 localhost ovn_controller[152492]: + [[ ! -n '' ]]
Feb 1 04:14:37 localhost ovn_controller[152492]: + . kolla_extend_start
Feb 1 04:14:37 localhost ovn_controller[152492]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 1 04:14:37 localhost ovn_controller[152492]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Feb 1 04:14:37 localhost ovn_controller[152492]: + umask 0022
Feb 1 04:14:37 localhost ovn_controller[152492]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00011|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00012|features|INFO|OVS Feature: ct_flush, state: supported
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00020|binding|INFO|Claiming lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 for this chassis.
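ovn-controller reaches the local ovsdb over unix:/run/openvswitch/db.sock and learns the southbound database address (tcp:ovsdbserver-sb.openstack.svc:6642 above) from the Open_vSwitch table's external_ids. Illustrative checks on the host, assuming the standard OVN chassis configuration keys:

    # Southbound DB endpoint this chassis connects to
    ovs-vsctl get Open_vSwitch . external_ids:ovn-remote

    # Tunnel encapsulation settings ovn-controller reads at startup
    # (--if-exists prints nothing instead of erroring when a key is unset)
    ovs-vsctl --if-exists get Open_vSwitch . external_ids:ovn-encap-type
    ovs-vsctl --if-exists get Open_vSwitch . external_ids:ovn-encap-ip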
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00021|binding|INFO|09cac1be-46e2-4a31-8306-e6f4f0401b19: Claiming fa:16:3e:86:11:63 192.168.0.12
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00022|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-e1cc33-0
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-2186fb-0
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00025|ovn_bfd|INFO|Disabled BFD on interface ovn-45aa31-0
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00026|binding|INFO|Removing lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 ovn-installed in OVS
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00027|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00028|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00029|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-e1cc33-0
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-2186fb-0
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-45aa31-0
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00033|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00034|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00035|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00036|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:37 localhost ovn_controller[152492]: 2026-02-01T09:14:37Z|00037|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
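The claim/release lines and the BFD toggling mirror Port_Binding rows changing in the southbound database as the restarted controller resynchronizes. On a host with southbound access, the binding for the lport in this log could be checked like this (a sketch; ovn-sbctl needs the SB DB address if it is remote):

    # Chassis list with the ports bound to each
    ovn-sbctl show

    # Which chassis currently owns this logical port
    ovn-sbctl --columns=logical_port,chassis find Port_Binding logical_port=09cac1be-46e2-4a31-8306-e6f4f0401b19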
Feb 1 04:14:38 localhost python3.9[152693]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 1 04:14:38 localhost ovn_controller[152492]: 2026-02-01T09:14:38Z|00038|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:38 localhost ovn_controller[152492]: 2026-02-01T09:14:38Z|00039|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17117 DF PROTO=TCP SPT=57040 DPT=9882 SEQ=2519415293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF499B80000000001030307)
Feb 1 04:14:39 localhost ovn_controller[152492]: 2026-02-01T09:14:39Z|00040|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:14:39 localhost python3.9[152785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:14:40 localhost python3.9[152858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937279.1230912-1806-103966440494248/.source.yaml _original_basename=.i3mluhlw follow=False checksum=4ef88525fff00a5112f620461f949f82fa85c4cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:14:40 localhost python3.9[152950]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:14:40 localhost ovs-vsctl[152951]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 1 04:14:41 localhost python3.9[153043]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:14:41 localhost ovs-vsctl[153045]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 1 04:14:42 localhost python3.9[153138]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:14:42 localhost ovs-vsctl[153139]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 1 04:14:43 localhost systemd[1]: session-49.scope: Deactivated successfully.
Feb 1 04:14:43 localhost systemd[1]: session-49.scope: Consumed 42.573s CPU time.
Feb 1 04:14:43 localhost systemd-logind[759]: Session 49 logged out. Waiting for processes to exit.
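The db_ctl_base ERR above is expected: the playbook reads external_ids:ovn-cms-options before deleting it, and the key does not exist on this node. The read could be made non-fatal with --if-exists; the removal is already a no-op for a missing key:

    # Errors when the key is absent (what the log shows):
    ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options

    # Prints nothing instead of erroring when the key is missing:
    ovs-vsctl --if-exists get Open_vSwitch . external_ids:ovn-cms-options

    # Idempotent removal, as invoked by the playbook:
    ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options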
Feb 1 04:14:43 localhost systemd-logind[759]: Removed session 49.
Feb 1 04:14:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32130 DF PROTO=TCP SPT=45216 DPT=9100 SEQ=51220461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4ACF80000000001030307)
Feb 1 04:14:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6417 DF PROTO=TCP SPT=39552 DPT=9101 SEQ=474302743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4B1B80000000001030307)
Feb 1 04:14:45 localhost ovn_controller[152492]: 2026-02-01T09:14:45Z|00041|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 ovn-installed in OVS
Feb 1 04:14:45 localhost ovn_controller[152492]: 2026-02-01T09:14:45Z|00042|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 up in Southbound
Feb 1 04:14:47 localhost systemd[1]: Stopping User Manager for UID 0...
Feb 1 04:14:47 localhost systemd[152525]: Activating special unit Exit the Session...
Feb 1 04:14:47 localhost systemd[152525]: Stopped target Main User Target.
Feb 1 04:14:47 localhost systemd[152525]: Stopped target Basic System.
Feb 1 04:14:47 localhost systemd[152525]: Stopped target Paths.
Feb 1 04:14:47 localhost systemd[152525]: Stopped target Sockets.
Feb 1 04:14:47 localhost systemd[152525]: Stopped target Timers.
Feb 1 04:14:47 localhost systemd[152525]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 1 04:14:47 localhost systemd[152525]: Closed D-Bus User Message Bus Socket.
Feb 1 04:14:47 localhost systemd[152525]: Stopped Create User's Volatile Files and Directories.
Feb 1 04:14:47 localhost systemd[152525]: Removed slice User Application Slice.
Feb 1 04:14:47 localhost systemd[152525]: Reached target Shutdown.
Feb 1 04:14:47 localhost systemd[152525]: Finished Exit the Session.
Feb 1 04:14:47 localhost systemd[152525]: Reached target Exit the Session.
Feb 1 04:14:47 localhost systemd[1]: user@0.service: Deactivated successfully.
Feb 1 04:14:47 localhost systemd[1]: Stopped User Manager for UID 0.
Feb 1 04:14:47 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 1 04:14:47 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 1 04:14:47 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 1 04:14:47 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 1 04:14:47 localhost systemd[1]: Removed slice User Slice of UID 0.
Feb 1 04:14:49 localhost sshd[153156]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:14:49 localhost systemd-logind[759]: New session 51 of user zuul.
Feb 1 04:14:49 localhost systemd[1]: Started Session 51 of User zuul.
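"DROPPING:" is a custom log prefix, so these kernel entries come from a LOG rule in the host firewall rather than from the kernel itself; the repeated SYNs to ports 9100-9102 and 9882 on 192.168.122.106 are being logged and then discarded. A rule of roughly this shape would produce such lines (a sketch, not this node's actual ruleset, which evidently matches specific destination ports):

    # Log, then drop, unsolicited SYNs arriving on br-ex
    iptables -A INPUT -i br-ex -p tcp --syn -j LOG --log-prefix "DROPPING: "
    iptables -A INPUT -i br-ex -p tcp --syn -j DROP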
Feb 1 04:14:50 localhost python3.9[153249]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 1 04:14:51 localhost python3.9[153345]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:14:52 localhost python3.9[153437]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:14:52 localhost python3.9[153529]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:14:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32131 DF PROTO=TCP SPT=45216 DPT=9100 SEQ=51220461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4CDB80000000001030307)
Feb 1 04:14:52 localhost ovn_controller[152492]: 2026-02-01T09:14:52Z|00043|memory|INFO|17096 kB peak resident set size after 15.3 seconds
Feb 1 04:14:52 localhost ovn_controller[152492]: 2026-02-01T09:14:52Z|00044|memory|INFO|idl-cells-OVN_Southbound:4033 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:289 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:112 ofctrl_sb_flow_ref_usage-KB:67
Feb 1 04:14:53 localhost python3.9[153621]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:14:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14677 DF PROTO=TCP SPT=49940 DPT=9102 SEQ=2280720414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4CFB80000000001030307)
Feb 1 04:14:54 localhost python3.9[153713]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:14:54 localhost python3.9[153803]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 1 04:14:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27198 DF PROTO=TCP SPT=55136 DPT=9882 SEQ=467251626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4D6790000000001030307)
Feb 1 04:14:56 localhost python3.9[153896]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 1 04:14:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27199 DF PROTO=TCP SPT=55136 DPT=9882 SEQ=467251626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4DE780000000001030307)
Feb 1 04:14:57 localhost python3.9[153986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:14:58 localhost python3.9[154059]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937296.7240717-213-246801611338564/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:14:58 localhost python3.9[154149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:14:59 localhost python3.9[154222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937298.2653368-258-87980934841995/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1276 DF PROTO=TCP SPT=44160 DPT=9101 SEQ=3491645909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4EA5A0000000001030307)
Feb 1 04:15:00 localhost python3.9[154328]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 1 04:15:01 localhost python3.9[154430]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 1 04:15:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1278 DF PROTO=TCP SPT=44160 DPT=9101 SEQ=3491645909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF4F6780000000001030307)
Feb 1 04:15:05 localhost python3.9[154539]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 1 04:15:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1279 DF PROTO=TCP SPT=44160 DPT=9101 SEQ=3491645909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF506380000000001030307)
Feb 1 04:15:07 localhost python3.9[154632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
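The dnf and systemd module invocations above amount to installing the OVS package and ensuring the service is enabled and running; the ad-hoc shell equivalent, with the package and unit names taken from the log, is:

    dnf install -y openvswitch3.3
    systemctl enable --now openvswitch.service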
Feb 1 04:15:07 localhost python3.9[154703]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937306.7690337-369-25809760504798/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:07 localhost podman[154704]: 2026-02-01 09:15:07.750583228 +0000 UTC m=+0.098890070 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:15:07 localhost podman[154704]: 2026-02-01 09:15:07.823698049 +0000 UTC m=+0.172004891 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:15:07 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:15:08 localhost python3.9[154821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:08 localhost python3.9[154892]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937307.8724313-369-190779313642993/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27201 DF PROTO=TCP SPT=55136 DPT=9882 SEQ=467251626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF50DB90000000001030307)
Feb 1 04:15:10 localhost python3.9[154982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:10 localhost python3.9[155053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937309.7270546-501-267006740502071/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:11 localhost python3.9[155143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:11 localhost python3.9[155214]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937310.8982446-501-6342684717682/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:12 localhost python3.9[155304]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:15:13 localhost python3.9[155398]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:14 localhost python3.9[155490]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16975 DF PROTO=TCP SPT=45264 DPT=9100 SEQ=81653509 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF522380000000001030307)
Feb 1 04:15:14 localhost python3.9[155538]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1280 DF PROTO=TCP SPT=44160 DPT=9101 SEQ=3491645909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF525B80000000001030307)
Feb 1 04:15:15 localhost python3.9[155630]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:15 localhost ovn_controller[152492]: 2026-02-01T09:15:15Z|00045|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Feb 1 04:15:16 localhost python3.9[155678]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:17 localhost python3.9[155770]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:18 localhost python3.9[155862]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:18 localhost python3.9[155910]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:19 localhost python3.9[156002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:19 localhost python3.9[156050]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:20 localhost python3.9[156142]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:15:20 localhost systemd[1]: Reloading.
Feb 1 04:15:20 localhost systemd-sysv-generator[156172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:15:20 localhost systemd-rc-local-generator[156165]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:15:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
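The unit file plus the 91-edpm-container-shutdown.preset entry is the standard systemd preset mechanism; a preset file of this kind presumably holds a single "enable edpm-container-shutdown.service" line (an assumption; the file's contents are not logged). The final task above is then equivalent to:

    systemctl daemon-reload
    systemctl enable --now edpm-container-shutdown.service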
Feb 1 04:15:21 localhost python3.9[156272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:22 localhost python3.9[156320]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16976 DF PROTO=TCP SPT=45264 DPT=9100 SEQ=81653509 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF541B90000000001030307)
Feb 1 04:15:22 localhost python3.9[156412]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28143 DF PROTO=TCP SPT=55462 DPT=9102 SEQ=3013316461 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF543B80000000001030307)
Feb 1 04:15:23 localhost python3.9[156460]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:24 localhost python3.9[156552]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:15:24 localhost systemd[1]: Reloading.
Feb 1 04:15:24 localhost systemd-sysv-generator[156581]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:15:24 localhost systemd-rc-local-generator[156577]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:15:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:15:24 localhost systemd[1]: Starting Create netns directory...
Feb 1 04:15:24 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 04:15:24 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 04:15:24 localhost systemd[1]: Finished Create netns directory.
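netns-placeholder deactivating immediately after "Finished Create netns directory" is consistent with a oneshot unit without RemainAfterExit, not with a failure. Illustrative follow-up checks for the unit and the /run/netns mount it sets up:

    systemctl status netns-placeholder.service
    findmnt /run/netns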
Feb 1 04:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12226 DF PROTO=TCP SPT=60088 DPT=9882 SEQ=1014623894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF54BB90000000001030307)
Feb 1 04:15:25 localhost python3.9[156686]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:26 localhost python3.9[156778]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:26 localhost python3.9[156851]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937325.589943-954-193074224984849/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12227 DF PROTO=TCP SPT=60088 DPT=9882 SEQ=1014623894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF553B80000000001030307)
Feb 1 04:15:27 localhost python3.9[156943]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:28 localhost python3.9[157035]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:15:29 localhost python3.9[157127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:15:29 localhost python3.9[157202]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937328.5627398-1053-45882035524826/.source.json _original_basename=.h7aoah64 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57153 DF PROTO=TCP SPT=55260 DPT=9101 SEQ=495282810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF55F8A0000000001030307)
Feb 1 04:15:30 localhost python3.9[157292]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:32 localhost python3.9[157545]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 1 04:15:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57155 DF PROTO=TCP SPT=55260 DPT=9101 SEQ=495282810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF56B780000000001030307)
Feb 1 04:15:33 localhost python3.9[157637]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 1 04:15:34 localhost python3[157729]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 1 04:15:34 localhost python3[157729]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8",#012 "Digest": "sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:29:34.446261637Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 785500417,#012 "VirtualSize": 785500417,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc/diff:/var/lib/containers/storage/overlay/33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:d3cc9cdab7e3e7c1a0a6c80e61bbd8cc5eeeba7069bab1cc064ed2e6cc28ed58",#012 "sha256:d5cbf3016eca6267717119e8ebab3c6c083cae6c589c6961ae23bfa93ef3afa4",#012 "sha256:0096ee5d07436ac5b94d9d58b8b2407cc5e6854d70de5e7f89b9a7a1ad4912ad"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.con
Feb 1 04:15:35 localhost podman[157780]: 2026-02-01 09:15:35.094430958 +0000 UTC m=+0.090077074 container remove f3e307f3efbd7e64f16ce0f8d3e1d165869444eb8b8b5317972fa2daf58eb840 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '4dacb3799b36b0da29dc6587bf4940e2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=)
Feb 1 04:15:35 localhost python3[157729]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Feb 1 04:15:35 localhost podman[157795]:
Feb 1 04:15:35 localhost podman[157795]: 2026-02-01 09:15:35.196541032 +0000 UTC m=+0.083331925 container create 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:15:35 localhost podman[157795]: 2026-02-01 09:15:35.162083768 +0000 UTC m=+0.048874721 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 1 04:15:35 localhost python3[157729]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 1 04:15:35 localhost python3.9[157923]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:15:36 localhost python3.9[158017]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57156 DF PROTO=TCP SPT=55260 DPT=9101 SEQ=495282810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF57B380000000001030307)
Feb 1 04:15:37 localhost python3.9[158063]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:15:37 localhost python3.9[158154]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937337.3815167-1287-277079215599211/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:15:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:15:38 localhost podman[158201]: 2026-02-01 09:15:38.298807448 +0000 UTC m=+0.080960172 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 1 04:15:38 localhost podman[158201]: 2026-02-01 09:15:38.337445392 +0000 UTC m=+0.119598086 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 1 04:15:38 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated
successfully. Feb 1 04:15:38 localhost python3.9[158200]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:15:38 localhost systemd[1]: Reloading. Feb 1 04:15:38 localhost systemd-rc-local-generator[158251]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:15:38 localhost systemd-sysv-generator[158254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:15:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12229 DF PROTO=TCP SPT=60088 DPT=9882 SEQ=1014623894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF583B80000000001030307) Feb 1 04:15:39 localhost python3.9[158305]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:15:39 localhost systemd[1]: Reloading. Feb 1 04:15:39 localhost systemd-sysv-generator[158333]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:15:39 localhost systemd-rc-local-generator[158328]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:15:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:15:39 localhost systemd[1]: Starting ovn_metadata_agent container... Feb 1 04:15:39 localhost systemd[1]: Started libcrun container. Feb 1 04:15:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8523d5595ba46410fb7425d6855c4f31f8acc052e6ac29ed76935ec089e7b13c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 1 04:15:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8523d5595ba46410fb7425d6855c4f31f8acc052e6ac29ed76935ec089e7b13c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:15:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
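[Editor's note] The kernel "DROPPING:" entries interleaved with the Ansible tasks above are emitted by a netfilter packet-logging rule; the rule itself is configured elsewhere and does not appear in this log. A minimal, hypothetical Python helper (parse_drop_entry is not part of the deployment) for pulling the KEY=VALUE fields out of such an entry when triaging which flows are being dropped:

    import re

    # Matches KEY=VALUE pairs such as SRC=192.168.122.10 or DPT=9101.
    # Empty values (e.g. "OUT=") and bare flags (e.g. "SYN") are skipped.
    _FIELD_RE = re.compile(r"\b([A-Z]+)=(\S+)")

    def parse_drop_entry(line: str) -> dict:
        """Extract the KEY=VALUE fields from one netfilter LOG line."""
        return dict(_FIELD_RE.findall(line))

    entry = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
             "SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TTL=62 "
             "PROTO=TCP SPT=55260 DPT=9101 SYN")
    fields = parse_drop_entry(entry)
    print(fields["SRC"], "->", fields["DST"], "dport", fields["DPT"])

Applied to the two entries above, this shows the dropped flows are TCP SYNs from 192.168.122.10 to 192.168.122.106 on ports 9101 and 9882.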
Feb 1 04:15:39 localhost podman[158346]: 2026-02-01 09:15:39.913636809 +0000 UTC m=+0.150740059 container init 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Feb 1 04:15:39 localhost ovn_metadata_agent[158360]: + sudo -E kolla_set_configs Feb 1 04:15:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
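[Editor's note] The "+ sudo -E kolla_set_configs" line above, and the INFO:__main__ lines that follow, are Kolla's container entrypoint applying the COPY_ALWAYS strategy from /var/lib/kolla/config_files/config.json. A simplified sketch of that copy step (the real implementation in Kolla's set_configs.py also validates the file, sets ownership, and handles directories; copy_configs here is illustrative only):

    import json, os, shutil

    def copy_configs(config_path="/var/lib/kolla/config_files/config.json"):
        # "Loading config file at ..." / "Validating config file"
        with open(config_path) as f:
            config = json.load(f)
        # "Copying service configuration files"
        for entry in config.get("config_files", []):
            src, dest = entry["source"], entry["dest"]
            if os.path.exists(dest):
                os.remove(dest)                # "Deleting <dest>"
            shutil.copy(src, dest)             # "Copying <src> to <dest>"
            # "Setting permission for <dest>"
            os.chmod(dest, int(entry.get("perm", "0644"), 8))
        # "Writing out command to execute": the entrypoint later reads this
        # file and exec's it (here: neutron-ovn-metadata-agent).
        with open("/run_command", "w") as f:
            f.write(config["command"])

This is why the later "+ CMD=neutron-ovn-metadata-agent" and "exec neutron-ovn-metadata-agent" lines appear: the wrapper cats /run_command and execs its contents.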
Feb 1 04:15:39 localhost podman[158346]: 2026-02-01 09:15:39.949194107 +0000 UTC m=+0.186297367 container start 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 04:15:39 localhost edpm-start-podman-container[158346]: ovn_metadata_agent Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Validating config file Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Copying service configuration files Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Writing out command to execute Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: 
INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: ++ cat /run_command Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + CMD=neutron-ovn-metadata-agent Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + ARGS= Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + sudo kolla_copy_cacerts Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + [[ ! -n '' ]] Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + . kolla_extend_start Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: Running command: 'neutron-ovn-metadata-agent' Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + umask 0022 Feb 1 04:15:40 localhost ovn_metadata_agent[158360]: + exec neutron-ovn-metadata-agent Feb 1 04:15:40 localhost podman[158369]: 2026-02-01 09:15:40.02113006 +0000 UTC m=+0.068219739 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:15:40 localhost edpm-start-podman-container[158345]: Creating additional drop-in dependency for "ovn_metadata_agent" (728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0) Feb 1 04:15:40 localhost systemd[1]: Reloading. Feb 1 04:15:40 localhost podman[158369]: 2026-02-01 09:15:40.104420693 +0000 UTC m=+0.151510362 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent) Feb 1 04:15:40 localhost systemd-rc-local-generator[158434]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:15:40 localhost systemd-sysv-generator[158437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:15:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:15:40 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:15:40 localhost systemd[1]: Started ovn_metadata_agent container. 
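[Editor's note] The long DEBUG block that follows (continuing through the privsep_* sections) is the freshly started agent dumping every registered configuration option and its effective value at startup. oslo.config does this via ConfigOpts.log_opt_values(); a minimal runnable sketch of the pattern, using a tiny illustrative subset of the option names seen in the dump:

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    CONF.register_opts([
        cfg.IntOpt("agent_down_time", default=75),
        cfg.StrOpt("nova_metadata_host", default="127.0.0.1"),
        cfg.PortOpt("nova_metadata_port", default=8775),
    ])

    CONF([], project="neutron")              # parse an (empty) command line
    CONF.log_opt_values(LOG, logging.DEBUG)  # emits "option = value" lines

Note that options registered as secret, such as metadata_proxy_shared_secret and transport_url below, are masked as **** in this dump rather than printed in the clear.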
Feb 1 04:15:41 localhost sshd[158460]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.634 158365 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.635 158365 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.635 158365 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.635 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.635 158365 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.635 158365 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.636 158365 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.637 158365 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 
'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.638 158365 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 
158365 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.639 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.640 158365 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.641 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.642 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.643 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.644 158365 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.645 158365 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.646 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.647 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.648 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.649 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.650 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.651 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.652 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.653 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.654 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.655 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.656 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.657 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.658 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.659 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.660 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] 
ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.661 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.662 158365 DEBUG 
neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.663 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.664 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.665 158365 
DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.666 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.667 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.667 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.667 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.667 158365 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.667 158365 DEBUG neutron.agent.ovn.metadata_agent 
[-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.676 158365 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.676 158365 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.676 158365 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.676 158365 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.677 158365 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.693 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e1d14e36-ae9d-43b6-8933-f137b54529ff (UUID: e1d14e36-ae9d-43b6-8933-f137b54529ff) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.715 158365 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.716 158365 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.716 158365 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.716 158365 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.718 158365 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.720 158365 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.737 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:11:63 192.168.0.12'], port_security=['fa:16:3e:86:11:63 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 
'neutron:device_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005604212.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '79df39cba1c14309b68e8b61518619fd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0b065334-69c4-4862-ab2c-0676d50a1918 0dc57611-620a-4a91-b761-dd2b6dc1d570', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5329260b-b0db-417b-bda6-9045427ce15d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09cac1be-46e2-4a31-8306-e6f4f0401b19) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.738 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e1d14e36-ae9d-43b6-8933-f137b54529ff'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': 'f813c136-8b86-53dc-a894-324de75198f6', 'neutron:ovn-metadata-sb-cfg': '1'}, name=e1d14e36-ae9d-43b6-8933-f137b54529ff, nb_cfg_timestamp=1769937286367, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.739 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 09cac1be-46e2-4a31-8306-e6f4f0401b19 in datapath 8bdf8183-8467-40ac-933d-a37b0bd3539a bound to our chassis on insert#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.740 158365 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.740 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.740 158365 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.740 158365 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.741 158365 INFO oslo_service.service [-] Starting 1 workers#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.743 158365 DEBUG oslo_service.service [-] Started child 158462 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.746 158365 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8bdf8183-8467-40ac-933d-a37b0bd3539a#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.747 158365 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 
'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp29n4n0pc/privsep.sock']#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.747 158462 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-240357'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.772 158462 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.773 158462 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.773 158462 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.776 158462 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.781 158462 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Feb 1 04:15:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:41.794 158462 INFO eventlet.wsgi.server [-] (158462) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.366 158365 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.368 158365 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp29n4n0pc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.238 158526 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.242 158526 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.244 158526 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.244 158526 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158526#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.371 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[fbaa87d2-ebc9-45ad-bcfb-b9a87e70b8b2]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:42 localhost python3.9[158543]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 
2026-02-01 09:15:42.849 158526 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.849 158526 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:15:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:42.849 158526 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.380 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[aa84c1ea-a8e1-4eaf-b61b-9e4445ba744f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.382 158365 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp98sgxfur/privsep.sock']#033[00m Feb 1 04:15:43 localhost python3.9[158641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.959 158365 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.960 158365 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp98sgxfur/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.861 158695 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.867 158695 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.871 158695 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.871 158695 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158695#033[00m Feb 1 04:15:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:43.961 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[b92d54f0-2e2b-421e-8083-a7bf033899dd]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:44 localhost python3.9[158721]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937343.1473444-1422-79191921248850/.source.yaml _original_basename=.22h1qfic follow=False checksum=08b98aaf8b4739d4298bc1690447f4cee3a9ba74 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER 
validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:15:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20571 DF PROTO=TCP SPT=53174 DPT=9100 SEQ=125443945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF597380000000001030307) Feb 1 04:15:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:44.466 158695 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:15:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:44.466 158695 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:15:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:44.466 158695 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:15:44 localhost systemd[1]: session-51.scope: Deactivated successfully. Feb 1 04:15:44 localhost systemd[1]: session-51.scope: Consumed 33.624s CPU time. Feb 1 04:15:44 localhost systemd-logind[759]: Session 51 logged out. Waiting for processes to exit. Feb 1 04:15:44 localhost systemd-logind[759]: Removed session 51. Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.002 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[72cd8c9a-ee91-42f3-ae95-6c52a2485c24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.005 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[32c8b89b-2da3-4f34-9372-545cc3dcd2f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.030 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[f9168468-695e-4db0-a0fe-89e656d77c63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.047 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ec81525f-6ae0-4b40-afb6-f5bc5ab07995]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8bdf8183-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:29:7a:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 
'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635606, 'reachable_time': 36796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 158745, 'error': None, 'target': 
'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.066 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[1439aae2-67dc-4390-84cd-e50ec18c3887]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8bdf8183-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635617, 'tstamp': 635617}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158746, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap8bdf8183-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635619, 'tstamp': 635619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158746, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635614, 'tstamp': 635614}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158746, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:7aac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635606, 'tstamp': 635606}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 158746, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.128 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[e36938ca-3404-4eb2-ab54-debe0fd93af1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.130 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bdf8183-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.176 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bdf8183-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.176 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction 
caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.177 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8bdf8183-80, col_values=(('external_ids', {'iface-id': 'a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.177 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.181 158365 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp2u7chc7x/privsep.sock']#033[00m Feb 1 04:15:45 localhost sshd[158754]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:15:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57157 DF PROTO=TCP SPT=55260 DPT=9101 SEQ=495282810 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF59BB80000000001030307) Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.790 158365 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.791 158365 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2u7chc7x/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.676 158757 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.679 158757 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.682 158757 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.682 158757 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158757#033[00m Feb 1 04:15:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:45.794 158757 DEBUG oslo.privsep.daemon [-] privsep: reply[18e03198-24a0-4e84-b46f-1d9d1f5e5254]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.213 158757 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.213 158757 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:15:46 
localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.213 158757 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.714 158757 DEBUG oslo.privsep.daemon [-] privsep: reply[8312fd6f-b3cf-46ce-ad16-6f4001968b36]: (4, ['ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.717 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, column=external_ids, values=({'neutron:ovn-metadata-id': 'f813c136-8b86-53dc-a894-324de75198f6'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.717 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.718 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.730 158365 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.730 158365 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.730 158365 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.730 158365 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.730 158365 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.731 158365 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.731 158365 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.731 158365 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.731 158365 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.731 158365 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.731 158365 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.732 158365 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.732 158365 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.732 158365 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.732 158365 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.732 158365 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.732 158365 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] debug = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.733 158365 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.734 158365 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.734 158365 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.734 158365 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.735 158365 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.735 158365 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.736 158365 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.736 158365 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.736 158365 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.736 158365 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.736 158365 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.737 158365 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.737 158365 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.737 158365 DEBUG oslo_service.service [-] host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.738 158365 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.738 158365 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.738 158365 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.739 158365 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.739 158365 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.739 158365 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.739 158365 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.740 158365 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.740 158365 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.740 158365 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.740 158365 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.741 158365 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.741 158365 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.741 158365 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 
localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.741 158365 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.742 158365 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.742 158365 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.742 158365 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.742 158365 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.743 158365 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.743 158365 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.743 158365 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.743 158365 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.744 158365 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.744 158365 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.744 158365 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.744 158365 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.745 158365 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.745 158365 DEBUG oslo_service.service [-] metadata_proxy_user = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.745 158365 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.745 158365 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.745 158365 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.746 158365 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.746 158365 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.746 158365 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.746 158365 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.747 158365 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.747 158365 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.747 158365 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.747 158365 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.748 158365 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.748 158365 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.748 158365 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.748 158365 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.749 158365 DEBUG oslo_service.service [-] 
rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.749 158365 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.749 158365 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.749 158365 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.749 158365 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.750 158365 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.750 158365 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.750 158365 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.750 158365 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.751 158365 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.751 158365 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.751 158365 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.751 158365 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.752 158365 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.752 158365 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.752 158365 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.752 158365 DEBUG oslo_service.service [-] use_json = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.752 158365 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.753 158365 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.753 158365 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.753 158365 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.753 158365 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.753 158365 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.754 158365 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.754 158365 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.754 158365 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.755 158365 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.755 158365 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.755 158365 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.755 158365 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.756 158365 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.756 158365 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:15:46.756 158365 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.756 158365 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.756 158365 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.757 158365 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.757 158365 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.757 158365 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.757 158365 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.758 158365 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.758 158365 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.758 158365 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.759 158365 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.759 158365 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.759 158365 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.759 158365 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.759 158365 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 
2026-02-01 09:15:46.760 158365 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.760 158365 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.760 158365 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.760 158365 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.761 158365 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.761 158365 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.761 158365 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.761 158365 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.762 158365 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.762 158365 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.762 158365 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.762 158365 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.762 158365 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.763 158365 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.764 158365 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 
2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.765 158365 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.766 158365 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.766 158365 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.766 158365 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.766 158365 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.766 158365 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.766 158365 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] 
AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.767 158365 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.768 158365 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] 
QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.769 158365 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.770 158365 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.771 158365 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.771 158365 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.771 158365 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.771 158365 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.771 158365 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.771 158365 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.certfile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.772 158365 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.773 158365 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.773 158365 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.773 158365 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.773 158365 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.773 158365 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.773 158365 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] 
ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.774 158365 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.775 158365 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.776 158365 DEBUG oslo_service.service 
[-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.777 158365 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.778 158365 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 
localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.779 158365 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.780 158365 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.781 158365 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.781 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete 
= False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.781 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.781 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.781 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.781 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.782 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.783 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.783 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.783 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:15:46.783 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.783 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.784 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.784 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.784 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.784 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.784 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.785 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.786 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.786 158365 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.786 158365 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.786 158365 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.786 158365 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.786 158365 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:15:46 localhost ovn_metadata_agent[158360]: 2026-02-01 09:15:46.787 158365 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 1 04:15:50 localhost sshd[158762]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:15:50 localhost systemd-logind[759]: New session 52 of user zuul. Feb 1 04:15:50 localhost systemd[1]: Started Session 52 of User zuul. Feb 1 04:15:51 localhost python3.9[158855]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:15:52 localhost python3.9[158951]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:15:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20572 DF PROTO=TCP SPT=53174 DPT=9100 SEQ=125443945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5B7B90000000001030307) Feb 1 04:15:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20891 DF PROTO=TCP SPT=52278 DPT=9102 SEQ=1032506857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5B9B80000000001030307) Feb 1 04:15:53 localhost python3.9[159056]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:15:53 localhost systemd[1]: libpod-723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5.scope: Deactivated successfully. 
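The long block of placement.*, ironic.*, ovn.*, OVS.* and oslo_messaging_* lines above is oslo.config's startup dump: oslo.service calls ConfigOpts.log_opt_values() once at DEBUG, printing one "group.option = value" line per registered option (cfg.py:2609) and closing with the row of asterisks (cfg.py:2613). Options registered with secret=True are masked as ****, which is why transport_url prints that way. A minimal sketch that reproduces the same dump shape, assuming oslo.config is installed; the two options below are illustrative, not the agent's real schema:

```python
import logging

from oslo_config import cfg

# Illustrative options only -- the real agent registers many more groups.
opts = [
    cfg.StrOpt("ovn_sb_connection", default="tcp:127.0.0.1:6642"),
    cfg.IntOpt("ovsdb_connection_timeout", default=180),
]

CONF = cfg.CONF
CONF.register_opts(opts, group="ovn")

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger("oslo_service.service")

CONF([])  # parse an empty argv so the options resolve to their defaults
# Emits one "ovn.<opt> = <value>" line per option, then the asterisk
# banner, matching the shape of the journal records above.
CONF.log_opt_values(LOG, logging.DEBUG)
```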
Feb 1 04:15:53 localhost podman[159057]: 2026-02-01 09:15:53.358094982 +0000 UTC m=+0.073240894 container died 723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com) Feb 1 04:15:53 localhost podman[159057]: 2026-02-01 09:15:53.387030397 +0000 UTC m=+0.102176289 container cleanup 723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, vendor=Red Hat, Inc.) 
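The "container died" and "container cleanup" events above print the image's full label set inline (vcs-ref, build-date, release, and so on). When a record like this needs to be correlated with the image, the same metadata can be read back from podman directly; a small sketch, assuming a local podman binary and a container that still exists (nova_virtlogd is removed in the very next record, so the name here is just an example):

```python
import json
import subprocess

def container_labels(name: str) -> dict:
    """Return the label map that podman prints inline in its journal events."""
    out = subprocess.run(
        ["podman", "inspect", name],
        check=True, capture_output=True, text=True,
    ).stdout
    data = json.loads(out)  # podman inspect returns a JSON array
    return data[0]["Config"]["Labels"] or {}

labels = container_labels("nova_virtlogd")
print(labels.get("vcs-ref"), labels.get("build-date"))
```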
Feb 1 04:15:53 localhost podman[159071]: 2026-02-01 09:15:53.439724215 +0000 UTC m=+0.070705627 container remove 723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:15:53 localhost systemd[1]: libpod-conmon-723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5.scope: Deactivated successfully. Feb 1 04:15:54 localhost systemd[1]: var-lib-containers-storage-overlay-e666fbadd6c4e4928f102dd6c78271a042a375d2964a4c27ffb1b4262f6cceee-merged.mount: Deactivated successfully. Feb 1 04:15:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-723b57aeaa5abb5c1148a1b619a21349e3542945c46fe0ff4c7aaf8b231539d5-userdata-shm.mount: Deactivated successfully. Feb 1 04:15:54 localhost python3.9[159177]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:15:54 localhost systemd[1]: Reloading. Feb 1 04:15:54 localhost systemd-sysv-generator[159208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:15:54 localhost systemd-rc-local-generator[159205]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10617 DF PROTO=TCP SPT=34954 DPT=9882 SEQ=3855571184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5C0F80000000001030307) Feb 1 04:15:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:15:56 localhost python3.9[159303]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:15:56 localhost network[159320]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:15:56 localhost network[159321]: 'network-scripts' will be removed from distribution in near future. 
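The recurring "kernel: DROPPING:" records are netfilter LOG-target hits with the log prefix DROPPING:, each carrying interface, MAC, address and port fields as KEY=value pairs. A short parser sketch for pulling those fields out of one journal line; the field names are taken from the records above, and the regex is an assumption about their layout rather than a general netfilter parser:

```python
import re

PAIR = re.compile(r"(\w+)=(\S+)")

def parse_drop(line: str) -> dict:
    """Extract KEY=value pairs from a netfilter LOG line such as the
    'DROPPING:' records above (SRC, DST, SPT, DPT, PROTO, ...).
    Empty fields (e.g. 'OUT=') and bare flags (e.g. 'SYN') are skipped."""
    _, _, rest = line.partition("DROPPING:")
    return dict(PAIR.findall(rest))

line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
        "SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TTL=62 "
        "PROTO=TCP SPT=53174 DPT=9100 SYN")
d = parse_drop(line)
print(d["SRC"], "->", d["DST"], "port", d["DPT"])
# 192.168.122.10 -> 192.168.122.106 port 9100
```

Grouping the parsed records by DPT shows the pattern in this section: repeated SYNs to 9100, 9101, 9102, 9105 and 9882 from 192.168.122.10 being dropped on br-ex.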
Feb 1 04:15:56 localhost network[159322]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:15:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10618 DF PROTO=TCP SPT=34954 DPT=9882 SEQ=3855571184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5C8F80000000001030307) Feb 1 04:15:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:16:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63929 DF PROTO=TCP SPT=33968 DPT=9101 SEQ=1871432163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5D4BB0000000001030307) Feb 1 04:16:01 localhost python3.9[159553]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:02 localhost systemd[1]: Reloading. Feb 1 04:16:02 localhost systemd-rc-local-generator[159608]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:16:02 localhost systemd-sysv-generator[159612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:16:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:16:02 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. 
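The ansible-ansible.builtin.systemd_service tasks above and below (enabled=False, state=stopped) map onto plain systemctl calls. A rough Python equivalent of that task for the tripleo_nova_* units being torn down here, assuming root; this mirrors the module's effect, not its implementation:

```python
import subprocess

def stop_and_disable(unit: str) -> None:
    """Approximate ansible.builtin.systemd_service with
    state=stopped, enabled=False for a single unit."""
    subprocess.run(["systemctl", "stop", unit], check=True)
    subprocess.run(["systemctl", "disable", unit], check=True)

for unit in [
    "tripleo_nova_virtlogd_wrapper.service",
    "tripleo_nova_virtnodedevd.service",
    "tripleo_nova_virtproxyd.service",
    "tripleo_nova_virtqemud.service",
]:
    stop_and_disable(unit)
```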
Feb 1 04:16:02 localhost podman[159697]: 2026-02-01 09:16:02.664294705 +0000 UTC m=+0.100880638 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, version=7, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:16:02 localhost podman[159697]: 2026-02-01 09:16:02.786881253 +0000 UTC m=+0.223467166 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109) Feb 1 04:16:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63931 DF PROTO=TCP SPT=33968 DPT=9101 SEQ=1871432163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5E0B90000000001030307) Feb 1 04:16:03 localhost python3.9[159796]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:03 localhost python3.9[159957]: 
ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:04 localhost python3.9[160090]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:05 localhost python3.9[160183]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63932 DF PROTO=TCP SPT=33968 DPT=9101 SEQ=1871432163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5F0780000000001030307) Feb 1 04:16:07 localhost python3.9[160276]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:16:08 localhost podman[160278]: 2026-02-01 09:16:08.748390776 +0000 UTC m=+0.099694101 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true) Feb 1 04:16:08 localhost podman[160278]: 2026-02-01 09:16:08.819477713 +0000 UTC m=+0.170781018 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:16:08 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20573 DF PROTO=TCP SPT=53174 DPT=9100 SEQ=125443945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF5F7B80000000001030307) Feb 1 04:16:09 localhost python3.9[160396]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:16:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:16:10 localhost systemd[1]: tmp-crun.ZdnYme.mount: Deactivated successfully. 
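The health_status=healthy fields in the podman events above come from transient "podman healthcheck run <id>" units that systemd starts on a timer, as the "Started /usr/bin/podman healthcheck run ..." records show. The current status can also be read back on demand without waiting for the next timer tick; a sketch, assuming a reasonably recent podman whose inspect output exposes a health state (the JSON key has moved between State.Healthcheck and State.Health across releases, so both are tried):

```python
import json
import subprocess

def health_status(name: str) -> str:
    out = subprocess.run(
        ["podman", "inspect", name],
        check=True, capture_output=True, text=True,
    ).stdout
    state = json.loads(out)[0]["State"]
    # The key name differs across podman releases.
    health = state.get("Health") or state.get("Healthcheck") or {}
    return health.get("Status", "unknown")

print(health_status("ovn_controller"))  # e.g. "healthy"
```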
Feb 1 04:16:10 localhost podman[160489]: 2026-02-01 09:16:10.580616384 +0000 UTC m=+0.093175610 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:16:10 localhost podman[160489]: 2026-02-01 09:16:10.585091662 +0000 UTC m=+0.097650938 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:16:10 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:16:10 localhost python3.9[160490]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:11 localhost python3.9[160598]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:12 localhost python3.9[160690]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:12 localhost python3.9[160782]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:13 localhost python3.9[160874]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:13 localhost python3.9[160966]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41573 DF PROTO=TCP 
SPT=57466 DPT=9100 SEQ=3941615989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF60C790000000001030307) Feb 1 04:16:14 localhost python3.9[161058]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:15 localhost python3.9[161150]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63933 DF PROTO=TCP SPT=33968 DPT=9101 SEQ=1871432163 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF611B80000000001030307) Feb 1 04:16:16 localhost python3.9[161242]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:16 localhost python3.9[161334]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:17 localhost python3.9[161426]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:17 localhost python3.9[161518]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:18 localhost python3.9[161610]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:19 localhost python3.9[161702]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:16:20 localhost python3.9[161794]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:21 localhost python3.9[161886]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 1 04:16:22 localhost python3.9[161978]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:16:22 localhost systemd[1]: Reloading. Feb 1 04:16:22 localhost systemd-rc-local-generator[162001]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:16:22 localhost systemd-sysv-generator[162005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:16:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
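The ansible.builtin.file tasks with state=absent above remove each tripleo_nova_* unit file from both /usr/lib/systemd/system and /etc/systemd/system, and the daemon_reload task that follows makes systemd forget the deleted units. A condensed sketch of the same cleanup, assuming root; unlink(missing_ok=True) keeps it idempotent, like state=absent:

```python
import subprocess
from pathlib import Path

UNITS = [
    "tripleo_nova_libvirt.target",
    "tripleo_nova_virtlogd_wrapper.service",
    "tripleo_nova_virtnodedevd.service",
    "tripleo_nova_virtproxyd.service",
    "tripleo_nova_virtqemud.service",
    "tripleo_nova_virtsecretd.service",
    "tripleo_nova_virtstoraged.service",
]

for unit in UNITS:
    for root in ("/usr/lib/systemd/system", "/etc/systemd/system"):
        Path(root, unit).unlink(missing_ok=True)  # like file state=absent

subprocess.run(["systemctl", "daemon-reload"], check=True)
```

The subsequent "systemctl reset-failed <unit>" commands in the log then clear any lingering failed state so the removed units no longer appear in systemctl --failed.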
Feb 1 04:16:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41574 DF PROTO=TCP SPT=57466 DPT=9100 SEQ=3941615989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF62DB80000000001030307) Feb 1 04:16:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60373 DF PROTO=TCP SPT=37086 DPT=9102 SEQ=1362760145 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF62DB80000000001030307) Feb 1 04:16:23 localhost python3.9[162106]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:23 localhost python3.9[162199]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:24 localhost python3.9[162292]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30150 DF PROTO=TCP SPT=38748 DPT=9882 SEQ=3063617543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF636380000000001030307) Feb 1 04:16:25 localhost python3.9[162385]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:25 localhost python3.9[162478]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:26 localhost python3.9[162571]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:26 localhost python3.9[162664]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:16:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=30151 DF PROTO=TCP SPT=38748 DPT=9882 SEQ=3063617543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF63E390000000001030307) Feb 1 04:16:28 localhost python3.9[162757]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Feb 1 04:16:29 localhost python3.9[162850]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 1 04:16:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63672 DF PROTO=TCP SPT=48922 DPT=9101 SEQ=829944597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF649EA0000000001030307) Feb 1 04:16:30 localhost python3.9[162948]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604212.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Feb 1 04:16:31 localhost python3.9[163048]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:16:32 localhost python3.9[163102]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:16:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43529 DF PROTO=TCP SPT=39376 DPT=9105 SEQ=2743807778 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF655B80000000001030307) Feb 1 04:16:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63675 DF PROTO=TCP SPT=48922 DPT=9101 SEQ=829944597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF665B80000000001030307) Feb 1 04:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41575 DF PROTO=TCP SPT=57466 DPT=9100 SEQ=3941615989 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF66DB80000000001030307) Feb 1 04:16:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:16:39 localhost podman[163169]: 2026-02-01 09:16:39.752941182 +0000 UTC m=+0.103889021 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:16:39 localhost podman[163169]: 2026-02-01 09:16:39.799439323 +0000 UTC m=+0.150387202 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:16:39 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
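The dnf task above receives several package names with stray trailing whitespace ('libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon '). Whether or not the backend strips these, normalizing the list before handing it to the package manager removes the ambiguity; a small illustrative sketch (the log shows ansible's dnf module receiving the list as-is, so this cleanup step is an assumption, not what ran here):

```python
def normalize(names):
    """Strip whitespace and drop duplicates while preserving order."""
    seen, out = set(), []
    for n in (n.strip() for n in names):
        if n and n not in seen:
            seen.add(n)
            out.append(n)
    return out

pkgs = ["libvirt ", "libvirt-admin ", "libvirt-client ", "qemu-kvm"]
print(normalize(pkgs))
# ['libvirt', 'libvirt-admin', 'libvirt-client', 'qemu-kvm']
```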
Feb 1 04:16:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:16:40 localhost podman[163195]: 2026-02-01 09:16:40.723900565 +0000 UTC m=+0.084223745 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 1 04:16:40 localhost podman[163195]: 2026-02-01 09:16:40.760487852 +0000 UTC m=+0.120811052 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:16:40 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:16:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:16:41.669 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:16:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:16:41.670 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:16:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:16:41.671 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:16:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=830 DF PROTO=TCP SPT=59456 DPT=9100 SEQ=3140209780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF681B80000000001030307) Feb 1 04:16:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63676 DF PROTO=TCP SPT=48922 DPT=9101 SEQ=829944597 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF685B80000000001030307) Feb 1 04:16:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=831 DF PROTO=TCP SPT=59456 DPT=9100 SEQ=3140209780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6A1B90000000001030307) Feb 1 04:16:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8433 DF PROTO=TCP SPT=43404 DPT=9102 SEQ=2999387539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6A3B80000000001030307) Feb 1 04:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63226 DF PROTO=TCP SPT=42104 DPT=9882 SEQ=1872651930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6AB390000000001030307) Feb 1 04:16:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63227 DF PROTO=TCP SPT=42104 DPT=9882 
SEQ=1872651930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6B3380000000001030307) Feb 1 04:17:00 localhost kernel: SELinux: Converting 2760 SID table entries... Feb 1 04:17:00 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). Feb 1 04:17:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29039 DF PROTO=TCP SPT=45182 DPT=9101 SEQ=3364464266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6BF1A0000000001030307) Feb 1 04:17:00 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:17:00 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:17:00 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:17:00 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:17:00 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:17:00 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:17:00 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:17:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29041 DF PROTO=TCP SPT=45182 DPT=9101 SEQ=3364464266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6CB390000000001030307) Feb 1 04:17:04 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=19 res=1 Feb 1 04:17:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29042 DF PROTO=TCP SPT=45182 DPT=9101 SEQ=3364464266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6DAF90000000001030307) Feb 1 04:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8434 DF PROTO=TCP SPT=43404 DPT=9102 SEQ=2999387539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6E3B80000000001030307) Feb 1 04:17:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
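The SID-table conversions above accompany each SELinux policy reload during the package installs; the one context reported as invalid, insights_client_cache_t, is a type the newly loaded policy no longer defines. A small sketch, assuming kernel messages arrive on stdin (for example from journalctl -k), that extracts such unmapped contexts:

import re
import sys

# matches e.g. "SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped)."
UNMAPPED = re.compile(r"SELinux:\s+Context (\S+) became invalid \(unmapped\)")

if __name__ == "__main__":
    for line in sys.stdin:
        m = UNMAPPED.search(line)
        if m:
            print(m.group(1))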
Feb 1 04:17:10 localhost podman[164373]: 2026-02-01 09:17:10.77608049 +0000 UTC m=+0.096115801 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:17:10 localhost podman[164373]: 2026-02-01 09:17:10.819775636 +0000 UTC m=+0.139810937 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller) Feb 1 04:17:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:17:10 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
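Each "Started /usr/bin/podman healthcheck run <id>" unit above is a transient service podman schedules to execute the container's configured healthcheck (the /openstack/healthcheck test in config_data) and then deactivate. A sketch of reading the resulting state out of band with podman inspect; the container names come from the log, and .State.Health is assumed to be populated because a healthcheck is configured:

import json
import subprocess

def health(name: str) -> str:
    # queries the same status that the health_status=healthy events report
    out = subprocess.run(
        ["podman", "inspect", "--format", "{{json .State.Health}}", name],
        check=True, capture_output=True, text=True,
    ).stdout
    state = json.loads(out)  # null when no healthcheck is configured
    return (state or {}).get("Status", "unknown")

if __name__ == "__main__":
    for name in ("ovn_controller", "ovn_metadata_agent"):
        print(name, health(name))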
Feb 1 04:17:10 localhost podman[164399]: 2026-02-01 09:17:10.930663651 +0000 UTC m=+0.082141661 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:17:10 localhost podman[164399]: 2026-02-01 09:17:10.935976764 +0000 UTC m=+0.087454834 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true) Feb 1 04:17:10 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:17:11 localhost kernel: SELinux: Converting 2763 SID table entries... Feb 1 04:17:11 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:17:11 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:17:11 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:17:11 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:17:11 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:17:11 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:17:11 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:17:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41299 DF PROTO=TCP SPT=59938 DPT=9100 SEQ=2420270450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6F6F80000000001030307) Feb 1 04:17:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29043 DF PROTO=TCP SPT=45182 DPT=9101 SEQ=3364464266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF6FBB80000000001030307) Feb 1 04:17:22 localhost kernel: SELinux: Converting 2766 SID table entries... 
Feb 1 04:17:22 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:17:22 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:17:22 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:17:22 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:17:22 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:17:22 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:17:22 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:17:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41300 DF PROTO=TCP SPT=59938 DPT=9100 SEQ=2420270450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF717B80000000001030307) Feb 1 04:17:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59442 DF PROTO=TCP SPT=50752 DPT=9102 SEQ=1236962407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF719B80000000001030307) Feb 1 04:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50645 DF PROTO=TCP SPT=39770 DPT=9882 SEQ=2928801508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF720780000000001030307) Feb 1 04:17:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50646 DF PROTO=TCP SPT=39770 DPT=9882 SEQ=2928801508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF728780000000001030307) Feb 1 04:17:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45585 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=2089525365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7344A0000000001030307) Feb 1 04:17:30 localhost kernel: SELinux: Converting 2766 SID table entries... Feb 1 04:17:30 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:17:30 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:17:30 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:17:30 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:17:30 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:17:30 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:17:30 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:17:31 localhost systemd[1]: Reloading. Feb 1 04:17:31 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=22 res=1 Feb 1 04:17:31 localhost systemd-rc-local-generator[164471]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:17:31 localhost systemd-sysv-generator[164476]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 1 04:17:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:17:31 localhost systemd[1]: Reloading. Feb 1 04:17:31 localhost systemd-rc-local-generator[164512]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:17:31 localhost systemd-sysv-generator[164516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:17:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:17:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45587 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=2089525365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF740380000000001030307) Feb 1 04:17:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45588 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=2089525365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF74FF90000000001030307) Feb 1 04:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50648 DF PROTO=TCP SPT=39770 DPT=9882 SEQ=2928801508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF757B80000000001030307) Feb 1 04:17:40 localhost kernel: SELinux: Converting 2767 SID table entries... Feb 1 04:17:40 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 1 04:17:40 localhost kernel: SELinux: policy capability open_perms=1 Feb 1 04:17:40 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 1 04:17:40 localhost kernel: SELinux: policy capability always_check_network=0 Feb 1 04:17:40 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 1 04:17:40 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 1 04:17:40 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 1 04:17:41 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=23 res=1 Feb 1 04:17:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:17:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
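The insights-client-boot.service warning repeats on every daemon reload because that unit still sets the cgroup-v1 MemoryLimit= directive. A sketch that inventories unit files still using it so they can be switched to MemoryMax=; the directory list is an assumption covering the standard vendor and admin unit paths:

from pathlib import Path

UNIT_DIRS = (Path("/usr/lib/systemd/system"), Path("/etc/systemd/system"))

def find_memorylimit():
    for unit_dir in UNIT_DIRS:
        if not unit_dir.is_dir():
            continue
        for unit in unit_dir.glob("*.service"):
            if not unit.is_file():  # skip broken or masked (/dev/null) symlinks
                continue
            text = unit.read_text(errors="replace")
            for lineno, line in enumerate(text.splitlines(), 1):
                if line.strip().startswith("MemoryLimit="):
                    yield unit, lineno, line.strip()

if __name__ == "__main__":
    for unit, lineno, line in find_memorylimit():
        print(f"{unit}:{lineno}: {line}  -> replace with MemoryMax=")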
Feb 1 04:17:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:17:41.671 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:17:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:17:41.672 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:17:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:17:41.673 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:17:41 localhost podman[164537]: 2026-02-01 09:17:41.743292046 +0000 UTC m=+0.088754793 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3) Feb 1 04:17:41 localhost podman[164537]: 2026-02-01 09:17:41.784367136 +0000 UTC m=+0.129829923 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller) Feb 1 04:17:41 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:17:41 localhost podman[164535]: 2026-02-01 09:17:41.794068503 +0000 UTC m=+0.139461848 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:17:41 localhost podman[164535]: 2026-02-01 09:17:41.877414099 +0000 UTC m=+0.222807414 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:17:41 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:17:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54438 DF PROTO=TCP SPT=48166 DPT=9100 SEQ=1224339196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF76BF90000000001030307) Feb 1 04:17:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45589 DF PROTO=TCP SPT=33646 DPT=9101 SEQ=2089525365 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF76FB80000000001030307) Feb 1 04:17:45 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. 
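The ovn_metadata_agent DEBUG triplets above (Acquiring, acquired after a 0.001s wait, released after a 0.001-0.002s hold, each traced from lockutils.py's inner wrapper) are oslo.concurrency's standard lock tracing around ProcessMonitor._check_child_processes. A sketch of the same pattern, assuming the oslo.concurrency package is installed; the class body is a stand-in, not neutron's actual implementation:

from oslo_concurrency import lockutils

class ProcessMonitor:
    @lockutils.synchronized("_check_child_processes")
    def _check_child_processes(self):
        # serialized by the decorator, which logs the Acquiring / acquired /
        # released DEBUG lines seen above, including wait and hold timings
        pass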
Feb 1 04:17:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54439 DF PROTO=TCP SPT=48166 DPT=9100 SEQ=1224339196 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF78BB80000000001030307) Feb 1 04:17:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35747 DF PROTO=TCP SPT=55584 DPT=9102 SEQ=3998036359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF78DB80000000001030307) Feb 1 04:17:53 localhost sshd[164810]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12073 DF PROTO=TCP SPT=35062 DPT=9882 SEQ=1801263660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF795B90000000001030307) Feb 1 04:17:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12074 DF PROTO=TCP SPT=35062 DPT=9882 SEQ=1801263660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF79DB80000000001030307) Feb 1 04:18:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31768 DF PROTO=TCP SPT=35208 DPT=9101 SEQ=678462474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7A97A0000000001030307) Feb 1 04:18:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31770 DF PROTO=TCP SPT=35208 DPT=9101 SEQ=678462474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7B5780000000001030307) Feb 1 04:18:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31771 DF PROTO=TCP SPT=35208 DPT=9101 SEQ=678462474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7C5380000000001030307) Feb 1 04:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12076 DF PROTO=TCP SPT=35062 DPT=9882 SEQ=1801263660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7CDB90000000001030307) Feb 1 04:18:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:18:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
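The recurring kernel DROPPING entries come from a log-and-drop firewall rule on br-ex; the blocked SYNs from 192.168.122.10 target ports 9100-9105 and 9882, which look like metrics endpoints not (yet) opened on this node. A sketch that tallies them from kernel log lines on stdin; the KEY=value field names are exactly those in the entries, everything else is illustrative:

import re
import sys
from collections import Counter

FIELD = re.compile(r"(\w+)=(\S+)")  # parses KEY=value pairs such as SRC=, DPT=

def summarize(lines):
    drops = Counter()
    for line in lines:
        if "DROPPING:" not in line:
            continue
        fields = dict(FIELD.findall(line))
        drops[(fields.get("SRC"), fields.get("DPT"))] += 1
    return drops

if __name__ == "__main__":
    # example usage: journalctl -k | python3 drop_summary.py
    for (src, dpt), count in summarize(sys.stdin).most_common():
        print(f"{src} -> port {dpt}: {count} dropped SYNs")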
Feb 1 04:18:13 localhost podman[174832]: 2026-02-01 09:18:13.466709126 +0000 UTC m=+0.288321643 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:18:13 localhost podman[174828]: 2026-02-01 09:18:13.544460341 +0000 UTC m=+0.366667246 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:18:13 localhost 
podman[174828]: 2026-02-01 09:18:13.553330143 +0000 UTC m=+0.375537058 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 1 04:18:13 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
Feb 1 04:18:13 localhost podman[174832]: 2026-02-01 09:18:13.57835325 +0000 UTC m=+0.399965807 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:18:13 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:18:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57626 DF PROTO=TCP SPT=56156 DPT=9100 SEQ=773618094 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7E1390000000001030307) Feb 1 04:18:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31772 DF PROTO=TCP SPT=35208 DPT=9101 SEQ=678462474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF7E5B90000000001030307) Feb 1 04:18:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57627 DF PROTO=TCP SPT=56156 DPT=9100 SEQ=773618094 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF801B80000000001030307) Feb 1 04:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64026 DF PROTO=TCP SPT=54372 DPT=9102 SEQ=3002081854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF803B80000000001030307) Feb 1 04:18:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42289 DF PROTO=TCP SPT=35384 DPT=9882 SEQ=2855402570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF80AF80000000001030307) Feb 1 04:18:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42290 DF 
PROTO=TCP SPT=35384 DPT=9882 SEQ=2855402570 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF812F80000000001030307) Feb 1 04:18:28 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 1 04:18:28 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 1 04:18:28 localhost systemd[1]: Stopped OpenSSH server daemon. Feb 1 04:18:28 localhost systemd[1]: sshd.service: Consumed 1.315s CPU time, read 32.0K from disk, written 0B to disk. Feb 1 04:18:28 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 1 04:18:28 localhost systemd[1]: Stopping sshd-keygen.target... Feb 1 04:18:28 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:18:28 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:18:28 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 1 04:18:28 localhost systemd[1]: Reached target sshd-keygen.target. Feb 1 04:18:28 localhost systemd[1]: Starting OpenSSH server daemon... Feb 1 04:18:28 localhost sshd[182729]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:18:28 localhost systemd[1]: Started OpenSSH server daemon. Feb 1 04:18:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:28 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:29 localhost systemd[1]: 
/usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48969 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=552616442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF81EAA0000000001030307) Feb 1 04:18:30 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 04:18:30 localhost systemd[1]: Starting man-db-cache-update.service... Feb 1 04:18:30 localhost systemd[1]: Reloading. Feb 1 04:18:30 localhost systemd-rc-local-generator[183062]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:18:30 localhost systemd-sysv-generator[183071]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:30 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 04:18:31 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 04:18:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48971 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=552616442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF82AB80000000001030307) Feb 1 04:18:34 localhost python3.9[187780]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 1 04:18:35 localhost systemd[1]: Reloading. Feb 1 04:18:35 localhost systemd-rc-local-generator[188222]: /etc/rc.d/rc.local is not marked executable, skipping. 
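The repeated "Failed to parse service type, ignoring: notify-reload" warnings mean these libvirt units declare Type=notify-reload, a service type systemd only understands from v253 onward; the systemd running here is older, so it ignores the directive and falls back to a default type, which is why the units still load. A sketch for checking the running version against that threshold; the parsing assumes the usual "systemd NNN (...)" first line of systemctl --version:

import re
import subprocess

NOTIFY_RELOAD_MIN = 253  # Type=notify-reload was introduced in systemd v253

def systemd_version() -> int:
    out = subprocess.run(["systemctl", "--version"],
                         check=True, capture_output=True, text=True).stdout
    return int(re.match(r"systemd (\d+)", out).group(1))

if __name__ == "__main__":
    version = systemd_version()
    if version < NOTIFY_RELOAD_MIN:
        print(f"systemd {version}: Type=notify-reload unsupported "
              f"(needs >= {NOTIFY_RELOAD_MIN})")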
Feb 1 04:18:35 localhost systemd-sysv-generator[188225]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost python3.9[188892]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 1 04:18:36 localhost systemd[1]: Reloading. Feb 1 04:18:36 localhost systemd-sysv-generator[189004]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:18:36 localhost systemd-rc-local-generator[189001]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:36 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48972 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=552616442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF83A780000000001030307) Feb 1 04:18:37 localhost python3.9[189426]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 1 04:18:37 localhost systemd[1]: Reloading. Feb 1 04:18:37 localhost systemd-rc-local-generator[189666]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:18:37 localhost systemd-sysv-generator[189671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost python3.9[190120]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 1 04:18:38 localhost systemd[1]: Reloading.
Feb 1 04:18:38 localhost systemd-rc-local-generator[190344]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:18:38 localhost systemd-sysv-generator[190349]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59333 DF PROTO=TCP SPT=36506 DPT=9105 SEQ=996963787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8417B0000000001030307)
Feb 1 04:18:39 localhost sshd[190627]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:18:41 localhost python3.9[191611]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:18:41.674 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:18:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:18:41.675 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:18:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:18:41.677 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:18:41 localhost systemd[1]: Reloading.
Feb 1 04:18:41 localhost systemd-rc-local-generator[191820]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:18:41 localhost systemd-sysv-generator[191825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
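The ovn_metadata_agent DEBUG lines above ("Acquiring lock" / "acquired" / "released", with waited/held timings) are emitted by oslo_concurrency.lockutils around neutron's _check_child_processes. A minimal sketch of the decorator pattern that produces them, assuming oslo.concurrency is installed; the class body here is a stand-in, not neutron's actual code:

    from oslo_concurrency import lockutils

    class ProcessMonitor:
        @lockutils.synchronized("_check_child_processes")
        def _check_child_processes(self):
            # Runs with the named lock held; the lockutils wrapper logs the
            # "Acquiring lock ... by ... inner", "acquired ... waited Ns" and
            # "released ... held Ns" DEBUG records seen in this journal.
            pass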
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:41 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:42 localhost python3.9[192252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:42 localhost systemd[1]: Reloading.
Feb 1 04:18:42 localhost systemd-rc-local-generator[192419]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:18:42 localhost systemd-sysv-generator[192424]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:43 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 1 04:18:43 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 1 04:18:43 localhost systemd[1]: man-db-cache-update.service: Consumed 14.961s CPU time.
Feb 1 04:18:43 localhost systemd[1]: run-r0ba6079c11354d75aabf775dd2ab1eba.service: Deactivated successfully.
Feb 1 04:18:43 localhost systemd[1]: run-rec63ad478d3b49b6a573a93591739ebd.service: Deactivated successfully.
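Each python3.9[...] record in this log captures an Ansible module invocation as space-separated key=value tokens after "Invoked with". A small parser sketch for auditing which units were enabled, masked, or stopped; it assumes every argument token is free of embedded spaces, which holds for the systemd-module records here:

    import re

    INVOKED = re.compile(r"ansible-(\S+) Invoked with (.*)")

    def parse_invocation(record):
        """Return (module, args) for an 'Invoked with' journal record."""
        m = INVOKED.search(record)
        if not m:
            return None
        module, payload = m.groups()
        args = dict(tok.split("=", 1) for tok in payload.split())
        return module, args

    record = ("Feb 1 04:18:42 localhost python3.9[192252]: "
              "ansible-ansible.builtin.systemd Invoked with enabled=True "
              "masked=False name=virtnodedevd.service state=None")
    module, args = parse_invocation(record)
    # -> ansible.builtin.systemd virtnodedevd.service True
    print(module, args["name"], args["enabled"])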
Feb 1 04:18:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:18:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:18:43 localhost podman[192546]: 2026-02-01 09:18:43.734460616 +0000 UTC m=+0.088011131 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:18:43 localhost podman[192546]: 2026-02-01 09:18:43.740543795 +0000 UTC m=+0.094094300 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 1 04:18:43 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:18:43 localhost podman[192547]: 2026-02-01 09:18:43.787285234 +0000 UTC m=+0.138819352 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Feb 1 04:18:43 localhost podman[192547]: 2026-02-01 09:18:43.826502597 +0000 UTC m=+0.178036685 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 1 04:18:43 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:18:43 localhost python3.9[192558]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:44 localhost systemd[1]: Reloading.
Feb 1 04:18:44 localhost systemd-rc-local-generator[192618]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:18:44 localhost systemd-sysv-generator[192622]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
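The podman health_status records above embed the full container labels and config_data, but the fields that matter for monitoring are the container name and the verdict. A sketch that tallies healthcheck outcomes per container; it assumes the (image=..., name=..., health_status=...) field order seen in these records:

    import re
    from collections import Counter

    HEALTH = re.compile(
        r"container health_status (?P<cid>[0-9a-f]{12,64}) "
        r"\(image=[^,]+, name=(?P<name>[^,]+), health_status=(?P<status>[^,)]+)"
    )

    def tally_health(lines):
        """Count healthcheck verdicts per (container name, status) pair."""
        counts = Counter()
        for line in lines:
            m = HEALTH.search(line)
            if m:
                counts[m.group("name"), m.group("status")] += 1
        return counts
    # On the records above this yields e.g.
    # {('ovn_metadata_agent', 'healthy'): 1, ('ovn_controller', 'healthy'): 1}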
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58937 DF PROTO=TCP SPT=44910 DPT=9100 SEQ=1868905718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF856780000000001030307)
Feb 1 04:18:45 localhost python3.9[192740]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48973 DF PROTO=TCP SPT=35880 DPT=9101 SEQ=552616442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF85BB80000000001030307)
Feb 1 04:18:45 localhost python3.9[192853]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:46 localhost systemd[1]: Reloading.
Feb 1 04:18:46 localhost systemd-rc-local-generator[192880]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:18:46 localhost systemd-sysv-generator[192885]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost python3.9[193002]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 1 04:18:49 localhost systemd[1]: Reloading.
Feb 1 04:18:49 localhost systemd-rc-local-generator[193031]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:18:49 localhost systemd-sysv-generator[193037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:18:50 localhost python3.9[193151]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:50 localhost python3.9[193264]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:51 localhost python3.9[193377]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58938 DF PROTO=TCP SPT=44910 DPT=9100 SEQ=1868905718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF877B80000000001030307)
Feb 1 04:18:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5368 DF PROTO=TCP SPT=51072 DPT=9102 SEQ=639015372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF877B80000000001030307)
Feb 1 04:18:52 localhost python3.9[193490]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:53 localhost python3.9[193603]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:54 localhost python3.9[193716]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64165 DF PROTO=TCP SPT=47656 DPT=9882 SEQ=1542806219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF87FF80000000001030307)
Feb 1 04:18:55 localhost python3.9[193829]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:56 localhost python3.9[193942]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64166 DF PROTO=TCP SPT=47656 DPT=9882 SEQ=1542806219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF888210000000001030307)
Feb 1 04:18:57 localhost python3.9[194055]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:57 localhost python3.9[194168]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:58 localhost python3.9[194281]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:18:59 localhost python3.9[194394]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:19:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63855 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=3109611722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF893DB0000000001030307)
Feb 1 04:19:00 localhost python3.9[194507]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:19:01 localhost python3.9[194620]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 1 04:19:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63857 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=3109611722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF89FF80000000001030307)
Feb 1 04:19:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63858 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=3109611722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8AFB80000000001030307)
Feb 1 04:19:07 localhost python3.9[194733]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:19:07 localhost python3.9[194879]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:19:08 localhost python3.9[195008]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58939 DF PROTO=TCP SPT=44910 DPT=9100 SEQ=1868905718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8B7B80000000001030307)
Feb 1 04:19:09 localhost python3.9[195148]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:19:10 localhost python3.9[195258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:19:10 localhost python3.9[195368]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:19:11 localhost python3.9[195476]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 1 04:19:12 localhost python3.9[195586]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:19:13 localhost python3.9[195676]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937551.9609752-1662-200344834382399/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:19:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:19:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
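The recurring kernel DROPPING records are netfilter LOG output; the "DROPPING:" prefix is set by the firewall rule's log prefix, and the interesting fields are the flow tuple. A minimal parser sketch, assuming the field layout shown in the records above:

    import re

    # Leading spaces in " SRC=" and " PROTO=" keep the pattern from
    # matching the MACSRC= and MACPROTO= fields earlier in each record.
    DROP = re.compile(
        r"DROPPING: IN=(?P<dev>\S*)"
        r".* SRC=(?P<src>\S+) DST=(?P<dst>\S+)"
        r".* PROTO=(?P<proto>\S+) SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+)"
    )

    def dropped_flows(lines):
        """Yield (in_dev, src, dst, proto, sport, dport) per DROPPING record."""
        for line in lines:
            m = DROP.search(line)
            if m:
                yield (m["dev"], m["src"], m["dst"],
                       m["proto"], int(m["spt"]), int(m["dpt"]))

Note that the same SEQ values recur across drops (e.g. SEQ=3109611722 at 04:19:00, 04:19:03 and 04:19:07), so these are SYN retransmissions of the same blocked connections toward ports 9100-9105 and 9882 rather than distinct attempts.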
Feb 1 04:19:13 localhost podman[195787]: 2026-02-01 09:19:13.970428979 +0000 UTC m=+0.120634517 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:19:14 localhost podman[195787]: 2026-02-01 09:19:14.00345813 +0000 UTC m=+0.153663688 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:19:14 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:19:14 localhost python3.9[195786]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:14 localhost podman[195804]: 2026-02-01 09:19:14.056520695 +0000 UTC m=+0.084871188 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:19:14 localhost podman[195804]: 2026-02-01 09:19:14.09396436 +0000 UTC m=+0.122314813 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller) Feb 1 04:19:14 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:19:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26022 DF PROTO=TCP SPT=42676 DPT=9100 SEQ=763976542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8CBB80000000001030307) Feb 1 04:19:14 localhost python3.9[195918]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937553.5287318-1662-9989773416232/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63859 DF PROTO=TCP SPT=41894 DPT=9101 SEQ=3109611722 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8CFB80000000001030307) Feb 1 04:19:15 localhost python3.9[196028]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:16 localhost python3.9[196118]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937554.8798206-1662-61719504725952/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:16 localhost python3.9[196228]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:17 localhost python3.9[196318]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937556.3027415-1662-60486938339059/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:18 localhost python3.9[196428]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False 
checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:18 localhost python3.9[196518]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937557.549908-1662-35485151242422/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:19 localhost python3.9[196628]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:19 localhost python3.9[196718]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937558.7617545-1662-18753124785166/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:20 localhost python3.9[196828]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:21 localhost python3.9[196916]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937560.1859953-1662-211281036898881/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:21 localhost python3.9[197026]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26023 DF PROTO=TCP SPT=42676 DPT=9100 SEQ=763976542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8EBB80000000001030307) Feb 1 04:19:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8465 DF PROTO=TCP SPT=36482 DPT=9102 SEQ=1228658141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8EDB80000000001030307) Feb 1 04:19:23 localhost python3.9[197116]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937561.458384-1662-112107878930888/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:24 localhost python3.9[197226]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2260 DF PROTO=TCP SPT=60336 DPT=9882 SEQ=2191304184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8F5390000000001030307) Feb 1 04:19:25 localhost python3.9[197336]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:25 localhost python3.9[197446]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:26 localhost python3.9[197556]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2261 DF PROTO=TCP SPT=60336 DPT=9882 SEQ=2191304184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF8FD380000000001030307) Feb 1 04:19:27 localhost python3.9[197666]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:27 localhost python3.9[197776]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 1 04:19:28 localhost python3.9[197886]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:29 localhost python3.9[197996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:29 localhost python3.9[198106]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53975 DF PROTO=TCP SPT=41190 DPT=9101 SEQ=1787307462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9090A0000000001030307) Feb 1 04:19:30 localhost python3.9[198216]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:31 localhost python3.9[198326]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:31 localhost python3.9[198436]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:32 localhost python3.9[198546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:33 localhost python3.9[198656]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53977 DF PROTO=TCP SPT=41190 DPT=9101 SEQ=1787307462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF914F90000000001030307) Feb 1 04:19:33 localhost python3.9[198766]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:34 localhost python3.9[198876]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:35 localhost python3.9[198964]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937573.905702-2325-105908057664203/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:35 localhost python3.9[199074]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:36 localhost python3.9[199162]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937575.1691406-2325-93109434491721/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:37 localhost python3.9[199272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53978 DF 
PROTO=TCP SPT=41190 DPT=9101 SEQ=1787307462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF924B80000000001030307) Feb 1 04:19:37 localhost python3.9[199360]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937576.5206792-2325-168414464811219/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:38 localhost python3.9[199470]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:38 localhost python3.9[199558]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937577.790548-2325-248696090343933/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2263 DF PROTO=TCP SPT=60336 DPT=9882 SEQ=2191304184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF92DB80000000001030307) Feb 1 04:19:39 localhost python3.9[199668]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:40 localhost python3.9[199756]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937579.096799-2325-42823906055860/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:40 localhost python3.9[199866]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:41 localhost python3.9[199954]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937580.3304486-2325-265583855828981/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:19:41.675 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:19:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:19:41.676 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:19:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:19:41.677 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:19:41 localhost python3.9[200064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:42 localhost python3.9[200152]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937581.50422-2325-133876031363175/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:43 localhost python3.9[200262]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:43 localhost python3.9[200350]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937582.8853455-2325-95503570542246/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49191 DF PROTO=TCP SPT=44794 DPT=9100 SEQ=2489843824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF940B80000000001030307) Feb 1 04:19:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:19:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
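The tasks above follow one pattern per libvirt socket unit: create a systemd drop-in directory /etc/systemd/system/<unit>.socket.d, then copy an override.conf rendered from libvirt-socket.unit.j2 (every copy reports the same checksum 0bad41f409b4ee7e780a2a59dc18f5c84ed99826, so all sockets get identical content; the payload itself is not logged, content=NOT_LOGGING_PARAMETER). A minimal shell sketch that reproduces just the file layout observed in the log, with an empty placeholder standing in for the unlogged template output:

    # Sketch: recreate the drop-in layout logged above.
    # The override payload is not logged, so an empty file stands in
    # for the rendered libvirt-socket.unit.j2 template.
    for unit in virtlogd virtlogd-admin \
                virtnodedevd virtnodedevd-ro virtnodedevd-admin \
                virtproxyd virtproxyd-ro virtproxyd-admin \
                virtqemud virtqemud-ro virtqemud-admin \
                virtsecretd virtsecretd-ro virtsecretd-admin; do
        install -d -m 0755 -o root -g root "/etc/systemd/system/${unit}.socket.d"
        install -m 0644 -o root -g root /dev/null \
            "/etc/systemd/system/${unit}.socket.d/override.conf"
    done

Drop-ins are used here so the packaged socket units stay untouched; the overrides take effect only after the daemon-reload that the later restart tasks request.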
Feb 1 04:19:44 localhost podman[200461]: 2026-02-01 09:19:44.462763765 +0000 UTC m=+0.086614901 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 1 04:19:44 localhost podman[200461]: 2026-02-01 09:19:44.469216877 +0000 UTC m=+0.093067993 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 1 04:19:44 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:19:44 localhost podman[200462]: 2026-02-01 09:19:44.52676404 +0000 UTC m=+0.148443079 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:19:44 localhost python3.9[200460]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:44 localhost podman[200462]: 2026-02-01 09:19:44.563457162 +0000 UTC m=+0.185136301 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true) Feb 1 04:19:44 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:19:45 localhost python3.9[200587]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937584.0478024-2325-120887281501081/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53979 DF PROTO=TCP SPT=41190 DPT=9101 SEQ=1787307462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF945B80000000001030307) Feb 1 04:19:45 localhost python3.9[200697]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:46 localhost python3.9[200785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937585.2472026-2325-32037369703070/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:47 localhost python3.9[200895]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:47 localhost python3.9[200983]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937586.508477-2325-43567734852279/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:48 localhost python3.9[201093]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:49 localhost python3.9[201181]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1769937587.8132992-2325-95118070317755/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:49 localhost python3.9[201291]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:50 localhost python3.9[201379]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937589.2242944-2325-212387187025749/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:51 localhost python3.9[201489]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:19:51 localhost python3.9[201577]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937590.5146332-2325-270818066701194/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:19:52 localhost python3.9[201685]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:19:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49192 DF PROTO=TCP SPT=44794 DPT=9100 SEQ=2489843824 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF961B80000000001030307) Feb 1 04:19:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35935 DF PROTO=TCP SPT=53910 DPT=9102 SEQ=257127918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF963B80000000001030307) Feb 1 04:19:53 localhost python3.9[201798]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Feb 1 04:19:54 localhost python3.9[201908]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 
04:19:54 localhost systemd[1]: Reloading. Feb 1 04:19:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4164 DF PROTO=TCP SPT=46918 DPT=9882 SEQ=2183438787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF96A980000000001030307) Feb 1 04:19:55 localhost systemd-sysv-generator[201936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:19:55 localhost systemd-rc-local-generator[201931]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:55 localhost systemd[1]: Starting libvirt logging daemon socket... Feb 1 04:19:55 localhost systemd[1]: Listening on libvirt logging daemon socket. Feb 1 04:19:55 localhost systemd[1]: Starting libvirt logging daemon admin socket... Feb 1 04:19:55 localhost systemd[1]: Listening on libvirt logging daemon admin socket. Feb 1 04:19:55 localhost systemd[1]: Starting libvirt logging daemon... Feb 1 04:19:55 localhost systemd[1]: Started libvirt logging daemon. Feb 1 04:19:56 localhost python3.9[202060]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:19:56 localhost systemd[1]: Reloading. Feb 1 04:19:56 localhost systemd-rc-local-generator[202086]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:19:56 localhost systemd-sysv-generator[202090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
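Two things happen in the span above: the os_enable_vtpm SELinux boolean is set persistently, and virtlogd is restarted with daemon_reload=True so the new socket drop-ins are picked up. The repeated "Failed to parse service type, ignoring: notify-reload" warnings mean the shipped units declare Type=notify-reload, which this systemd (v252 on RHEL 9.2) predates; support arrived in v253, so systemd ignores the directive and the warning is cosmetic. A sketch of the equivalent manual sequence, assuming ansible.posix.seboolean maps to setsebool as it normally does:

    # Equivalent of the logged tasks, done by hand (sketch):
    setsebool -P os_enable_vtpm on      # persistent boolean, as ansible.posix.seboolean sets it
    systemctl daemon-reload             # pick up the new socket drop-ins
    systemctl restart virtlogd.service  # its socket units start as dependencies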
Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:56 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4165 DF PROTO=TCP SPT=46918 DPT=9882 SEQ=2183438787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF972790000000001030307) Feb 1 04:19:57 localhost systemd[1]: Starting libvirt nodedev daemon socket... Feb 1 04:19:57 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Feb 1 04:19:57 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Feb 1 04:19:57 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Feb 1 04:19:57 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. Feb 1 04:19:57 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Feb 1 04:19:57 localhost systemd[1]: Started libvirt nodedev daemon. Feb 1 04:19:57 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 1 04:19:57 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Feb 1 04:19:57 localhost setroubleshoot[202104]: Deleting alert 77c42ba3-11b3-418c-ae14-6a879a7ca831, it is allowed in current policy Feb 1 04:19:57 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Feb 1 04:19:57 localhost python3.9[202243]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:19:57 localhost systemd[1]: Reloading. Feb 1 04:19:57 localhost systemd-sysv-generator[202273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:19:57 localhost systemd-rc-local-generator[202269]: /etc/rc.d/rc.local is not marked executable, skipping. 
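As the virtnodedevd restart shows, each modular libvirt daemon is socket-activated through up to three units: a main socket, a read-only -ro socket, and an -admin socket, all brought online before the service itself (virtlogd above had only main and admin). A quick way to confirm the listeners after these restarts, sketched here rather than taken from the log:

    # Sketch: confirm the libvirt socket units are listening after the restarts
    systemctl list-sockets 'virt*'            # main / -ro / -admin sockets per daemon
    systemctl is-active virtnodedevd.service  # the daemon itself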
Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:58 localhost systemd[1]: Starting libvirt proxy daemon socket... Feb 1 04:19:58 localhost systemd[1]: Listening on libvirt proxy daemon socket. Feb 1 04:19:58 localhost systemd[1]: Starting libvirt proxy daemon admin socket... Feb 1 04:19:58 localhost systemd[1]: Starting libvirt proxy daemon read-only socket... Feb 1 04:19:58 localhost systemd[1]: Listening on libvirt proxy daemon admin socket. Feb 1 04:19:58 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket. Feb 1 04:19:58 localhost systemd[1]: Started libvirt proxy daemon. Feb 1 04:19:58 localhost setroubleshoot[202104]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 17780c35-8d12-466f-af37-39af95aae931 Feb 1 04:19:58 localhost setroubleshoot[202104]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Feb 1 04:19:58 localhost setroubleshoot[202104]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. 
For complete SELinux messages run: sealert -l 17780c35-8d12-466f-af37-39af95aae931 Feb 1 04:19:58 localhost setroubleshoot[202104]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012 Feb 1 04:19:59 localhost python3.9[202417]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:19:59 localhost systemd[1]: Reloading. Feb 1 04:19:59 localhost systemd-rc-local-generator[202439]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:19:59 localhost systemd-sysv-generator[202445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:19:59 localhost systemd[1]: Listening on libvirt locking daemon socket. Feb 1 04:19:59 localhost systemd[1]: Starting libvirt QEMU daemon socket... Feb 1 04:19:59 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Feb 1 04:19:59 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... 
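The setroubleshoot alert above (logged twice, with newlines escaped as #012) reports an AVC denial: virtlogd wants the dac_read_search capability. Decoded, the alert offers two remediations, restated here as runnable commands taken verbatim from the alert text:

    # From the setroubleshoot alert above (decoded from the #012-escaped text):
    sealert -l 17780c35-8d12-466f-af37-39af95aae931    # full message
    # Option 1 (91.4 confidence): turn on full auditing, re-trigger the AVC,
    # then check ownership/permissions on the PATH record it reveals.
    auditctl -w /etc/shadow -p w
    ausearch -m avc -ts recent
    # Option 2 (catchall): generate and load a local policy module.
    ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
    semodule -X 300 -i my-virtlogd.pp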
Feb 1 04:19:59 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Feb 1 04:19:59 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. Feb 1 04:19:59 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Feb 1 04:19:59 localhost systemd[1]: Started libvirt QEMU daemon. Feb 1 04:20:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38876 DF PROTO=TCP SPT=51906 DPT=9101 SEQ=442772924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF97E470000000001030307) Feb 1 04:20:00 localhost python3.9[202599]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:20:00 localhost systemd[1]: Reloading. Feb 1 04:20:00 localhost systemd-rc-local-generator[202630]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:20:00 localhost systemd-sysv-generator[202633]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:00 localhost systemd[1]: Starting libvirt secret daemon socket... Feb 1 04:20:00 localhost systemd[1]: Listening on libvirt secret daemon socket. Feb 1 04:20:00 localhost systemd[1]: Starting libvirt secret daemon admin socket... Feb 1 04:20:00 localhost systemd[1]: Starting libvirt secret daemon read-only socket... Feb 1 04:20:00 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Feb 1 04:20:00 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Feb 1 04:20:00 localhost systemd[1]: Started libvirt secret daemon. 
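The same reload-and-restart pattern has now run for virtlogd, virtnodedevd, virtproxyd, virtqemud, and virtsecretd. With the sockets listening, the daemons start on demand; a purely illustrative check (not in the log) is to open a client connection, which socket-activates virtqemud:

    # Illustrative only: connecting triggers socket activation of virtqemud
    virsh -c qemu:///system version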
Feb 1 04:20:01 localhost python3.9[202781]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:02 localhost python3.9[202891]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 1 04:20:02 localhost python3.9[203001]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:20:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38878 DF PROTO=TCP SPT=51906 DPT=9101 SEQ=442772924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF98A380000000001030307) Feb 1 04:20:03 localhost python3.9[203113]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 1 04:20:04 localhost python3.9[203221]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:05 localhost python3.9[203307]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937604.2588716-3189-275449478366167/.source.xml follow=False _original_basename=secret.xml.j2 checksum=8e79ccae86c93336b3974fdc11794b13702e9d6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:06 localhost python3.9[203417]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 33fac0b9-80c7-560f-918a-c92d3021ca1e#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:20:06 localhost python3.9[203537]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 1 04:20:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38879 DF PROTO=TCP SPT=51906 DPT=9101 SEQ=442772924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF999F80000000001030307) Feb 1 04:20:08 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Feb 1 04:20:08 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Feb 1 04:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4167 DF PROTO=TCP SPT=46918 DPT=9882 SEQ=2183438787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9A1B90000000001030307) Feb 1 04:20:09 localhost sshd[203848]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:20:11 localhost python3.9[203960]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:11 localhost python3.9[204070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:12 localhost python3.9[204158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937611.4518087-3354-248601151551312/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:13 localhost python3.9[204268]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:14 localhost python3.9[204378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53473 DF PROTO=TCP SPT=40246 DPT=9100 SEQ=2888351650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9B5F90000000001030307) Feb 1 04:20:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
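The Ceph block above prepares libvirt's storage secret: it locates *.conf and *.keyring files under /var/lib/openstack/config/ceph, extracts the cluster fsid, writes /tmp/secret.xml (payload not logged), swaps the libvirt secret, and removes the temp file. Condensed from the logged commands, with the secret UUID taken from the virsh call above:

    # Sketch of the logged Ceph secret rotation (secret.xml payload not logged;
    # in the real run the fsid feeds the secret.xml template).
    fsid=$(awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs)
    virsh secret-undefine 33fac0b9-80c7-560f-918a-c92d3021ca1e
    virsh secret-define --file /tmp/secret.xml
    rm -f /tmp/secret.xml

Writing the secret to a 0600 temp file and deleting it immediately after secret-define keeps the key material off disk once libvirt has it.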
Feb 1 04:20:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:20:14 localhost python3.9[204435]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:14 localhost podman[204436]: 2026-02-01 09:20:14.793078348 +0000 UTC m=+0.147985172 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:20:14 localhost podman[204437]: 2026-02-01 09:20:14.755291047 +0000 UTC m=+0.109865299 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:20:14 localhost podman[204436]: 2026-02-01 09:20:14.82839416 +0000 UTC m=+0.183300954 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 1 04:20:14 localhost podman[204437]: 2026-02-01 09:20:14.837309871 +0000 UTC m=+0.191884063 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:20:14 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:20:14 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:20:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38880 DF PROTO=TCP SPT=51906 DPT=9101 SEQ=442772924 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9B9B80000000001030307) Feb 1 04:20:15 localhost python3.9[204590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:15 localhost python3.9[204647]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ycuv0acl recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:16 localhost python3.9[204757]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:17 localhost python3.9[204814]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:17 localhost python3.9[204924]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:20:18 localhost python3[205035]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 1 04:20:19 localhost python3.9[205145]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:20 localhost python3.9[205202]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root 
dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:21 localhost python3.9[205312]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53474 DF PROTO=TCP SPT=40246 DPT=9100 SEQ=2888351650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9D5B80000000001030307) Feb 1 04:20:22 localhost python3.9[205402]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937620.4816878-3621-11164525868770/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8626 DF PROTO=TCP SPT=46760 DPT=9102 SEQ=4180898009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9D7B90000000001030307) Feb 1 04:20:23 localhost python3.9[205512]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:23 localhost python3.9[205569]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9868 DF PROTO=TCP SPT=33672 DPT=9882 SEQ=2884184244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9DFB80000000001030307) Feb 1 04:20:25 localhost python3.9[205679]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:25 localhost python3.9[205736]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None 
src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:26 localhost python3.9[205846]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9869 DF PROTO=TCP SPT=33672 DPT=9882 SEQ=2884184244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9E7B90000000001030307) Feb 1 04:20:27 localhost python3.9[205936]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937625.9944718-3738-134020246472941/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:27 localhost python3.9[206046]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:28 localhost python3.9[206156]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:20:29 localhost python3.9[206269]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15692 DF PROTO=TCP SPT=60170 DPT=9101 SEQ=1010449723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9F3710000000001030307) Feb 1 04:20:30 localhost python3.9[206379]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:20:31 localhost python3.9[206490]: ansible-ansible.builtin.stat Invoked with 
path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:20:31 localhost python3.9[206602]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:20:32 localhost python3.9[206715]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15694 DF PROTO=TCP SPT=60170 DPT=9101 SEQ=1010449723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CF9FF780000000001030307) Feb 1 04:20:33 localhost python3.9[206825]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:34 localhost python3.9[206913]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937633.0086365-3954-199039464586312/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:34 localhost python3.9[207023]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:35 localhost python3.9[207111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937634.32045-3999-58496679065707/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:37 localhost python3.9[207221]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:20:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15695 DF PROTO=TCP SPT=60170 DPT=9101 SEQ=1010449723 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA0F380000000001030307) Feb 1 04:20:37 localhost python3.9[207309]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937636.467946-4044-105237220782491/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:20:38 localhost python3.9[207419]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:20:38 localhost systemd[1]: Reloading. Feb 1 04:20:38 localhost systemd-sysv-generator[207445]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:20:38 localhost systemd-rc-local-generator[207441]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:38 localhost systemd[1]: Reached target edpm_libvirt.target. Feb 1 04:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8627 DF PROTO=TCP SPT=46760 DPT=9102 SEQ=4180898009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA17B90000000001030307) Feb 1 04:20:40 localhost python3.9[207569]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 1 04:20:40 localhost systemd[1]: Reloading. Feb 1 04:20:40 localhost systemd-rc-local-generator[207594]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 1 04:20:40 localhost systemd-sysv-generator[207598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: Reloading. Feb 1 04:20:40 localhost systemd-rc-local-generator[207633]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:20:40 localhost systemd-sysv-generator[207638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:20:41 localhost systemd[1]: session-52.scope: Deactivated successfully. Feb 1 04:20:41 localhost systemd[1]: session-52.scope: Consumed 3min 35.520s CPU time. 
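The block above (04:20:17–04:20:32) is the edpm_nftables role's staged firewall apply: rule fragments are rendered into /etc/nftables/ (chains, flushes, rules, update-jumps, jumps), their concatenation is syntax-checked with `nft -c -f -` before anything touches the kernel, the include lines are anchored into /etc/sysconfig/nftables.conf (itself validated with `nft -c -f %s`), and only then are the chains created and the rules loaded. A minimal shell sketch of that check-then-apply sequence, using the file names seen in the log (the sketch is an approximation, not the role's actual task code):

    # 1. Dry run: -c parses and validates the whole ruleset without applying it.
    cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft \
        /etc/nftables/edpm-jumps.nft | nft -c -f - || exit 1
    # 2. Create the managed chains so the flush/rule fragments have a target.
    nft -f /etc/nftables/edpm-chains.nft
    # 3. Flush and repopulate the managed chains in one pass.
    cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft | nft -f -

Because each `nft -f` invocation is applied as a single transaction, the flush and the replacement rules in step 3 either all take effect or none do, which is why the fragments are concatenated into one stream rather than loaded file by file.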
Feb 1 04:20:41 localhost systemd-logind[759]: Session 52 logged out. Waiting for processes to exit. Feb 1 04:20:41 localhost systemd-logind[759]: Removed session 52. Feb 1 04:20:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:20:41.677 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:20:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:20:41.678 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:20:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:20:41.679 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:20:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10906 DF PROTO=TCP SPT=54928 DPT=9100 SEQ=3226742828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA2B380000000001030307) Feb 1 04:20:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15696 DF PROTO=TCP SPT=60170 DPT=9101 SEQ=1010449723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA2FB90000000001030307) Feb 1 04:20:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:20:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:20:45 localhost systemd[1]: tmp-crun.bY3thg.mount: Deactivated successfully. 
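The transient "/usr/bin/podman healthcheck run <container-id>" services that systemd keeps starting and deactivating here are podman's timer-driven container healthchecks: each run execs the test configured for the container (per the config_data in the entries that follow, the mounted /openstack/healthcheck script) and exits with the check's status, which podman records as the health_status=healthy events below. The same check can be driven by hand; a small sketch, using container names taken from the log:

    # Run the configured healthcheck once; exit status 0 means healthy.
    podman healthcheck run ovn_controller
    # Read back the last recorded health state (the inspect field is
    # .State.Health on recent podman releases, .State.Healthcheck on older ones).
    podman inspect --format '{{.State.Health.Status}}' ovn_controller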
Feb 1 04:20:45 localhost podman[207662]: 2026-02-01 09:20:45.739171291 +0000 UTC m=+0.093469325 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible) Feb 1 04:20:45 localhost podman[207662]: 2026-02-01 09:20:45.77326244 +0000 UTC m=+0.127560474 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:20:45 localhost systemd[1]: tmp-crun.K0zbnw.mount: Deactivated successfully. Feb 1 04:20:45 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:20:45 localhost podman[207663]: 2026-02-01 09:20:45.791870022 +0000 UTC m=+0.145771194 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:20:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:20:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 
0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 
level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x561d72da42d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Feb 1 04:20:45 localhost podman[207663]: 2026-02-01 09:20:45.912545513 +0000 UTC m=+0.266446675 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0) Feb 1 04:20:45 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:20:46 localhost sshd[207706]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:20:46 localhost systemd-logind[759]: New session 53 of user zuul. Feb 1 04:20:46 localhost systemd[1]: Started Session 53 of User zuul. Feb 1 04:20:47 localhost python3.9[207817]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:20:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:20:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.1 0.01 0.00 1 0.009 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55976bb302d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 
0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Feb 1 04:20:49 localhost python3.9[207929]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:20:49 localhost network[207946]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:20:49 localhost network[207947]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:20:49 localhost network[207948]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:20:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 04:20:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10907 DF PROTO=TCP SPT=54928 DPT=9100 SEQ=3226742828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA4BB90000000001030307) Feb 1 04:20:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47126 DF PROTO=TCP SPT=49618 DPT=9102 SEQ=1244612894 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA4DB80000000001030307) Feb 1 04:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46045 DF PROTO=TCP SPT=43930 DPT=9882 SEQ=1984126860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA54B80000000001030307) Feb 1 04:20:55 localhost python3.9[208180]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:20:56 localhost python3.9[208243]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:20:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46046 DF PROTO=TCP SPT=43930 DPT=9882 SEQ=1984126860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA5CB90000000001030307) Feb 1 04:21:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18528 DF PROTO=TCP SPT=56400 DPT=9101 SEQ=470557700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA689A0000000001030307) Feb 1 04:21:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18530 DF PROTO=TCP SPT=56400 DPT=9101 SEQ=470557700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA74B80000000001030307) Feb 1 04:21:06 localhost python3.9[208355]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:21:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18531 DF PROTO=TCP SPT=56400 DPT=9101 SEQ=470557700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA84780000000001030307) Feb 1 04:21:07 localhost python3.9[208467]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True 
src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:08 localhost python3.9[208577]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:21:08 localhost python3.9[208688]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10908 DF PROTO=TCP SPT=54928 DPT=9100 SEQ=3226742828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFA8BB80000000001030307) Feb 1 04:21:09 localhost python3.9[208799]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:21:10 localhost python3.9[208910]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:21:11 localhost python3.9[209058]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:12 localhost python3.9[209203]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:21:12 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket. Feb 1 04:21:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13497 DF PROTO=TCP SPT=43744 DPT=9100 SEQ=1244732753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAA0780000000001030307) Feb 1 04:21:14 localhost python3.9[209331]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:21:14 localhost systemd[1]: Reloading. Feb 1 04:21:14 localhost systemd-rc-local-generator[209357]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 1 04:21:14 localhost systemd-sysv-generator[209362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:15 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi). Feb 1 04:21:15 localhost systemd[1]: Starting Open-iSCSI... Feb 1 04:21:15 localhost iscsid[209371]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Feb 1 04:21:15 localhost iscsid[209371]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Feb 1 04:21:15 localhost iscsid[209371]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Feb 1 04:21:15 localhost iscsid[209371]: If using hardware iscsi like qla4xxx this message can be ignored. Feb 1 04:21:15 localhost iscsid[209371]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Feb 1 04:21:15 localhost iscsid[209371]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Feb 1 04:21:15 localhost iscsid[209371]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf Feb 1 04:21:15 localhost systemd[1]: Started Open-iSCSI. Feb 1 04:21:15 localhost systemd[1]: Starting Logout of all iSCSI sessions on shutdown... Feb 1 04:21:15 localhost systemd[1]: Finished Logout of all iSCSI sessions on shutdown.
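The iscsid "can't open" warnings above are worth reading carefully: the one-time configuration for iscsi.service was skipped because /etc/iscsi/initiatorname.iscsi already exists (the ConditionPathExists=! condition was unmet), and /etc/iscsi/iscsid.conf was edited moments earlier at 04:21:11, so the files are present. What fails is access, which lines up with the SELinux denials setroubleshoot reports at 04:21:19 below: configs adopted out of /var/lib/config-data/puppet-generated plausibly carried labels iscsid is not allowed to search, and the restorecon at 04:21:09 is the intended remedy. One plausible way to confirm a labeling problem (a diagnostic sketch, not part of the play):

    # Show the current SELinux labels on the adopted config files ...
    ls -Z /etc/iscsi/
    # ... and verify them against what policy expects; matchpathcon -V reports
    # "verified" or the expected context for each mismatched path.
    matchpathcon -V /etc/iscsi/iscsid.conf /etc/iscsi/initiatorname.iscsi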
Feb 1 04:21:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18532 DF PROTO=TCP SPT=56400 DPT=9101 SEQ=470557700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAA5B80000000001030307) Feb 1 04:21:16 localhost python3.9[209480]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:21:16 localhost network[209497]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:21:16 localhost network[209498]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:21:16 localhost network[209499]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:21:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:21:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:21:16 localhost podman[209505]: 2026-02-01 09:21:16.333115021 +0000 UTC m=+0.090984894 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:21:16 localhost podman[209505]: 2026-02-01 09:21:16.368423512 +0000 UTC m=+0.126293445 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent) Feb 1 04:21:16 localhost podman[209506]: 2026-02-01 09:21:16.376599432 +0000 UTC m=+0.134191217 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:21:16 localhost podman[209506]: 2026-02-01 09:21:16.421454675 +0000 UTC m=+0.179046500 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127)
Feb 1 04:21:16 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:21:16 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:21:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:21:17 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 1 04:21:18 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 1 04:21:18 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Feb 1 04:21:19 localhost setroubleshoot[209602]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 3359f0a1-0166-4be0-89d9-de5063e3b1d4
Feb 1 04:21:19 localhost setroubleshoot[209602]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
    ***** Plugin catchall (100. confidence) suggests **************************
    If you believe that iscsid should be allowed search access on the iscsi directory by default.
    Then you should report this as a bug.
    You can generate a local policy module to allow this access.
    Do
    allow this access for now by executing:
    # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
    # semodule -X 300 -i my-iscsid.pp
Feb 1 04:21:22 localhost python3.9[209791]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 1 04:21:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22894 DF PROTO=TCP SPT=53706 DPT=9102 SEQ=937970713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAC1B80000000001030307)
Feb 1 04:21:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13498 DF PROTO=TCP SPT=43744 DPT=9100 SEQ=1244732753 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAC1B80000000001030307)
Feb 1 04:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9125 DF PROTO=TCP SPT=55064 DPT=9882 SEQ=1602416877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAC9F80000000001030307)
Feb 1 04:21:26 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 1 04:21:26 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 1 04:21:26 localhost systemd[1]: Reloading.
Feb 1 04:21:26 localhost systemd-sysv-generator[209840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:21:26 localhost systemd-rc-local-generator[209834]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
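The setroubleshoot records above carry their own remediation (the #012 sequences in the raw syslog are escaped newlines). For reference, the quoted commands as they would be run on this host; the module name my-iscsid comes from the suggestion itself:

    sealert -l 3359f0a1-0166-4be0-89d9-de5063e3b1d4       # full denial details
    ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid  # build a local policy module
    semodule -X 300 -i my-iscsid.pp                        # install it at priority 300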
Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:26 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:26 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 04:21:26 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 04:21:26 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 04:21:26 localhost systemd[1]: Finished man-db-cache-update.service. Feb 1 04:21:26 localhost systemd[1]: run-r0f0ec2738f97409e91bc84b24d47d102.service: Deactivated successfully. Feb 1 04:21:26 localhost systemd[1]: run-r08bfe722d7e64e6e88bb7ce8c154bc1a.service: Deactivated successfully. Feb 1 04:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9126 DF PROTO=TCP SPT=55064 DPT=9882 SEQ=1602416877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAD1F90000000001030307) Feb 1 04:21:28 localhost python3.9[210081]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 1 04:21:29 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully. Feb 1 04:21:29 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
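The recurring kernel DROPPING records throughout this capture are standard netfilter LOG output; IN=, SRC=, DST=, SPT=, DPT= and the rest are the kernel's fixed field layout. The ruleset itself never appears in the log, so the following is only a plausible sketch of a LOG-then-DROP pair on br-ex that would produce messages with this prefix and these destination ports:

    # Hypothetical rules; the log shows only the resulting kernel messages
    iptables -A INPUT -i br-ex -p tcp -m multiport --dports 9100,9101,9102,9882 \
        -j LOG --log-prefix "DROPPING: "
    iptables -A INPUT -i br-ex -p tcp -m multiport --dports 9100,9101,9102,9882 -j DROP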
Feb 1 04:21:29 localhost python3.9[210191]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Feb 1 04:21:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27477 DF PROTO=TCP SPT=56324 DPT=9101 SEQ=3525418886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFADDCA0000000001030307) Feb 1 04:21:30 localhost python3.9[210306]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:21:30 localhost python3.9[210394]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937689.564282-483-258321186229648/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:31 localhost python3.9[210504]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:32 localhost python3.9[210614]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:21:32 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 1 04:21:32 localhost systemd[1]: Stopped Load Kernel Modules. Feb 1 04:21:32 localhost systemd[1]: Stopping Load Kernel Modules... Feb 1 04:21:32 localhost systemd[1]: Starting Load Kernel Modules... Feb 1 04:21:32 localhost systemd-modules-load[210618]: Module 'msr' is built in Feb 1 04:21:32 localhost systemd[1]: Finished Load Kernel Modules. 
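Taken together, the modprobe, modules-load.d, and /etc/modules tasks above load dm-multipath immediately and persist it across reboots. A minimal manual equivalent, assuming the rendered template contains just the module name (the log records only its checksum):

    modprobe dm-multipath
    echo dm-multipath > /etc/modules-load.d/dm-multipath.conf   # content assumed
    echo dm-multipath >> /etc/modules
    systemctl restart systemd-modules-load.service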
Feb 1 04:21:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27479 DF PROTO=TCP SPT=56324 DPT=9101 SEQ=3525418886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAE9B80000000001030307) Feb 1 04:21:34 localhost python3.9[210728]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:21:34 localhost sshd[210747]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:21:35 localhost python3.9[210841]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:21:36 localhost python3.9[210951]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:21:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27480 DF PROTO=TCP SPT=56324 DPT=9101 SEQ=3525418886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFAF9790000000001030307) Feb 1 04:21:37 localhost python3.9[211039]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937696.3434575-636-14699305366224/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:37 localhost python3.9[211149]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:21:38 localhost python3.9[211260]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9128 DF PROTO=TCP SPT=55064 DPT=9882 SEQ=1602416877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB01B90000000001030307) Feb 1 04:21:39 localhost python3.9[211370]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:40 localhost python3.9[211480]: ansible-ansible.builtin.replace Invoked 
with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:41 localhost python3.9[211590]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:21:41.678 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:21:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:21:41.680 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:21:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:21:41.682 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:21:41 localhost python3.9[211700]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:42 localhost python3.9[211810]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:43 localhost python3.9[211920]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:44 localhost python3.9[212030]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:21:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16151 DF PROTO=TCP SPT=50048 DPT=9100 SEQ=745107316 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A1CFB15780000000001030307) Feb 1 04:21:45 localhost python3.9[212142]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:21:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27481 DF PROTO=TCP SPT=56324 DPT=9101 SEQ=3525418886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB19B80000000001030307) Feb 1 04:21:45 localhost python3.9[212253]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:21:45 localhost systemd[1]: Listening on multipathd control socket. Feb 1 04:21:46 localhost python3.9[212367]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:21:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:21:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:21:46 localhost systemd[1]: Starting Wait for udev To Complete Device Initialization... Feb 1 04:21:47 localhost udevadm[212392]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in. Feb 1 04:21:47 localhost podman[212369]: 2026-02-01 09:21:47.029751771 +0000 UTC m=+0.073928124 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:21:47 localhost systemd[1]: Finished Wait for udev To Complete Device Initialization. Feb 1 04:21:47 localhost podman[212369]: 2026-02-01 09:21:47.040297103 +0000 UTC m=+0.084473466 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent) Feb 1 04:21:47 localhost systemd[1]: Starting Device-Mapper Multipath Device Controller... 
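The lineinfile and replace tasks logged between 04:21:37 and 04:21:43 assemble the /etc/multipath.conf that multipathd is about to read: an empty blacklist block plus four settings inserted after the defaults line. Reconstructed from those task parameters alone (the rest of the file is never shown in the log), the managed portion plausibly ends up as:

    defaults {
        find_multipaths yes
        recheck_wwid yes
        skip_kpartx yes
        user_friendly_names no
    }
    blacklist {
    }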
Feb 1 04:21:47 localhost podman[212370]: 2026-02-01 09:21:47.051257088 +0000 UTC m=+0.090858720 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:21:47 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:21:47 localhost multipathd[212410]: --------start up-------- Feb 1 04:21:47 localhost multipathd[212410]: read /etc/multipath.conf Feb 1 04:21:47 localhost multipathd[212410]: path checkers start up Feb 1 04:21:47 localhost systemd[1]: Started Device-Mapper Multipath Device Controller. 
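multipathd.socket was enabled and started before multipathd.service, so the daemon is also socket-activated; the systemd-udev-settle deprecation warning at 04:21:47 is emitted because multipathd.service still pulls that unit in. A quick manual check, assuming no multipath maps exist yet on this node:

    systemctl status multipathd.socket multipathd.service
    multipath -ll   # list multipath maps; likely empty on a freshly configured node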
Feb 1 04:21:47 localhost podman[212370]: 2026-02-01 09:21:47.11048131 +0000 UTC m=+0.150082902 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:21:47 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:21:48 localhost python3.9[212533]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 1 04:21:49 localhost python3.9[212643]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Feb 1 04:21:50 localhost python3.9[212761]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:21:51 localhost python3.9[212849]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937710.3440192-1026-108644422205166/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:52 localhost python3.9[212959]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:21:52 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16152 DF PROTO=TCP SPT=50048 DPT=9100 SEQ=745107316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB35B90000000001030307) Feb 1 04:21:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23040 DF PROTO=TCP SPT=35106 DPT=9102 SEQ=3408963132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB37B80000000001030307) Feb 1 04:21:53 localhost python3.9[213069]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:21:53 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 1 04:21:53 localhost systemd[1]: Stopped Load Kernel Modules. Feb 1 04:21:53 localhost systemd[1]: Stopping Load Kernel Modules... Feb 1 04:21:53 localhost systemd[1]: Starting Load Kernel Modules... Feb 1 04:21:53 localhost systemd-modules-load[213073]: Module 'msr' is built in Feb 1 04:21:53 localhost systemd[1]: Finished Load Kernel Modules. Feb 1 04:21:54 localhost python3.9[213183]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42987 DF PROTO=TCP SPT=45970 DPT=9882 SEQ=2812075096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB3F380000000001030307) Feb 1 04:21:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42988 DF PROTO=TCP SPT=45970 DPT=9882 SEQ=2812075096 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB47380000000001030307) Feb 1 04:21:58 localhost systemd[1]: Reloading. Feb 1 04:21:58 localhost systemd-sysv-generator[213224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:21:58 localhost systemd-rc-local-generator[213220]: /etc/rc.d/rc.local is not marked executable, skipping. 
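The 04:21:49 to 04:21:54 tasks repeat the dm-multipath pattern for NVMe over fabrics and then install the nvme-cli tooling. The manual equivalent, with the same caveat that the rendered .conf content is assumed to be just the module name:

    modprobe nvme-fabrics
    echo nvme-fabrics > /etc/modules-load.d/nvme-fabrics.conf   # content assumed
    dnf install -y nvme-cli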
Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: virtproxyd.service: Deactivated successfully. Feb 1 04:21:58 localhost systemd[1]: Reloading. Feb 1 04:21:58 localhost systemd-rc-local-generator[213253]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:21:58 localhost systemd-sysv-generator[213256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:58 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button) Feb 1 04:21:58 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Feb 1 04:21:58 localhost lvm[213305]: PV /dev/loop3 online, VG ceph_vg0 is complete. 
Feb 1 04:21:58 localhost lvm[213304]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 1 04:21:58 localhost lvm[213304]: VG ceph_vg1 finished Feb 1 04:21:58 localhost lvm[213305]: VG ceph_vg0 finished Feb 1 04:21:58 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 1 04:21:58 localhost systemd[1]: Starting man-db-cache-update.service... Feb 1 04:21:58 localhost systemd[1]: Reloading. Feb 1 04:21:58 localhost systemd-rc-local-generator[213353]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:21:58 localhost systemd-sysv-generator[213356]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:21:59 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 1 04:21:59 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 1 04:21:59 localhost systemd[1]: Finished man-db-cache-update.service. Feb 1 04:21:59 localhost systemd[1]: man-db-cache-update.service: Consumed 1.149s CPU time. Feb 1 04:21:59 localhost systemd[1]: run-r194a123dd2574347ae1ea99810c7b3ed.service: Deactivated successfully. Feb 1 04:22:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56863 DF PROTO=TCP SPT=53390 DPT=9101 SEQ=3430849650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB52FA0000000001030307) Feb 1 04:22:01 localhost python3.9[214614]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:22:01 localhost systemd[1]: Stopping Device-Mapper Multipath Device Controller... Feb 1 04:22:01 localhost multipathd[212410]: exit (signal) Feb 1 04:22:01 localhost multipathd[212410]: --------shut down------- Feb 1 04:22:01 localhost systemd[1]: multipathd.service: Deactivated successfully. Feb 1 04:22:01 localhost systemd[1]: Stopped Device-Mapper Multipath Device Controller. 
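The multipathd stop/start pair around this point is ansible.builtin.systemd_service with state=restarted, picking up the freshly edited /etc/multipath.conf. An ad-hoc CLI equivalent as a sketch; compute0 is a placeholder inventory host, not a name from the log:

    ansible compute0 -b -m ansible.builtin.systemd_service -a "name=multipathd state=restarted"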
Feb 1 04:22:01 localhost systemd[1]: Starting Device-Mapper Multipath Device Controller... Feb 1 04:22:01 localhost multipathd[214620]: --------start up-------- Feb 1 04:22:01 localhost multipathd[214620]: read /etc/multipath.conf Feb 1 04:22:01 localhost multipathd[214620]: path checkers start up Feb 1 04:22:01 localhost systemd[1]: Started Device-Mapper Multipath Device Controller. Feb 1 04:22:02 localhost python3.9[214735]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:22:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56865 DF PROTO=TCP SPT=53390 DPT=9101 SEQ=3430849650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB5EF90000000001030307) Feb 1 04:22:03 localhost python3.9[214849]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:04 localhost python3.9[214959]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:22:04 localhost systemd[1]: Reloading. Feb 1 04:22:04 localhost systemd-rc-local-generator[214984]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:22:04 localhost systemd-sysv-generator[214989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
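insights-client.service and insights-client-boot.service still use MemoryLimit=, the cgroup-v1 directive systemd has deprecated in favor of MemoryMax=. A drop-in override would silence the repeated warning; the value below is a placeholder, since the log never shows the unit's actual limit:

    systemctl edit insights-client.service
    # In the drop-in editor:
    #   [Service]
    #   MemoryLimit=
    #   MemoryMax=512M   # placeholder; keep whatever limit the unit intended
    systemctl daemon-reload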
Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:04 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:05 localhost python3.9[215103]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:22:05 localhost network[215120]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:22:05 localhost network[215121]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:22:05 localhost network[215122]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:22:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:22:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56866 DF PROTO=TCP SPT=53390 DPT=9101 SEQ=3430849650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB6EB80000000001030307) Feb 1 04:22:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16153 DF PROTO=TCP SPT=50048 DPT=9100 SEQ=745107316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB75D20000000001030307) Feb 1 04:22:09 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
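The repeated "Failed to parse service type, ignoring: notify-reload" warnings mean the libvirt modular daemons ship unit files with Type=notify-reload, which this host's systemd predates (the type was introduced in systemd 253); the directive is ignored and the daemons run anyway. To confirm on such a host:

    systemctl --version    # Type=notify-reload requires systemd >= 253
    grep '^Type=' /usr/lib/systemd/system/virtqemud.service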
Feb 1 04:22:10 localhost python3.9[215356]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:11 localhost python3.9[215467]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:12 localhost python3.9[215578]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:12 localhost python3.9[215689]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:13 localhost python3.9[215852]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11176 DF PROTO=TCP SPT=47664 DPT=9100 SEQ=556045361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB8AB80000000001030307) Feb 1 04:22:15 localhost python3.9[215980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56867 DF PROTO=TCP SPT=53390 DPT=9101 SEQ=3430849650 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFB8FB90000000001030307) Feb 1 04:22:16 localhost python3.9[216109]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:22:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:22:17 localhost systemd[1]: tmp-crun.YdRnKd.mount: Deactivated successfully. 
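The 04:22:10 to 04:22:18 tasks stop and disable the legacy tripleo_nova_* units ahead of deleting their unit files. The CLI equivalent for one of them (the log repeats this for migration_target, api_cron, api, conductor, metadata, scheduler, and vnc_proxy):

    systemctl disable --now tripleo_nova_compute.service
    systemctl daemon-reload   # once the unit files have been removed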
Feb 1 04:22:17 localhost podman[216180]: 2026-02-01 09:22:17.759050376 +0000 UTC m=+0.104804406 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller) Feb 1 04:22:17 localhost podman[216180]: 2026-02-01 09:22:17.800723022 +0000 UTC m=+0.146477002 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller) Feb 1 04:22:17 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:22:17 localhost systemd[1]: tmp-crun.mcACsW.mount: Deactivated successfully. 
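The podman health_status=healthy records come from the transient systemd units wrapping "/usr/bin/podman healthcheck run <container-id>", which execute the 'test': '/openstack/healthcheck' command declared in config_data. The same check can be run by container name; note that on older podman releases the inspect field is .State.Healthcheck.Status rather than .State.Health.Status:

    podman healthcheck run ovn_controller && echo healthy
    podman inspect --format '{{.State.Health.Status}}' ovn_controller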
Feb 1 04:22:17 localhost podman[216174]: 2026-02-01 09:22:17.843318757 +0000 UTC m=+0.189082637 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:22:17 localhost podman[216174]: 2026-02-01 09:22:17.873039335 +0000 UTC m=+0.218803215 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 1 04:22:17 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:22:18 localhost python3.9[216261]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:22:20 localhost python3.9[216372]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:21 localhost python3.9[216482]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:21 localhost python3.9[216592]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:22 localhost python3.9[216702]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11177 DF PROTO=TCP SPT=47664 DPT=9100 SEQ=556045361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBABB80000000001030307) Feb 1 04:22:23 localhost python3.9[216812]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a 
MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51009 DF PROTO=TCP SPT=60058 DPT=9102 SEQ=4201635467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBADB90000000001030307) Feb 1 04:22:23 localhost python3.9[216922]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:24 localhost python3.9[217032]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:24 localhost python3.9[217142]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18008 DF PROTO=TCP SPT=49756 DPT=9882 SEQ=1032992402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBB4780000000001030307) Feb 1 04:22:25 localhost python3.9[217252]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:26 localhost python3.9[217362]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:26 localhost python3.9[217472]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18009 DF PROTO=TCP SPT=49756 DPT=9882 SEQ=1032992402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBBC780000000001030307) Feb 1 04:22:27 localhost journal[202103]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, ) Feb 1 04:22:27 localhost journal[202103]: hostname: np0005604212.localdomain Feb 1 04:22:27 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Feb 1 04:22:27 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 04:22:27 localhost journal[202103]: Make forcefull daemon shutdown Feb 1 04:22:27 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:22:27 localhost systemd[1]: virtnodedevd.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:22:27 localhost systemd[1]: virtnodedevd.service: Failed with result 'exit-code'. Feb 1 04:22:27 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:22:27 localhost systemd[1]: virtnodedevd.service: Scheduled restart job, restart counter is at 1. Feb 1 04:22:27 localhost systemd[1]: Stopped libvirt nodedev daemon. Feb 1 04:22:27 localhost systemd[1]: Started libvirt nodedev daemon. Feb 1 04:22:27 localhost python3.9[217583]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:28 localhost python3.9[217716]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:28 localhost python3.9[217826]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:29 localhost sshd[217936]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:22:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16691 DF PROTO=TCP SPT=55866 DPT=9101 SEQ=146214834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBC82A0000000001030307) Feb 1 04:22:30 localhost python3.9[217937]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:30 localhost python3.9[218048]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:22:32 localhost python3.9[218158]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16693 DF PROTO=TCP SPT=55866 DPT=9101 SEQ=146214834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBD4390000000001030307) Feb 1 04:22:33 localhost python3.9[218268]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 1 04:22:34 localhost python3.9[218378]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:22:34 localhost systemd[1]: Reloading. Feb 1 04:22:34 localhost systemd-rc-local-generator[218404]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:22:34 localhost systemd-sysv-generator[218407]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:22:35 localhost python3.9[218525]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:36 localhost python3.9[218636]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:36 localhost python3.9[218747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16694 DF PROTO=TCP SPT=55866 DPT=9101 SEQ=146214834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBE3F80000000001030307) Feb 1 04:22:37 localhost python3.9[218858]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:38 localhost python3.9[218969]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:38 localhost python3.9[219080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11178 DF PROTO=TCP SPT=47664 DPT=9100 SEQ=556045361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBEBB80000000001030307) Feb 1 04:22:39 localhost python3.9[219191]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None 
removes=None stdin=None Feb 1 04:22:39 localhost python3.9[219302]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:22:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:22:41.679 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:22:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:22:41.681 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:22:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:22:41.682 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:22:42 localhost python3.9[219413]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:43 localhost python3.9[219523]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35768 DF PROTO=TCP SPT=41438 DPT=9100 SEQ=4174259046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFBFFF80000000001030307) Feb 1 04:22:44 localhost python3.9[219633]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16695 DF PROTO=TCP SPT=55866 DPT=9101 SEQ=146214834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC03B80000000001030307) Feb 1 04:22:45 localhost python3.9[219743]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 
owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:46 localhost python3.9[219853]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:46 localhost python3.9[219963]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:47 localhost python3.9[220073]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:47 localhost python3.9[220183]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:22:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:22:48 localhost podman[220295]: 2026-02-01 09:22:48.518180759 +0000 UTC m=+0.095214780 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible) Feb 1 04:22:48 localhost podman[220295]: 2026-02-01 09:22:48.559643059 +0000 UTC m=+0.136677160 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:22:48 localhost systemd[1]: tmp-crun.MKm98l.mount: Deactivated successfully. Feb 1 04:22:48 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:22:48 localhost python3.9[220293]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:48 localhost podman[220294]: 2026-02-01 09:22:48.571867596 +0000 UTC m=+0.148235897 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 1 04:22:48 localhost podman[220294]: 2026-02-01 09:22:48.656522159 +0000 UTC m=+0.232890480 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 1 04:22:48 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:22:49 localhost python3.9[220446]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:22:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35769 DF PROTO=TCP SPT=41438 DPT=9100 SEQ=4174259046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC1FB80000000001030307) Feb 1 04:22:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3356 DF PROTO=TCP SPT=35200 DPT=9102 SEQ=2922423407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC21B90000000001030307) Feb 1 04:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10053 DF PROTO=TCP SPT=54658 DPT=9882 SEQ=3518513039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC29780000000001030307) Feb 1 04:22:56 localhost python3.9[220556]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Feb 1 04:22:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10054 DF PROTO=TCP SPT=54658 DPT=9882 SEQ=3518513039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC31790000000001030307) Feb 1 04:22:57 localhost python3.9[220667]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 1 04:22:58 localhost python3.9[220783]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604212.localdomain update_password=always home=None 
password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Feb 1 04:22:59 localhost sshd[220809]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:22:59 localhost systemd-logind[759]: New session 54 of user zuul. Feb 1 04:22:59 localhost systemd[1]: Started Session 54 of User zuul. Feb 1 04:22:59 localhost systemd[1]: session-54.scope: Deactivated successfully. Feb 1 04:22:59 localhost systemd-logind[759]: Session 54 logged out. Waiting for processes to exit. Feb 1 04:22:59 localhost systemd-logind[759]: Removed session 54. Feb 1 04:23:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17107 DF PROTO=TCP SPT=52962 DPT=9101 SEQ=3510488391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC3D5A0000000001030307) Feb 1 04:23:00 localhost python3.9[220920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:01 localhost python3.9[221006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937780.202557-2609-19082568404987/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:01 localhost python3.9[221114]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:02 localhost python3.9[221169]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:03 localhost python3.9[221277]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17109 DF PROTO=TCP SPT=52962 DPT=9101 SEQ=3510488391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC49780000000001030307) Feb 1 04:23:03 localhost python3.9[221363]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937782.550605-2609-203624412166695/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:04 localhost python3.9[221471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:04 localhost python3.9[221557]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937783.6393676-2609-62265377123694/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=81fac5bfb76f59376b169cd323b581eaa2259497 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:05 localhost python3.9[221665]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:05 localhost python3.9[221751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937784.750467-2609-27009937925670/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:06 localhost python3.9[221859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17110 DF PROTO=TCP SPT=52962 DPT=9101 SEQ=3510488391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC59390000000001030307) Feb 1 04:23:07 localhost python3.9[221945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937785.8933346-2609-76969579514808/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10056 DF PROTO=TCP SPT=54658 DPT=9882 SEQ=3518513039 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC61B80000000001030307) Feb 1 04:23:09 localhost python3.9[222055]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:23:10 localhost python3.9[222165]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:23:13 localhost python3.9[222275]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:23:13 localhost python3.9[222387]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:23:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34760 DF PROTO=TCP SPT=33872 DPT=9100 SEQ=179414624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC75390000000001030307) Feb 1 04:23:14 localhost python3.9[222495]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:23:15 localhost python3.9[222605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17111 DF PROTO=TCP SPT=52962 DPT=9101 SEQ=3510488391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC79B80000000001030307) Feb 1 04:23:15 localhost python3.9[222691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937794.9458976-2984-254061093372133/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None 
attributes=None Feb 1 04:23:16 localhost python3.9[222849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:23:17 localhost python3.9[222952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937796.1142805-3029-193768059620666/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:23:18 localhost python3.9[223080]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Feb 1 04:23:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:23:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:23:18 localhost podman[223098]: 2026-02-01 09:23:18.740744247 +0000 UTC m=+0.097930665 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:23:18 localhost podman[223098]: 2026-02-01 09:23:18.789543334 +0000 UTC m=+0.146729712 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 1 04:23:18 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:23:18 localhost podman[223131]: 2026-02-01 09:23:18.840527288 +0000 UTC m=+0.092387055 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:23:18 localhost podman[223131]: 2026-02-01 09:23:18.870979712 +0000 UTC m=+0.122839499 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:23:18 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:23:19 localhost python3.9[223234]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 1 04:23:20 localhost python3[223344]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Feb 1 04:23:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34761 DF PROTO=TCP SPT=33872 DPT=9100 SEQ=179414624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC95B80000000001030307) Feb 1 04:23:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17269 DF PROTO=TCP SPT=37530 DPT=9102 SEQ=1598013957 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC97B80000000001030307) Feb 1 04:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39092 DF PROTO=TCP SPT=38340 DPT=9882 SEQ=578346032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFC9EB90000000001030307) Feb 1 04:23:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39093 DF PROTO=TCP SPT=38340 DPT=9882 SEQ=578346032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCA6B90000000001030307) Feb 1 04:23:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54272 DF PROTO=TCP SPT=60826 DPT=9101 
SEQ=3983246002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCB28B0000000001030307) Feb 1 04:23:31 localhost podman[223357]: 2026-02-01 09:23:20.81924833 +0000 UTC m=+0.044623050 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 1 04:23:31 localhost podman[223418]: Feb 1 04:23:31 localhost podman[223418]: 2026-02-01 09:23:31.297821611 +0000 UTC m=+0.072340420 container create 119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=nova_compute_init, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:23:31 localhost podman[223418]: 2026-02-01 09:23:31.256754971 +0000 UTC m=+0.031273810 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 1 04:23:31 localhost python3[223344]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Feb 1 04:23:32 localhost python3.9[223566]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True 
get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:23:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54274 DF PROTO=TCP SPT=60826 DPT=9101 SEQ=3983246002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCBE780000000001030307) Feb 1 04:23:33 localhost python3.9[223678]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Feb 1 04:23:34 localhost python3.9[223788]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 1 04:23:35 localhost python3[223898]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Feb 1 04:23:35 localhost python3[223898]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",#012 "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:31:38.534497001Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1214548351,#012 "VirtualSize": 1214548351,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 
"sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",#012 "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 
{#012 Feb 1 04:23:36 localhost podman[223949]: 2026-02-01 09:23:36.045701072 +0000 UTC m=+0.065525551 container remove 5995785e4c08cea02919622c91d29b812b1b0ff815238da5474f0e94d3460032 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 'a46ef4c25933bba0e125120095b56cb6-9ec539c069b98a16ced7663e9b12641d'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=) Feb 1 04:23:36 localhost python3[223898]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Feb 1 04:23:36 localhost podman[223963]: Feb 1 04:23:36 localhost podman[223963]: 2026-02-01 09:23:36.144769501 +0000 UTC m=+0.081909814 
container create eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Feb 1 04:23:36 localhost podman[223963]: 2026-02-01 09:23:36.095244011 +0000 UTC m=+0.032384394 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 1 04:23:36 localhost python3[223898]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume 
/etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Feb 1 04:23:37 localhost python3.9[224110]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:23:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54275 DF PROTO=TCP SPT=60826 DPT=9101 SEQ=3983246002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCCE390000000001030307) Feb 1 04:23:37 localhost python3.9[224222]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:23:38 localhost python3.9[224332]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937818.0784104-3317-56989162499485/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34762 DF PROTO=TCP SPT=33872 DPT=9100 SEQ=179414624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCD5B90000000001030307) Feb 1 04:23:39 localhost python3.9[224387]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:23:39 localhost systemd[1]: Reloading. Feb 1 04:23:39 localhost systemd-sysv-generator[224412]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:23:39 localhost systemd-rc-local-generator[224408]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:39 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost python3.9[224610]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:23:40 localhost systemd[1]: Reloading. Feb 1 04:23:40 localhost systemd-rc-local-generator[224638]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:23:40 localhost systemd-sysv-generator[224642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:23:40 localhost systemd[1]: Starting nova_compute container... Feb 1 04:23:41 localhost systemd[1]: tmp-crun.Rkl5oz.mount: Deactivated successfully. Feb 1 04:23:41 localhost systemd[1]: Started libcrun container. 
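
The kolla_set_configs output below is driven by the JSON the container mounts at /var/lib/kolla/config_files/config.json (see the nova_compute podman create above) together with KOLLA_CONFIG_STRATEGY=COPY_ALWAYS. A minimal Python sketch of that copy loop follows, assuming a simplified config.json schema of "config_files" entries with "source", "dest", "owner" and "perm" keys; it illustrates the strategy the "Deleting ... / Copying ... / Setting permission ..." lines suggest, not kolla's actual implementation:

    import json
    import shutil
    import subprocess
    from pathlib import Path

    # Bind-mounted read-only per the podman create above.
    CONFIG = Path("/var/lib/kolla/config_files/config.json")

    def copy_config_files():
        # COPY_ALWAYS: wipe and re-copy service configuration on every
        # container start, matching the INFO lines kolla_set_configs emits.
        spec = json.loads(CONFIG.read_text())
        for entry in spec.get("config_files", []):  # schema keys are assumptions
            src, dest = Path(entry["source"]), Path(entry["dest"])
            if dest.exists():
                print(f"INFO:__main__:Deleting {dest}")
                if dest.is_dir():
                    shutil.rmtree(dest)
                else:
                    dest.unlink()
            print(f"INFO:__main__:Copying {src} to {dest}")
            if src.is_dir():
                shutil.copytree(src, dest)
            else:
                shutil.copy2(src, dest)
            print(f"INFO:__main__:Setting permission for {dest}")
            # chown accepts "user" or "user:group"; perm is an octal string.
            subprocess.run(["chown", "-R", entry.get("owner", "root"), str(dest)],
                           check=True)
            dest.chmod(int(entry.get("perm", "0644"), 8))

    if __name__ == "__main__":
        copy_config_files()
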
Feb 1 04:23:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:41 localhost podman[224650]: 2026-02-01 09:23:41.106478101 +0000 UTC m=+0.150010942 container init eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:23:41 localhost podman[224650]: 2026-02-01 09:23:41.117647633 +0000 UTC m=+0.161180464 container start eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 
'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2) Feb 1 04:23:41 localhost podman[224650]: nova_compute Feb 1 04:23:41 localhost nova_compute[224665]: + sudo -E kolla_set_configs Feb 1 04:23:41 localhost systemd[1]: Started nova_compute container. Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Validating config file Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying service configuration files Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Deleting /etc/ceph Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Creating directory /etc/ceph Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/ceph Feb 1 04:23:41 localhost 
nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Writing out command to execute Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:41 localhost nova_compute[224665]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 1 04:23:41 localhost nova_compute[224665]: ++ cat /run_command Feb 1 04:23:41 localhost nova_compute[224665]: + CMD=nova-compute Feb 1 04:23:41 localhost nova_compute[224665]: + ARGS= Feb 1 04:23:41 localhost nova_compute[224665]: + sudo kolla_copy_cacerts Feb 1 04:23:41 localhost nova_compute[224665]: + [[ ! -n '' ]] Feb 1 04:23:41 localhost nova_compute[224665]: + . 
kolla_extend_start Feb 1 04:23:41 localhost nova_compute[224665]: Running command: 'nova-compute' Feb 1 04:23:41 localhost nova_compute[224665]: + echo 'Running command: '\''nova-compute'\''' Feb 1 04:23:41 localhost nova_compute[224665]: + umask 0022 Feb 1 04:23:41 localhost nova_compute[224665]: + exec nova-compute Feb 1 04:23:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:23:41.680 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:23:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:23:41.681 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:23:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:23:41.682 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:23:42 localhost python3.9[224785]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:23:42 localhost nova_compute[224665]: 2026-02-01 09:23:42.904 224669 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:23:42 localhost nova_compute[224665]: 2026-02-01 09:23:42.904 224669 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:23:42 localhost nova_compute[224665]: 2026-02-01 09:23:42.905 224669 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:23:42 localhost nova_compute[224665]: 2026-02-01 09:23:42.905 224669 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.036 224669 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.056 224669 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.056 224669 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.453 224669 INFO nova.virt.driver [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.572 224669 INFO nova.compute.provider_config [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.582 224669 WARNING nova.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.582 224669 DEBUG oslo_concurrency.lockutils [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.582 224669 DEBUG oslo_concurrency.lockutils [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.582 224669 DEBUG oslo_concurrency.lockutils [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.583 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.583 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.583 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.583 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.583 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.583 224669 DEBUG oslo_service.service 
[None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.584 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] console_host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.585 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.586 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] default_schedule_zone = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.587 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] 
initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.588 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.589 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost 
nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.590 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.591 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.592 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.593 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.594 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.595 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.596 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.597 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.598 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.599 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.600 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.601 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.602 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.603 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.604 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.605 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.606 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.607 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.608 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.608 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.608 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.608 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.608 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.609 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.610 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.611 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.612 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.613 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.614 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.614 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.614 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.614 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.614 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.614 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.615 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.616 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.617 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.618 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.619 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.619 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.619 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.619 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.619 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.619 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.620 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.621 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.622 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.623 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.624 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.625 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.626 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.num_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.627 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.628 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.verify_glance_signatures = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.629 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.630 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.631 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.632 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.632 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.632 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.632 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.632 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
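Everything on either side of this point is a single startup artifact: nova-compute is running with debug logging enabled, so at service start oslo.config walks every registered option group and emits one DEBUG record per option through ConfigOpts.log_opt_values() — the trailing "log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609" on each record is that method's call site, and options registered with secret=True are masked as **** (api_database.slave_connection above, key_manager.fixed_key below). A minimal, self-contained sketch of the mechanism, assuming a hand-registered subset of the [api_database] group rather than nova's real option definitions:

import logging
from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

conf = cfg.ConfigOpts()
conf.register_opts([
    cfg.IntOpt('connection_recycle_time', default=3600),
    cfg.BoolOpt('connection_trace', default=False),
    cfg.StrOpt('slave_connection', secret=True),  # secret opts print as ****
], group='api_database')

conf([])                                 # parse an (empty) command line
conf.log_opt_values(LOG, logging.DEBUG)  # one "group.option = value" line per opt

The dump itself is informational; when reading it, the non-default values are the signal — e.g. glance.valid_interfaces = ['internal'] and glance.region_name = regionOne above, and the rbd and libvirt settings further down.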
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.633 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.endpoint_override = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.634 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.635 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.status_code_retry_delay = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.636 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.637 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.638 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.639 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 
09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.640 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 
04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.641 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.642 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 
localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.643 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.644 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.645 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.646 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.647 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.648 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 
04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.649 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.650 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.650 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.650 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.650 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.650 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.650 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] 
libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.651 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.651 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.651 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.651 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.651 224669 WARNING oslo_config.cfg [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 1 04:23:43 localhost nova_compute[224665]: live_migration_uri is deprecated for removal in favor of two other options that Feb 1 04:23:43 localhost nova_compute[224665]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 1 04:23:43 localhost nova_compute[224665]: and ``live_migration_inbound_addr`` respectively. Feb 1 04:23:43 localhost nova_compute[224665]: ). 
Its value may be silently ignored in the future.#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.651 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.652 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 
09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.653 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.654 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.654 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.654 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.654 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rbd_secret_uuid = 33fac0b9-80c7-560f-918a-c92d3021ca1e log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.654 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.654 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rescue_ramdisk_id = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.655 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.656 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG 
oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.657 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.658 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.659 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.660 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] 
neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.661 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - 
- - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.662 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.663 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.663 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.663 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.663 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.663 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 
2026-02-01 09:23:43.664 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.665 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.666 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.667 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.668 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.668 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.668 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] 
placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.668 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.668 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.668 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.669 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.670 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.671 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.672 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.672 224669 DEBUG 
oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.672 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.672 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.672 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.672 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.673 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.674 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.675 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.676 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.677 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.678 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG 
oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.679 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.680 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.680 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.680 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.680 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.680 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.680 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 
localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.681 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.682 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.683 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.684 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.685 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG 
oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.686 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.687 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.687 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.687 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.687 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.687 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values 
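Every nova_compute record in this stream is emitted by oslo.config's ConfigOpts.log_opt_values() helper, which nova-compute calls once at startup to dump each registered option at DEBUG level; options registered with secret=True (vmware.host_password above, the transport URLs further on) print as ****. A minimal standalone sketch of the same mechanism, with option names that are illustrative rather than taken from nova:

    # Minimal standalone sketch of the mechanism behind this dump (assumed
    # example; nova registers a far larger option set than shown here).
    import logging

    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    opts = [
        cfg.IntOpt('host_port', default=443),
        cfg.StrOpt('host_password', secret=True),  # printed as '****'
    ]
    conf = cfg.ConfigOpts()
    conf.register_opts(opts, group='vmware')
    conf([])  # parse an empty command line

    # Emits one DEBUG record per option, bracketed by rows of asterisks,
    # exactly the shape of the nova_compute records in this log.
    conf.log_opt_values(LOG, logging.DEBUG)

The log_opt_values ... cfg.py:2609 tag closing each record is the per-option log call inside this build of oslo.config (the tag for the vnc.novncproxy_port record in progress continues just below); the closing row of asterisks much further on is logged from cfg.py:2613.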
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.687 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.688 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost 
nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.689 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.690 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.691 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.691 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost 
nova_compute[224665]: 2026-02-01 09:23:43.691 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.691 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.691 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.691 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.692 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.693 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.693 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: 
%(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.693 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.693 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.693 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.693 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.694 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.695 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 
09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.696 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.697 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 
2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.698 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.699 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG 
oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.700 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.701 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values 
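The [oslo_messaging_rabbit] section dumped above shows this deployment running quorum queues (rabbit_quorum_queue = True) with durable AMQP queues, and oslo_messaging_notifications.driver = ['noop'] means versioned notifications are discarded; the notification transport_url prints as **** because it embeds broker credentials. A sketch of how a service consumes these sections via oslo.messaging, in which the config-file path and publisher_id are assumptions for illustration (the location tag of the oslo_limit.connect_retry_delay record continues just below, and the [oslo_limit] keystone credentials section carries on after it):

    # Sketch only: how a service turns the sections above into messaging
    # objects. The config-file path and publisher_id are assumptions.
    from oslo_config import cfg

    import oslo_messaging

    conf = cfg.CONF
    conf(['--config-file', '/etc/nova/nova.conf'])  # assumed path

    # RPC transport built from transport_url (masked above) plus the
    # [oslo_messaging_rabbit] tuning: quorum queues, heartbeats, retries.
    transport = oslo_messaging.get_rpc_transport(conf)

    # With driver = ['noop'] as dumped above, anything sent through this
    # notifier is discarded rather than published.
    notifier = oslo_messaging.Notifier(
        oslo_messaging.get_notification_transport(conf),
        publisher_id='compute.example',  # illustrative
    )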
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.702 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] 
oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.703 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.704 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None 
req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.705 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.706 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 
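The vif_plug_linux_bridge_privileged section above configures an oslo.privsep daemon for os-vif (its trailing user option, and the matching vif_plug_ovs_privileged section, follow below); capabilities = [12] is CAP_NET_ADMIN in the standard Linux numbering. A sketch of how such a context is declared, modelled on the usual os-vif pattern rather than copied from its source:

    # Sketch modelled on the usual os-vif pattern (not copied from its
    # source): a privsep context matching the section dumped above.
    from oslo_privsep import capabilities as c
    from oslo_privsep import priv_context

    vif_plug = priv_context.PrivContext(
        'vif_plug_linux_bridge',
        cfg_section='vif_plug_linux_bridge_privileged',
        pypath=__name__ + '.vif_plug',
        capabilities=[c.CAP_NET_ADMIN],  # == 12, the value logged above
    )

    @vif_plug.entrypoint
    def add_bridge(name):
        # Body runs inside the privileged daemon, not the calling process.
        ...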
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.707 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.708 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.709 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG 
oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.710 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.711 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.712 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.712 224669 DEBUG oslo_service.service [None req-28548807-104c-4eaf-afd9-006be9ef8231 - - - - - -] ********************************************************************************
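The row of asterisks above ends the option dump (its closing cfg.py:2613 location tag follows just below), after which the service proper starts. The privsep capability lists in the last few sections are plain Linux capability numbers; decoded under the standard <linux/capability.h> numbering:

    # Decode of the capability lists dumped above, using the standard
    # Linux <linux/capability.h> numbering.
    CAP_NAMES = {0: 'CAP_CHOWN', 1: 'CAP_DAC_OVERRIDE',
                 2: 'CAP_DAC_READ_SEARCH', 3: 'CAP_FOWNER',
                 12: 'CAP_NET_ADMIN', 21: 'CAP_SYS_ADMIN'}
    SECTIONS = {
        'vif_plug_linux_bridge_privileged': [12],
        'vif_plug_ovs_privileged': [12, 1],
        'privsep_osbrick': [21],
        'nova_sys_admin': [0, 1, 2, 3, 12, 21],
    }
    for section, caps in SECTIONS.items():
        print(section, '->', ', '.join(CAP_NAMES[n] for n in caps))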
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.713 224669 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.725 224669 INFO nova.virt.node [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Determined node identity a04bda90-8ccd-4104-8518-038544ff1327 from /var/lib/nova/compute_id#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.726 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.727 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.727 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.727 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.737 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.740 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.742 224669 INFO nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Connection event '1' reason 'None'#033[00m Feb 1 04:23:43 localhost python3.9[224897]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.762 224669 DEBUG nova.virt.libvirt.volume.mount [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.764 224669 INFO nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Libvirt host capabilities Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: 9037fad6-143b-4373-b625-f89bce657827 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: x86_64 Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome-v4 Feb 
1 04:23:43 localhost nova_compute[224665]: [libvirt host capabilities XML, tags lost in capture; recoverable values: host CPU vendor AMD; migration URI transports tcp and rdma; NUMA cell memory 16116612 KiB with page counts 4029153 / 0 / 0 (likely 4 KiB / 2 MiB / 1 GiB page sizes, since 4029153 x 4 KiB = 16116612 KiB); security models selinux (doi 0, baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, baselabels +107:+107)]
Feb 1 04:23:43 localhost nova_compute[224665]: [guest support: arch i686, hvm, wordsize 32, emulator /usr/libexec/qemu-kvm, machines pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel{7.6,8.0,8.1,8.2,8.3,8.4,8.5,8.6,9.0,9.2,9.4,9.6,9.8}.0 (alias q35); arch x86_64, hvm, wordsize 64, same emulator and machine list]
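The stripped block above is libvirt's host capabilities document. As a hedged illustration only (this is not Nova's implementation, and the connection URI is an assumption), the same XML can be pulled straight from libvirtd with the Python bindings:

    import libvirt

    # Fetch the host capabilities XML that the condensed entry above was
    # extracted from. 'qemu:///system' is an assumed URI; Nova builds its
    # own connection from configuration.
    conn = libvirt.open('qemu:///system')
    print(conn.getCapabilities())  # raw <capabilities>...</capabilities> XML
    conn.close()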
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.770 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.790 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 1 04:23:43 localhost nova_compute[224665]: [domain capabilities XML, tags lost in capture; recoverable values: path /usr/libexec/qemu-kvm, domain kvm, machine pc-i440fx-rhel7.6.0, arch i686; os loader /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom and pflash, readonly yes/no, secure: no]
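The _get_machine_types call above asks libvirt for per-machine-type domain capabilities. A minimal sketch of the equivalent query through the libvirt Python bindings (the URI is an assumption; the emulator path, arch, machine types, and virt type are taken from the log lines):

    import libvirt

    conn = libvirt.open('qemu:///system')
    for machine in ('pc', 'q35'):
        # virConnectGetDomainCapabilities(emulatorbin, arch, machine, virttype)
        xml = conn.getDomainCapabilities('/usr/libexec/qemu-kvm', 'i686',
                                         machine, 'kvm', 0)
        print(xml)  # <domainCapabilities>...</domainCapabilities>
    conn.close()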
Feb 1 04:23:43 localhost nova_compute[224665]: [cpu section of the domain capabilities, tags lost in capture: two on/off enum value pairs (enum names not captured); host-model resolves to EPYC-Rome, vendor AMD; custom-mode CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1 (capture ends mid-list)]
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost 
nova_compute[224665]: SierraForest Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SierraForest-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SierraForest-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 
04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SierraForest-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Client Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Client-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Client-noTSX-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: 
Skylake-Client-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Client-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Client-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Client-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-noTSX-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Skylake-Server-v5 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Snowridge Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Snowridge-v1 Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Snowridge-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Snowridge-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Snowridge-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Westmere Feb 1 04:23:43 localhost nova_compute[224665]: Westmere-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Westmere-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Westmere-v2 Feb 1 04:23:43 localhost nova_compute[224665]: athlon Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: athlon-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: core2duo Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: core2duo-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: coreduo Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: coreduo-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: kvm32 Feb 1 04:23:43 localhost nova_compute[224665]: kvm32-v1 Feb 1 04:23:43 localhost 
nova_compute[224665]: kvm64 Feb 1 04:23:43 localhost nova_compute[224665]: kvm64-v1 Feb 1 04:23:43 localhost nova_compute[224665]: n270 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: n270-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: pentium Feb 1 04:23:43 localhost nova_compute[224665]: pentium-v1 Feb 1 04:23:43 localhost nova_compute[224665]: pentium2 Feb 1 04:23:43 localhost nova_compute[224665]: pentium2-v1 Feb 1 04:23:43 localhost nova_compute[224665]: pentium3 Feb 1 04:23:43 localhost nova_compute[224665]: pentium3-v1 Feb 1 04:23:43 localhost nova_compute[224665]: phenom Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: phenom-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: qemu32 Feb 1 04:23:43 localhost nova_compute[224665]: qemu32-v1 Feb 1 04:23:43 localhost nova_compute[224665]: qemu64 Feb 1 04:23:43 localhost nova_compute[224665]: qemu64-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: file Feb 1 04:23:43 localhost nova_compute[224665]: anonymous Feb 1 04:23:43 localhost nova_compute[224665]: memfd Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: disk Feb 1 04:23:43 localhost nova_compute[224665]: cdrom Feb 1 04:23:43 localhost nova_compute[224665]: floppy Feb 1 04:23:43 localhost nova_compute[224665]: lun Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: ide Feb 1 04:23:43 localhost nova_compute[224665]: fdc Feb 1 04:23:43 localhost nova_compute[224665]: scsi Feb 1 04:23:43 localhost nova_compute[224665]: virtio Feb 1 04:23:43 localhost nova_compute[224665]: usb Feb 1 04:23:43 localhost nova_compute[224665]: sata Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: virtio Feb 1 04:23:43 localhost nova_compute[224665]: virtio-transitional Feb 1 04:23:43 localhost nova_compute[224665]: virtio-non-transitional Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: vnc Feb 1 04:23:43 localhost nova_compute[224665]: egl-headless Feb 1 04:23:43 localhost nova_compute[224665]: dbus Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 
1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: subsystem Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: default Feb 1 04:23:43 localhost nova_compute[224665]: mandatory Feb 1 04:23:43 localhost nova_compute[224665]: requisite Feb 1 04:23:43 localhost nova_compute[224665]: optional Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: usb Feb 1 04:23:43 localhost nova_compute[224665]: pci Feb 1 04:23:43 localhost nova_compute[224665]: scsi Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: virtio Feb 1 04:23:43 localhost nova_compute[224665]: virtio-transitional Feb 1 04:23:43 localhost nova_compute[224665]: virtio-non-transitional Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: random Feb 1 04:23:43 localhost nova_compute[224665]: egd Feb 1 04:23:43 localhost nova_compute[224665]: builtin Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: path Feb 1 04:23:43 localhost nova_compute[224665]: handle Feb 1 04:23:43 localhost nova_compute[224665]: virtiofs Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: tpm-tis Feb 1 04:23:43 localhost nova_compute[224665]: tpm-crb Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: emulator Feb 1 04:23:43 localhost nova_compute[224665]: external Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: 2.0 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: usb Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: pty Feb 1 04:23:43 localhost nova_compute[224665]: unix Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: qemu Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: builtin Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 
04:23:43 localhost nova_compute[224665]: default Feb 1 04:23:43 localhost nova_compute[224665]: passt Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: isa Feb 1 04:23:43 localhost nova_compute[224665]: hyperv Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: null Feb 1 04:23:43 localhost nova_compute[224665]: vc Feb 1 04:23:43 localhost nova_compute[224665]: pty Feb 1 04:23:43 localhost nova_compute[224665]: dev Feb 1 04:23:43 localhost nova_compute[224665]: file Feb 1 04:23:43 localhost nova_compute[224665]: pipe Feb 1 04:23:43 localhost nova_compute[224665]: stdio Feb 1 04:23:43 localhost nova_compute[224665]: udp Feb 1 04:23:43 localhost nova_compute[224665]: tcp Feb 1 04:23:43 localhost nova_compute[224665]: unix Feb 1 04:23:43 localhost nova_compute[224665]: qemu-vdagent Feb 1 04:23:43 localhost nova_compute[224665]: dbus Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: relaxed Feb 1 04:23:43 localhost nova_compute[224665]: vapic Feb 1 04:23:43 localhost nova_compute[224665]: spinlocks Feb 1 04:23:43 localhost nova_compute[224665]: vpindex Feb 1 04:23:43 localhost nova_compute[224665]: runtime Feb 1 04:23:43 localhost nova_compute[224665]: synic Feb 1 04:23:43 localhost nova_compute[224665]: stimer Feb 1 04:23:43 localhost nova_compute[224665]: reset Feb 1 04:23:43 localhost nova_compute[224665]: vendor_id Feb 1 04:23:43 localhost nova_compute[224665]: frequencies Feb 1 04:23:43 localhost nova_compute[224665]: reenlightenment Feb 1 04:23:43 localhost nova_compute[224665]: tlbflush Feb 1 04:23:43 localhost nova_compute[224665]: ipi Feb 1 04:23:43 localhost nova_compute[224665]: avic Feb 1 04:23:43 localhost nova_compute[224665]: emsr_bitmap Feb 1 04:23:43 localhost nova_compute[224665]: xmm_input Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: 4095 Feb 1 04:23:43 localhost nova_compute[224665]: on Feb 1 04:23:43 localhost nova_compute[224665]: off Feb 1 04:23:43 localhost nova_compute[224665]: off Feb 1 04:23:43 localhost nova_compute[224665]: Linux KVM Hv Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: 
Feb 1 04:23:43 localhost nova_compute[224665]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.797 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 1 04:23:43 localhost nova_compute[224665]: [libvirt <domainCapabilities> XML dump; markup again lost in the log capture. Recoverable values:]
Feb 1 04:23:43 localhost nova_compute[224665]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: i686
Feb 1 04:23:43 localhost nova_compute[224665]: os loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; type: rom pflash; readonly: yes no; secure: no
Feb 1 04:23:43 localhost nova_compute[224665]: cpu hostPassthroughMigratable: on off; maximumMigratable: on off; host-model: EPYC-Rome, vendor AMD
Feb 1 04:23:43 localhost nova_compute[224665]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 ClearwaterForest ClearwaterForest-v1 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-Genoa-v2 EPYC-IBPB EPYC-Milan EPYC-Milan-v1
04:23:43 localhost nova_compute[224665]: EPYC-Milan-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Milan-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome-v4 Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Rome-v5 Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Turin Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-Turin-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-v1 Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-v2 Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: EPYC-v5 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: GraniteRapids Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: GraniteRapids-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: GraniteRapids-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: GraniteRapids-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell-noTSX Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 
04:23:43 localhost nova_compute[224665]: Haswell-noTSX-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Haswell-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-noTSX Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v5 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v6 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Icelake-Server-v7 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: KnightsMill Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: KnightsMill-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem-v2 Feb 1 04:23:43 localhost nova_compute[224665]: 
Opteron_G1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G1-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G2 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G2-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G3 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G3-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G4-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G5 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G5-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Penryn Feb 1 04:23:43 localhost nova_compute[224665]: Penryn-v1 Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge-v1 Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge-v2 Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: 
Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SierraForest Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SierraForest-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
Feb 1 04:23:43 localhost nova_compute[224665]: [domainCapabilities XML, continued; element markup lost in capture. Remaining CPU model entries: SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1. Memory backing source types: file, anonymous, memfd]
Feb 1 04:23:43 localhost nova_compute[224665]: [device capability sections; markup lost. Disk devices: disk, cdrom, floppy, lun; buses: fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional. Graphics: vnc, egl-headless, dbus. Hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsys types: usb, pci, scsi; models: virtio, virtio-transitional, virtio-non-transitional. RNG: random, egd, builtin. Filesystem: path, handle, virtiofs. TPM models: tpm-tis, tpm-crb; backends: emulator, external; version 2.0; usb. Channel types: pty, unix. Crypto backends: qemu, builtin. Interface backends: default, passt. Panic models: isa, hyperv. Char device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus]
Feb 1 04:23:43 localhost nova_compute[224665]: [hyperv feature entries; markup lost: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input. Remaining values: 4095, on, off, off, Linux KVM Hv]
Feb 1 04:23:43 localhost nova_compute[224665]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.838 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 1 04:23:43 localhost nova_compute[224665]: 2026-02-01 09:23:43.846 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 1 04:23:43 localhost nova_compute[224665]: [domainCapabilities XML for machine_type=pc; markup lost. Emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-i440fx-rhel7.6.0; arch: x86_64; firmware: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom, pflash; remaining values: yes, no, no]
Feb 1 04:23:43 localhost nova_compute[224665]: [host CPU section; markup lost. Remaining values: on, off; on, off; model EPYC-Rome, vendor AMD. Supported CPU model entries (list continues past this excerpt): 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, ...]
1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-noTSX Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 
04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v5 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v6 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Icelake-Server-v7 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge-IBRS Feb 1 04:23:43 localhost 
nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: IvyBridge-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: KnightsMill Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: KnightsMill-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Nehalem-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G1-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G2 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G2-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G3 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G3-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G4-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G5 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Opteron_G5-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Penryn Feb 1 04:23:43 localhost nova_compute[224665]: Penryn-v1 Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge Feb 1 04:23:43 localhost 
nova_compute[224665]: SandyBridge-IBRS Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge-v1 Feb 1 04:23:43 localhost nova_compute[224665]: SandyBridge-v2 Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v1 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v2 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v3 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 
04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SapphireRapids-v4 Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 
localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: SierraForest Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:43 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: SierraForest-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: SierraForest-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: 
Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: SierraForest-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client-noTSX-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: 
Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Client-v4 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-noTSX-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 
localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-v4 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Skylake-Server-v5 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Snowridge Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: 
Snowridge-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Snowridge-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Snowridge-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Snowridge-v4 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Westmere Feb 1 04:23:44 localhost nova_compute[224665]: Westmere-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Westmere-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Westmere-v2 Feb 1 04:23:44 localhost nova_compute[224665]: athlon Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: athlon-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: core2duo Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: core2duo-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: coreduo Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: coreduo-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: kvm32 Feb 1 04:23:44 localhost nova_compute[224665]: kvm32-v1 
Feb 1 04:23:44 localhost nova_compute[224665]: kvm64 Feb 1 04:23:44 localhost nova_compute[224665]: kvm64-v1 Feb 1 04:23:44 localhost nova_compute[224665]: n270 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: n270-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: pentium Feb 1 04:23:44 localhost nova_compute[224665]: pentium-v1 Feb 1 04:23:44 localhost nova_compute[224665]: pentium2 Feb 1 04:23:44 localhost nova_compute[224665]: pentium2-v1 Feb 1 04:23:44 localhost nova_compute[224665]: pentium3 Feb 1 04:23:44 localhost nova_compute[224665]: pentium3-v1 Feb 1 04:23:44 localhost nova_compute[224665]: phenom Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: phenom-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: qemu32 Feb 1 04:23:44 localhost nova_compute[224665]: qemu32-v1 Feb 1 04:23:44 localhost nova_compute[224665]: qemu64 Feb 1 04:23:44 localhost nova_compute[224665]: qemu64-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: file Feb 1 04:23:44 localhost nova_compute[224665]: anonymous Feb 1 04:23:44 localhost nova_compute[224665]: memfd Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: disk Feb 1 04:23:44 localhost nova_compute[224665]: cdrom Feb 1 04:23:44 localhost nova_compute[224665]: floppy Feb 1 04:23:44 localhost nova_compute[224665]: lun Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: ide Feb 1 04:23:44 localhost nova_compute[224665]: fdc Feb 1 04:23:44 localhost nova_compute[224665]: scsi Feb 1 04:23:44 localhost nova_compute[224665]: virtio Feb 1 04:23:44 localhost nova_compute[224665]: usb Feb 1 04:23:44 localhost nova_compute[224665]: sata Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: virtio Feb 1 04:23:44 localhost nova_compute[224665]: virtio-transitional Feb 1 04:23:44 localhost nova_compute[224665]: virtio-non-transitional Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: vnc Feb 1 04:23:44 localhost nova_compute[224665]: egl-headless Feb 1 04:23:44 localhost nova_compute[224665]: dbus Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: subsystem Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: default Feb 1 04:23:44 localhost nova_compute[224665]: mandatory Feb 1 04:23:44 localhost nova_compute[224665]: requisite Feb 1 04:23:44 localhost nova_compute[224665]: optional Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: usb Feb 1 04:23:44 localhost nova_compute[224665]: pci Feb 1 04:23:44 localhost nova_compute[224665]: scsi Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: virtio Feb 1 04:23:44 localhost nova_compute[224665]: virtio-transitional Feb 1 04:23:44 localhost nova_compute[224665]: virtio-non-transitional Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: random Feb 1 04:23:44 localhost nova_compute[224665]: egd Feb 1 04:23:44 localhost nova_compute[224665]: builtin Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: path Feb 1 04:23:44 localhost nova_compute[224665]: handle Feb 1 04:23:44 localhost nova_compute[224665]: virtiofs Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: tpm-tis Feb 1 04:23:44 localhost nova_compute[224665]: tpm-crb Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: emulator Feb 1 04:23:44 localhost nova_compute[224665]: external Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: 2.0 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: usb Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: pty Feb 1 04:23:44 localhost nova_compute[224665]: unix Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: qemu Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: builtin Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: default Feb 1 04:23:44 localhost nova_compute[224665]: passt Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: isa Feb 1 04:23:44 localhost nova_compute[224665]: hyperv Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: null Feb 1 04:23:44 localhost nova_compute[224665]: vc Feb 1 04:23:44 localhost nova_compute[224665]: pty Feb 1 04:23:44 localhost nova_compute[224665]: dev Feb 1 04:23:44 localhost nova_compute[224665]: file Feb 1 04:23:44 localhost nova_compute[224665]: pipe Feb 1 04:23:44 localhost nova_compute[224665]: stdio Feb 1 04:23:44 localhost nova_compute[224665]: udp Feb 1 04:23:44 localhost nova_compute[224665]: tcp Feb 1 04:23:44 localhost nova_compute[224665]: unix Feb 1 04:23:44 localhost nova_compute[224665]: qemu-vdagent Feb 1 04:23:44 localhost nova_compute[224665]: dbus Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: relaxed Feb 1 04:23:44 localhost nova_compute[224665]: vapic Feb 1 04:23:44 localhost nova_compute[224665]: spinlocks Feb 1 04:23:44 localhost nova_compute[224665]: vpindex Feb 1 04:23:44 localhost nova_compute[224665]: runtime Feb 1 04:23:44 localhost nova_compute[224665]: synic Feb 1 04:23:44 localhost nova_compute[224665]: stimer Feb 1 04:23:44 localhost nova_compute[224665]: reset Feb 1 04:23:44 localhost nova_compute[224665]: vendor_id Feb 1 04:23:44 localhost nova_compute[224665]: frequencies Feb 1 04:23:44 localhost nova_compute[224665]: reenlightenment Feb 1 04:23:44 localhost nova_compute[224665]: tlbflush Feb 1 04:23:44 localhost nova_compute[224665]: ipi Feb 1 04:23:44 localhost nova_compute[224665]: avic Feb 1 04:23:44 localhost nova_compute[224665]: emsr_bitmap Feb 1 04:23:44 localhost nova_compute[224665]: xmm_input Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: 4095 Feb 1 04:23:44 localhost nova_compute[224665]: on Feb 1 04:23:44 localhost nova_compute[224665]: off Feb 1 04:23:44 localhost nova_compute[224665]: off Feb 1 04:23:44 localhost nova_compute[224665]: Linux KVM Hv Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
Feb 1 04:23:44 localhost nova_compute[224665]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:43.921 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 1 04:23:44 localhost nova_compute[224665]: [domain capabilities XML, markup stripped in capture: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.8.0, arch x86_64; firmware efi, loader paths /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types rom, pflash; readonly yes, no; secure yes, no; CPU mode flags on/off, on/off; host-model EPYC-Rome, vendor AMD; recoverable CPU model names: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4 …]
Feb 1 04:23:44 localhost
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Cascadelake-Server-v5 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: ClearwaterForest Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: ClearwaterForest-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Conroe Feb 1 04:23:44 localhost nova_compute[224665]: Conroe-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Cooperlake Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Cooperlake-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 
04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Cooperlake-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Denverton Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Denverton-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Denverton-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Denverton-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Dhyana Feb 1 04:23:44 localhost nova_compute[224665]: Dhyana-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Dhyana-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Genoa Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 
localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Genoa-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Genoa-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-IBPB Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Milan Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Milan-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Milan-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Milan-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Rome Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Rome-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Rome-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Rome-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Rome-v4 Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Rome-v5 Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Turin Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 
localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-Turin-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-v1 Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-v2 Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-v4 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: EPYC-v5 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: GraniteRapids Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: GraniteRapids-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 
localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: GraniteRapids-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: GraniteRapids-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 
localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-noTSX Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-noTSX-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Haswell-v4 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-noTSX Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v3 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: 
Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v4 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v5 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v6 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 
localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Icelake-Server-v7 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: IvyBridge Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: IvyBridge-IBRS Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: IvyBridge-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: IvyBridge-v2 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: KnightsMill Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: KnightsMill-v1 Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost nova_compute[224665]: Feb 1 04:23:44 localhost 
nova_compute[224665]: [libvirt domainCapabilities dump continued; the XML markup was lost in the log capture, leaving only enum values interleaved with empty syslog prefixes. Recoverable CPU models: Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SapphireRapids-v4, SierraForest, SierraForest-v1, SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1.
Other recoverable enum values, in dump order (group labels inferred from the libvirt domainCapabilities schema): file, anonymous, memfd (memory backing); disk, cdrom, floppy, lun (disk device); fdc, scsi, virtio, usb, sata (disk bus); virtio, virtio-transitional, virtio-non-transitional (disk model); vnc, egl-headless, dbus (graphics); subsystem (hostdev mode); default, mandatory, requisite, optional (startupPolicy); usb, pci, scsi (hostdev subsys); virtio, virtio-transitional, virtio-non-transitional (rng model); random, egd, builtin (rng backend); path, handle, virtiofs (filesystem driver); tpm-tis, tpm-crb (TPM model); emulator, external (TPM backend); 2.0; usb; pty, unix; qemu; builtin; default, passt (interface backend); isa, hyperv (panic model); null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus (char device types); relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input (Hyper-V enlightenments); 4095, on, off, off, Linux KVM Hv] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.006 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.006 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.011 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.011 224669 INFO nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Secure Boot support detected#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.013 224669 INFO nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
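The block above is nova's DEBUG dump of libvirt's domainCapabilities XML. As a minimal sketch only, assuming libvirt-python is installed and a local qemu:///system socket is reachable, the same CPU-model list could be fetched outside nova like this:

    # Sketch: list the custom CPU models the hypervisor can emulate,
    # roughly the data _get_domain_capabilities logs above.
    import libvirt
    import xml.etree.ElementTree as ET

    conn = libvirt.open('qemu:///system')
    # arch/virttype mirror the host shown in the log (x86_64, KVM)
    caps_xml = conn.getDomainCapabilities(None, 'x86_64', None, 'kvm', 0)
    root = ET.fromstring(caps_xml)
    for model in root.findall("./cpu/mode[@name='custom']/model"):
        print(model.text, model.get('usable'))
    conn.close()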
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.014 224669 INFO nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.023 224669 DEBUG nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.071 224669 INFO nova.virt.node [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Determined node identity a04bda90-8ccd-4104-8518-038544ff1327 from /var/lib/nova/compute_id#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.092 224669 DEBUG nova.compute.manager [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Verified node a04bda90-8ccd-4104-8518-038544ff1327 matches my host np0005604212.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.134 224669 DEBUG nova.compute.manager [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.139 224669 DEBUG nova.virt.libvirt.vif [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T08:24:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005604212.localdomain',hostname='test',id=2,image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-01T08:24:22Z,launched_on='np0005604212.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005604212.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='79df39cba1c14309b68e8b61518619fd',ramdisk_id='',reservation_id='r-pgkx81ko',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-02-01T08:24:22Z,user_data=None,user_id='7567a560936c417c92d242d856b00bb3',uuid=08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.139 224669 DEBUG nova.network.os_vif_util [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Converting VIF {"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.140 224669 DEBUG nova.network.os_vif_util [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.142 224669 DEBUG os_vif [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.225 224669 DEBUG ovsdbapp.backend.ovs_idl [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.225 224669 DEBUG ovsdbapp.backend.ovs_idl [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.225 224669 DEBUG ovsdbapp.backend.ovs_idl [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.226 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.226 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.226 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.227 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.228 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.232 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.248 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.249 224669 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.249 224669 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.250 224669 INFO oslo.privsep.daemon [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpr12erx_r/privsep.sock']#033[00m
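The AddBridgeCommand/AddPortCommand transactions above are os-vif driving ovsdbapp. A rough sketch of issuing the same idempotent commands directly, assuming ovsdbapp is installed and ovsdb-server listens on tcp:127.0.0.1:6640 as the vlog lines show:

    # Sketch: replicate the logged OVSDB transaction with ovsdbapp.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # may_exist=True makes both commands idempotent, which is why the
        # log can report "Transaction caused no change" when re-run.
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap09cac1be-46', may_exist=True))
        txn.add(api.db_set('Interface', 'tap09cac1be-46',
                           ('external_ids',
                            {'iface-id': '09cac1be-46e2-4a31-8306-e6f4f0401b19'})))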
Feb 1 04:23:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18425 DF PROTO=TCP SPT=55774 DPT=9100 SEQ=907060204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCEA380000000001030307)
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.851 224669 INFO oslo.privsep.daemon [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Spawned new privsep daemon via rootwrap#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.729 224998 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.734 224998 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.737 224998 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Feb 1 04:23:44 localhost nova_compute[224665]: 2026-02-01 09:23:44.738 224998 INFO oslo.privsep.daemon [-] privsep daemon running as pid 224998#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.158 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.159 224669 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09cac1be-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.159 224669 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09cac1be-46, col_values=(('external_ids', {'iface-id': '09cac1be-46e2-4a31-8306-e6f4f0401b19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:11:63', 'vm-uuid': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.160 224669 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.161 224669 INFO os_vif [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46')#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.162 224669 DEBUG nova.compute.manager [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Feb 1 04:23:45 localhost python3.9[225038]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
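The kernel "DROPPING:" lines are netfilter LOG output in key=value form. A small stdlib sketch for tokenizing them, using the fields from the line above (valueless flags such as DF and empty fields such as OUT= are simply skipped by the pattern):

    # Sketch: parse a netfilter LOG line into a dict, e.g. to count
    # drops per destination port.
    import re

    line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
            "MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 "
            "DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 "
            "ID=18425 DF PROTO=TCP SPT=55774 DPT=9100")
    fields = dict(re.findall(r"(\w+)=(\S+)", line))
    print(fields["SRC"], "->", fields["DST"],
          "proto", fields["PROTO"], "dport", fields["DPT"])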
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.171 224669 DEBUG nova.compute.manager [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.172 224669 INFO nova.compute.manager [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Feb 1 04:23:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54276 DF PROTO=TCP SPT=60826 DPT=9101 SEQ=3983246002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFCEDB90000000001030307)
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.654 224669 INFO nova.service [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updating service version for nova-compute on np0005604212.localdomain from 57 to 66#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.681 224669 DEBUG oslo_concurrency.lockutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.682 224669 DEBUG oslo_concurrency.lockutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.682 224669 DEBUG oslo_concurrency.lockutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.682 224669 DEBUG nova.compute.resource_tracker [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:23:45 localhost nova_compute[224665]: 2026-02-01 09:23:45.683 224669 DEBUG oslo_concurrency.processutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.153 224669 DEBUG oslo_concurrency.processutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
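The resource audit shells out to ceph df via oslo.concurrency. A sketch of the same probe, assuming the openstack cephx id and /etc/ceph/ceph.conf from the log are present on the host; total_avail_bytes is the standard top-level key in ceph df JSON output:

    # Sketch: run the same "ceph df" probe the resource tracker logs above
    # and read the cluster-wide free space.
    import json
    from oslo_concurrency import processutils

    out, _err = processutils.execute(
        'ceph', 'df', '--format=json', '--id', 'openstack',
        '--conf', '/etc/ceph/ceph.conf')
    stats = json.loads(out)
    print(stats['stats']['total_avail_bytes'])  # cluster-wide free bytes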
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.225 224669 DEBUG nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.226 224669 DEBUG nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.391 224669 WARNING nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.392 224669 DEBUG nova.compute.resource_tracker [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12909MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.393 224669 DEBUG oslo_concurrency.lockutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.393 224669 DEBUG oslo_concurrency.lockutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.603 224669 DEBUG nova.compute.resource_tracker [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.604 224669 DEBUG nova.compute.resource_tracker [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.604 224669 DEBUG nova.compute.resource_tracker [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.669 224669 DEBUG nova.scheduler.client.report [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.691 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.700 224669 DEBUG nova.scheduler.client.report [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.700 224669 DEBUG nova.compute.provider_tree [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
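For the inventory dict above, placement derives schedulable capacity per resource class as (total - reserved) * allocation_ratio. A tiny worked sketch with the logged numbers:

    # Sketch: capacity arithmetic behind the inventory reported above.
    inventory = {
        'VCPU': {'total': 8, 'reserved': 0, 'allocation_ratio': 16.0},
        'MEMORY_MB': {'total': 15738, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 41, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # -> VCPU 128.0, MEMORY_MB 15226.0, DISK_GB 41.0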
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.721 224669 DEBUG nova.scheduler.client.report [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.754 224669 DEBUG nova.scheduler.client.report [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: HW_CPU_X86_SHA,HW_CPU_X86_MMX,HW_CPU_X86_AMD_SVM,COMPUTE_NODE,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_USB,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,HW_CPU_X86_AVX2,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_AESNI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_FMA3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 1 04:23:46 localhost nova_compute[224665]: 2026-02-01 09:23:46.799 224669 DEBUG oslo_concurrency.processutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.246 224669 DEBUG oslo_concurrency.processutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.252 224669 DEBUG nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Feb 1 04:23:47 localhost nova_compute[224665]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.253 224669 INFO nova.virt.libvirt.host [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] kernel doesn't support AMD SEV#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.254 224669 DEBUG nova.compute.provider_tree [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - 
-] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.255 224669 DEBUG nova.virt.libvirt.driver [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.328 224669 DEBUG nova.scheduler.client.report [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updated inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.329 224669 DEBUG nova.compute.provider_tree [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updating resource provider a04bda90-8ccd-4104-8518-038544ff1327 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.330 224669 DEBUG nova.compute.provider_tree [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.451 224669 DEBUG nova.compute.provider_tree [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Updating resource provider a04bda90-8ccd-4104-8518-038544ff1327 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.499 224669 DEBUG nova.compute.resource_tracker [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 
09:23:47.499 224669 DEBUG oslo_concurrency.lockutils [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.500 224669 DEBUG nova.service [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.558 224669 DEBUG nova.service [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Feb 1 04:23:47 localhost nova_compute[224665]: 2026-02-01 09:23:47.559 224669 DEBUG nova.servicegroup.drivers.db [None req-344b290a-05fc-469d-9c54-0e78530f0c3c - - - - - -] DB_Driver: join new ServiceGroup member np0005604212.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Feb 1 04:23:48 localhost python3.9[225371]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None 
secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 1 04:23:49 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 102.4 (341 of 333 items), suggesting rotation. Feb 1 04:23:49 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 04:23:49 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:23:49 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:23:49 localhost nova_compute[224665]: 2026-02-01 09:23:49.276 224669 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:23:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:23:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:23:49 localhost systemd[1]: tmp-crun.aO9pQ9.mount: Deactivated successfully. Feb 1 04:23:49 localhost podman[225530]: 2026-02-01 09:23:49.762591729 +0000 UTC m=+0.135455095 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 1 04:23:49 localhost podman[225531]: 2026-02-01 09:23:49.778166558 +0000 UTC m=+0.150403675 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Feb 1 04:23:49 localhost podman[225530]: 2026-02-01 09:23:49.793701704 +0000 UTC m=+0.166565060 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:23:49 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:23:49 localhost podman[225531]: 2026-02-01 09:23:49.843292155 +0000 UTC m=+0.215529252 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:23:49 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:23:49 localhost python3.9[225529]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:23:49 localhost systemd[1]: Stopping nova_compute container... Feb 1 04:23:50 localhost systemd[1]: libpod-eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d.scope: Deactivated successfully. Feb 1 04:23:50 localhost journal[202460]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, ) Feb 1 04:23:50 localhost journal[202460]: hostname: np0005604212.localdomain Feb 1 04:23:50 localhost journal[202460]: End of file while reading data: Input/output error Feb 1 04:23:50 localhost systemd[1]: libpod-eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d.scope: Consumed 4.487s CPU time. 
Feb 1 04:23:50 localhost podman[225573]: 2026-02-01 09:23:50.061426586 +0000 UTC m=+0.083082650 container died eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:23:50 localhost podman[225573]: 2026-02-01 09:23:50.144040529 +0000 UTC m=+0.165696623 container cleanup eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS) Feb 1 04:23:50 localhost podman[225573]: nova_compute Feb 1 04:23:50 localhost podman[225614]: error opening file `/run/crun/eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d/status`: No such file or directory Feb 1 04:23:50 localhost podman[225603]: 2026-02-01 09:23:50.238172347 +0000 UTC 
m=+0.067161521 container cleanup eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 1 04:23:50 localhost podman[225603]: nova_compute Feb 1 04:23:50 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Feb 1 04:23:50 localhost systemd[1]: Stopped nova_compute container. Feb 1 04:23:50 localhost systemd[1]: Starting nova_compute container... Feb 1 04:23:50 localhost systemd[1]: Started libcrun container. 
Feb 1 04:23:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 04:23:50 localhost podman[225616]: 2026-02-01 09:23:50.388149627 +0000 UTC m=+0.116845224 container init eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2) Feb 1 04:23:50 localhost podman[225616]: 2026-02-01 09:23:50.397760352 +0000 UTC m=+0.126455949 container start eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 
'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute) Feb 1 04:23:50 localhost podman[225616]: nova_compute Feb 1 04:23:50 localhost nova_compute[225632]: + sudo -E kolla_set_configs Feb 1 04:23:50 localhost systemd[1]: Started nova_compute container. Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Validating config file Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying service configuration files Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:23:50 localhost 
nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /etc/ceph Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Creating directory /etc/ceph Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/ceph Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Writing out command to execute Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:23:50 localhost nova_compute[225632]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 1 04:23:50 localhost nova_compute[225632]: ++ cat /run_command Feb 1 04:23:50 localhost nova_compute[225632]: + CMD=nova-compute Feb 1 04:23:50 localhost nova_compute[225632]: + ARGS= Feb 1 04:23:50 localhost nova_compute[225632]: + sudo kolla_copy_cacerts Feb 1 04:23:50 localhost nova_compute[225632]: + [[ ! -n '' ]] Feb 1 04:23:50 localhost nova_compute[225632]: + . kolla_extend_start Feb 1 04:23:50 localhost nova_compute[225632]: Running command: 'nova-compute' Feb 1 04:23:50 localhost nova_compute[225632]: + echo 'Running command: '\''nova-compute'\''' Feb 1 04:23:50 localhost nova_compute[225632]: + umask 0022 Feb 1 04:23:50 localhost nova_compute[225632]: + exec nova-compute Feb 1 04:23:50 localhost systemd[1]: tmp-crun.LdTsH2.mount: Deactivated successfully. 
Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.253 225636 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.254 225636 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.254 225636 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.254 225636 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 1 04:23:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18426 DF PROTO=TCP SPT=55774 DPT=9100 SEQ=907060204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFD09B80000000001030307) Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.377 225636 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.398 225636 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.398 225636 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.769 225636 INFO nova.virt.driver [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.887 225636 INFO nova.compute.provider_config [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.895 225636 WARNING nova.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.896 225636 DEBUG oslo_concurrency.lockutils [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.896 225636 DEBUG oslo_concurrency.lockutils [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.896 225636 DEBUG oslo_concurrency.lockutils [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.897 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.897 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.897 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.897 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.897 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.898 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.898 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost 
nova_compute[225632]: 2026-02-01 09:23:52.898 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.898 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.898 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.899 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.899 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.899 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.899 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.899 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.900 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.900 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.900 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.900 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.900 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] console_host = np0005604212.localdomain log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.901 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.901 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.901 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.901 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.901 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.902 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.902 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.902 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.902 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.902 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.903 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] enable_new_services = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.903 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.903 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.903 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.903 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.904 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.904 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.904 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.904 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.904 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.905 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.905 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.905 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.905 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.905 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.906 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.906 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.906 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.906 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.906 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.907 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.907 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.907 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.907 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.907 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.908 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost 
nova_compute[225632]: 2026-02-01 09:23:52.908 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.908 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.908 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.908 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.908 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.909 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.909 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.909 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.909 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.909 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.910 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.910 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] long_rpc_timeout = 1800 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.910 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.910 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.910 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.910 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.911 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.911 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.911 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.911 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.911 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.912 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.912 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.912 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.912 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] my_block_storage_ip = 
192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.912 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.913 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.913 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.913 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.913 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38743 DF PROTO=TCP SPT=40284 DPT=9102 SEQ=920104227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1CFD0BDC0000000001030307) Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.913 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.914 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.914 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.914 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.914 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.914 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.915 225636 
DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.915 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.915 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.915 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.915 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.915 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.916 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.916 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.916 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.916 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.916 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.917 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.917 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.917 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.917 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.917 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.917 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.918 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.918 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.918 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.918 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.918 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.919 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.919 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.919 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.919 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost 
nova_compute[225632]: 2026-02-01 09:23:52.919 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.920 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.920 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.920 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.920 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.920 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.921 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.921 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.921 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.921 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.921 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.921 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.922 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 
2026-02-01 09:23:52.922 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.922 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.922 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.922 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.923 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.923 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.923 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.923 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.923 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.924 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.924 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.924 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.924 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.924 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.924 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.925 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.925 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.925 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.925 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.925 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.926 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.926 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.926 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.926 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.926 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.927 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.compute_link_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.927 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.927 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.927 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.927 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.928 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.928 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.928 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.928 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.928 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.929 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.929 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.929 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost 
nova_compute[225632]: 2026-02-01 09:23:52.929 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.929 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.930 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.930 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.930 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.930 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.930 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.930 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.931 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.931 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.931 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.931 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.932 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.dead_timeout = 60.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.932 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.932 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.932 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.932 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.932 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.933 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.933 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.933 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.933 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.933 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.934 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.934 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.934 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.934 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.934 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.935 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.935 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.935 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.935 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.935 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.936 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.936 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.936 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.936 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.936 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.936 
225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.937 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.937 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.937 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.937 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.937 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.938 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.938 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.938 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.938 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.938 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.939 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.939 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.939 
225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.939 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.939 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.940 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.940 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.940 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.940 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.940 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.941 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.941 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.941 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.941 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.941 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.942 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.942 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.942 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.942 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.942 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.943 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.943 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.943 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.943 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.943 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.944 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.944 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.944 225636 DEBUG oslo_service.service 
[None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.944 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.944 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.945 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.945 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.945 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.945 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.945 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.945 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.946 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.946 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.946 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.946 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.946 225636 DEBUG 
oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.947 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.947 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.947 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.947 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.947 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.948 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.948 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.948 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.948 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.948 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.949 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.949 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost 
nova_compute[225632]: 2026-02-01 09:23:52.949 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.949 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.949 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.950 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.950 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.950 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.950 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.950 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.951 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.951 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.951 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.951 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.951 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] api_database.connection_debug = 0 log_opt_values 
api_database.connection_parameters =
api_database.connection_recycle_time = 3600
api_database.connection_trace = False
api_database.db_inc_retry_interval = True
api_database.db_max_retries = 20
api_database.db_max_retry_interval = 10
api_database.db_retry_interval = 1
api_database.max_overflow = 50
api_database.max_pool_size = 5
api_database.max_retries = 10
api_database.mysql_enable_ndb = False
api_database.mysql_sql_mode = TRADITIONAL
api_database.mysql_wsrep_sync_wait = None
api_database.pool_timeout = None
api_database.retry_interval = 10
api_database.slave_connection = ****
api_database.sqlite_synchronous = True
devices.enabled_mdev_types = []
ephemeral_storage_encryption.cipher = aes-xts-plain64
ephemeral_storage_encryption.enabled = False
ephemeral_storage_encryption.key_size = 512
glance.api_servers = None
glance.cafile = None
glance.certfile = None
glance.collect_timing = False
glance.connect_retries = None
glance.connect_retry_delay = None
glance.debug = False
glance.default_trusted_certificate_ids = []
glance.enable_certificate_validation = False
glance.enable_rbd_download = False
glance.endpoint_override = None
glance.insecure = False
glance.keyfile = None
glance.max_version = None
glance.min_version = None
glance.num_retries = 3
glance.rbd_ceph_conf =
glance.rbd_connect_timeout = 5
glance.rbd_pool =
glance.rbd_user =
glance.region_name = regionOne
glance.service_name = None
glance.service_type = image
glance.split_loggers = False
glance.status_code_retries = None
glance.status_code_retry_delay = None
glance.timeout = None
glance.valid_interfaces = ['internal']
glance.verify_glance_signatures = False
glance.version = None
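(Note: every entry in this dump is emitted by oslo.config's log_opt_values helper, the cfg.py:2609 reference on each original line. A minimal sketch of that mechanism follows; the option registrations here are illustrative stand-ins, not nova's real option set, and the sketch also shows why connection strings and keys above print as '****'.)

# Minimal sketch (illustrative, not nova's actual startup code) of the
# oslo.config mechanism behind this dump.
import logging
from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()
CONF.register_opts(
    [
        cfg.IntOpt('db_max_retries', default=20),
        # Options registered with secret=True are masked in the dump,
        # which is why values like database.connection appear as '****'.
        cfg.StrOpt('connection', default='mysql+pymysql://u:p@db/nova', secret=True),
    ],
    group='database',
)
CONF(args=[], project='demo')  # parse (empty) command line and config files

# Logs one line per registered option at the given level, e.g.
#   database.db_max_retries = 20
#   database.connection = ****
CONF.log_opt_values(LOG, logging.DEBUG)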
guestfs.debug = False
hyperv.config_drive_cdrom = False
hyperv.config_drive_inject_password = False
hyperv.dynamic_memory_ratio = 1.0
hyperv.enable_instance_metrics_collection = False
hyperv.enable_remotefx = False
hyperv.instances_path_share =
hyperv.iscsi_initiator_list = []
hyperv.limit_cpu_features = False
hyperv.mounted_disk_query_retry_count = 10
hyperv.mounted_disk_query_retry_interval = 5
hyperv.power_state_check_timeframe = 60
hyperv.power_state_event_polling_interval = 2
hyperv.qemu_img_cmd = qemu-img.exe
hyperv.use_multipath_io = False
hyperv.volume_attach_retry_count = 10
hyperv.volume_attach_retry_interval = 5
hyperv.vswitch_name = None
hyperv.wait_soft_reboot_seconds = 60
mks.enabled = False
mks.mksproxy_base_url = http://127.0.0.1:6090/
image_cache.manager_interval = 2400
image_cache.precache_concurrency = 1
image_cache.remove_unused_base_images = True
image_cache.remove_unused_original_minimum_age_seconds = 86400
image_cache.remove_unused_resized_minimum_age_seconds = 3600
image_cache.subdirectory_name = _base
ironic.api_max_retries = 60
ironic.api_retry_interval = 2
ironic.auth_section = None
ironic.auth_type = None
ironic.cafile = None
ironic.certfile = None
ironic.collect_timing = False
ironic.connect_retries = None
ironic.connect_retry_delay = None
ironic.endpoint_override = None
ironic.insecure = False
ironic.keyfile = None
ironic.max_version = None
ironic.min_version = None
ironic.partition_key = None
ironic.peer_list = []
ironic.region_name = None
ironic.serial_console_state_timeout = 10
ironic.service_name = None
ironic.service_type = baremetal
ironic.split_loggers = False
ironic.status_code_retries = None
ironic.status_code_retry_delay = None
ironic.timeout = None
ironic.valid_interfaces = ['internal', 'public']
ironic.version = None
key_manager.backend = barbican
key_manager.fixed_key = ****
barbican.auth_endpoint = http://localhost/identity/v3
barbican.barbican_api_version = None
barbican.barbican_endpoint = None
barbican.barbican_endpoint_type = internal
barbican.barbican_region_name = None
barbican.cafile = None
barbican.certfile = None
barbican.collect_timing = False
barbican.insecure = False
barbican.keyfile = None
barbican.number_of_retries = 60
barbican.retry_delay = 1
barbican.send_service_user_token = False
barbican.split_loggers = False
barbican.timeout = None
barbican.verify_ssl = True
barbican.verify_ssl_path = None
barbican_service_user.auth_section = None
barbican_service_user.auth_type = None
barbican_service_user.cafile = None
barbican_service_user.certfile = None
barbican_service_user.collect_timing = False
barbican_service_user.insecure = False
barbican_service_user.keyfile = None
barbican_service_user.split_loggers = False
barbican_service_user.timeout = None
vault.approle_role_id = None
vault.approle_secret_id = None
vault.cafile = None
vault.certfile = None
vault.collect_timing = False
vault.insecure = False
vault.keyfile = None
vault.kv_mountpoint = secret
vault.kv_version = 2
vault.namespace = None
vault.root_token_id = None
vault.split_loggers = False
vault.ssl_ca_crt_file = None
vault.timeout = None
vault.use_ssl = False
vault.vault_url = http://127.0.0.1:8200
keystone.cafile = None
keystone.certfile = None
keystone.collect_timing = False
keystone.connect_retries = None
keystone.connect_retry_delay = None
keystone.endpoint_override = None
keystone.insecure = False
keystone.keyfile = None
keystone.max_version = None
keystone.min_version = None
keystone.region_name = None
keystone.service_name = None
keystone.service_type = identity
keystone.split_loggers = False
keystone.status_code_retries = None
keystone.status_code_retry_delay = None
keystone.timeout = None
keystone.valid_interfaces = ['internal', 'public']
keystone.version = None
libvirt.connection_uri =
libvirt.cpu_mode = host-model
libvirt.cpu_model_extra_flags = []
libvirt.cpu_models = []
libvirt.cpu_power_governor_high = performance
libvirt.cpu_power_governor_low = powersave
libvirt.cpu_power_management = False
libvirt.cpu_power_management_strategy = cpu_state
libvirt.device_detach_attempts = 8
libvirt.device_detach_timeout = 20
libvirt.disk_cachemodes = []
libvirt.disk_prefix = None
libvirt.enabled_perf_events = []
libvirt.file_backed_memory = 0
libvirt.gid_maps = []
libvirt.hw_disk_discard = None
libvirt.hw_machine_type = ['x86_64=q35']
libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf
libvirt.images_rbd_glance_copy_poll_interval = 15
libvirt.images_rbd_glance_copy_timeout = 600
libvirt.images_rbd_glance_store_name = default_backend
libvirt.images_rbd_pool = vms
libvirt.images_type = rbd
libvirt.images_volume_group = None
libvirt.inject_key = False
libvirt.inject_partition = -2
libvirt.inject_password = False
libvirt.iscsi_iface = None
libvirt.iser_use_multipath = False
libvirt.live_migration_bandwidth = 0
libvirt.live_migration_completion_timeout = 800
libvirt.live_migration_downtime = 500
libvirt.live_migration_downtime_delay = 75
libvirt.live_migration_downtime_steps = 10
libvirt.live_migration_inbound_addr = None
libvirt.live_migration_permit_auto_converge = True
libvirt.live_migration_permit_post_copy = True
libvirt.live_migration_scheme = None
libvirt.live_migration_timeout_action = force_complete
libvirt.live_migration_tunnelled = False
WARNING oslo_config.cfg [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (live_migration_uri is deprecated for removal in favor of two other options that allow to change live migration scheme and target URI: ``live_migration_scheme`` and ``live_migration_inbound_addr`` respectively.). Its value may be silently ignored in the future. (See the migration sketch a few lines below.)
libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey
libvirt.live_migration_with_native_tls = False
libvirt.max_queues = None
libvirt.mem_stats_period_seconds = 10
libvirt.nfs_mount_options = None
libvirt.nfs_mount_point_base = /var/lib/nova/mnt
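(Migration sketch for the deprecation warning above, in nova.conf terms. This is a hedged illustration only: the scheme value and target address are assumptions for this deployment, and the keyfile query parameter embedded in the old URI has no one-to-one replacement in the two new options, so deployments that rely on it may still need live_migration_uri.)

[libvirt]
# was: live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey
live_migration_scheme = ssh
live_migration_inbound_addr = <migration target address, placeholder>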
Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.982 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -]
libvirt.rbd_secret_uuid = 33fac0b9-80c7-560f-918a-c92d3021ca1e log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.983 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.984 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost 
nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.985 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.986 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.volume_use_multipath = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.987 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.988 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG 
oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.989 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.990 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 
2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.991 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] notifications.default_level = INFO log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.992 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.993 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service 
[None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.994 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.995 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 
225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.996 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.997 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
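
---- [editor's note: the [placement] section] ----
The placement.* records here and on the next line are the standard keystoneauth1 option set: auth_type = password against auth_url = http://keystone-internal.openstack.svc:5000, project_name = service, username = nova, both domains Default, region_name = regionOne, valid_interfaces = ['internal'], and placement.password masked as ****. A minimal sketch of the usual way a service consumes such a section, assuming a config file at the conventional /etc/nova/nova.conf path (illustrative only; Nova's real placement client construction is more involved):

    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    CONF = cfg.CONF
    # Register the standard auth/session options under [placement], then build
    # an authenticated session from whatever values the config file carries.
    ks_loading.register_auth_conf_options(CONF, 'placement')
    ks_loading.register_session_conf_options(CONF, 'placement')
    CONF(['--config-file', '/etc/nova/nova.conf'])

    auth = ks_loading.load_auth_from_conf_options(CONF, 'placement')
    sess = ks_loading.load_session_from_conf_options(CONF, 'placement', auth=auth)
    # Endpoint discovery then honors service_type, valid_interfaces, region_name.
    resp = sess.get('/resource_providers', endpoint_filter={
        'service_type': 'placement',
        'interface': 'internal',
        'region_name': 'regionOne',
    })
---- [end of editor's note] ----
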
Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.998 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:52 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.key_pairs = 100 log_opt_values
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:52.999 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.000 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.000 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.000 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.000 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.000 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.000 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.001 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.002 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.003 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
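
---- [editor's note: scheduler filters] ----
filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] makes every in-tree filter loadable, while enabled_filters names the five that actually run (ComputeFilter, ComputeCapabilitiesFilter, ImagePropertiesFilter, ServerGroupAntiAffinityFilter, ServerGroupAffinityFilter). Custom filters plug into the same hook. A toy sketch follows; the class and its check are invented for illustration, but BaseHostFilter and the host_passes() signature are Nova's real extension point:

    from nova.scheduler import filters

    class PositiveFreeRamFilter(filters.BaseHostFilter):
        """Toy filter: only pass hosts that still report free RAM."""

        def host_passes(self, host_state, spec_obj):
            # host_state is the scheduler's per-host view (free_ram_mb, etc.);
            # spec_obj is the RequestSpec of the instance being placed.
            return host_state.free_ram_mb > 0

A filter like this would be shipped as a module listed in available_filters and then named in enabled_filters, the two options logged just above.
---- [end of editor's note] ----
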
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.004 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.005 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG
oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.006 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.007 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.007 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.007 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.007 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.007 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.008 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.009 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.010 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost 
nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.011 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.012 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.console_delay_seconds = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.013 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.014 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] 
vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.015 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.016 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.016 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.016 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.016 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.016 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.016 225636 DEBUG 
oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.017 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - 
-] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.018 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.019 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - 
-] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.020 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.021 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.021 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.021 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.021 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.022 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.023 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost 
nova_compute[225632]: 2026-02-01 09:23:53.023 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.023 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.024 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.024 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.024 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.025 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.025 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.026 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.026 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.026 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.026 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.027 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.027 225636 DEBUG oslo_service.service [None 
req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.028 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.028 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.028 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.028 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.029 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.029 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.029 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.030 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.030 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.030 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.031 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.031 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] 
oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.031 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.031 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.031 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.032 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.032 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.032 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.032 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.033 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.033 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.033 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.034 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.034 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce 
- - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.034 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.034 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.035 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.035 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.035 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.036 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.036 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.036 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.036 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.037 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.037 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.037 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.038 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.038 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.038 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.039 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.039 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.039 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.040 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.040 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.040 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.040 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.041 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.041 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.041 225636 
DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.042 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.042 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.042 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.042 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.043 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.043 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.043 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.044 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.044 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.044 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.045 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.045 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost 
nova_compute[225632]: 2026-02-01 09:23:53.045 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.046 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.046 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.046 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.047 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.047 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.047 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.048 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.048 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.048 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.049 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.049 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.049 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.user_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.050 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.050 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.050 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.051 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.051 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.051 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.052 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.052 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.052 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.053 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.053 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.053 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 
localhost nova_compute[225632]: 2026-02-01 09:23:53.054 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.054 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.054 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.055 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.055 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.055 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.056 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.056 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.056 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.057 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.057 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.057 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.058 225636 DEBUG 
oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.058 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.058 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.059 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.059 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.059 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.060 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.060 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.060 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.061 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.061 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.061 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.062 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] privsep_osbrick.group = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.062 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.062 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.063 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.063 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.063 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.064 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.064 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.064 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.064 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.065 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.065 225636 DEBUG oslo_service.service [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.069 225636 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.096 225636 INFO nova.virt.node [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - 
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.097 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.098 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.098 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.123 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.134 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.137 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.138 225636 INFO nova.virt.libvirt.driver [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Connection event '1' reason 'None'
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.143 225636 INFO nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Libvirt host capabilities
[multi-line libvirt capabilities XML elided: the element markup was lost in extraction. Recoverable values: host UUID 9037fad6-143b-4373-b625-f89bce657827; arch x86_64; host CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory 16116612 KiB with page counts 4029153, 0, 0; security models selinux (doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, label +107:+107); hvm guests for wordsize 32 and wordsize 64, both using emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel 7.6.0, 8.0.0, 8.1.0, 8.2.0, 8.3.0, 8.4.0, 8.5.0, 8.6.0, 9.0.0, 9.2.0, 9.4.0, 9.6.0, 9.8.0 (alias q35).]
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.150 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.156 225636 DEBUG nova.virt.libvirt.volume.mount [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.157 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
[multi-line libvirt domain capabilities XML elided: the element markup was lost in extraction. Recoverable values: path /usr/libexec/qemu-kvm; domain kvm; machine pc-i440fx-rhel7.6.0; arch i686; firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash, readonly yes/no, secure no, plus on/off feature enums; host CPU model EPYC-Rome, vendor AMD; named CPU models listed: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5; the model list continues beyond this excerpt.]
Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v6 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v7 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: KnightsMill Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: KnightsMill-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G1-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G2 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G2-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G3 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G3-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G4-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G5-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Penryn Feb 1 04:23:53 localhost nova_compute[225632]: Penryn-v1 Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 
1 04:23:53 localhost nova_compute[225632]: SierraForest-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Westmere Feb 1 04:23:53 localhost nova_compute[225632]: Westmere-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Westmere-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Westmere-v2 Feb 1 04:23:53 localhost nova_compute[225632]: athlon Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: athlon-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: core2duo Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: core2duo-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: coreduo Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: coreduo-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: kvm32 Feb 1 04:23:53 localhost nova_compute[225632]: kvm32-v1 Feb 1 04:23:53 localhost nova_compute[225632]: kvm64 Feb 1 04:23:53 localhost nova_compute[225632]: kvm64-v1 Feb 1 04:23:53 localhost nova_compute[225632]: n270 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: n270-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: pentium Feb 1 04:23:53 localhost nova_compute[225632]: pentium-v1 Feb 1 04:23:53 localhost nova_compute[225632]: pentium2 Feb 1 04:23:53 localhost 
nova_compute[225632]: pentium2-v1 Feb 1 04:23:53 localhost nova_compute[225632]: pentium3 Feb 1 04:23:53 localhost nova_compute[225632]: pentium3-v1 Feb 1 04:23:53 localhost nova_compute[225632]: phenom Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: phenom-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: qemu32 Feb 1 04:23:53 localhost nova_compute[225632]: qemu32-v1 Feb 1 04:23:53 localhost nova_compute[225632]: qemu64 Feb 1 04:23:53 localhost nova_compute[225632]: qemu64-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: file Feb 1 04:23:53 localhost nova_compute[225632]: anonymous Feb 1 04:23:53 localhost nova_compute[225632]: memfd Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: disk Feb 1 04:23:53 localhost nova_compute[225632]: cdrom Feb 1 04:23:53 localhost nova_compute[225632]: floppy Feb 1 04:23:53 localhost nova_compute[225632]: lun Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: ide Feb 1 04:23:53 localhost nova_compute[225632]: fdc Feb 1 04:23:53 localhost nova_compute[225632]: scsi Feb 1 04:23:53 localhost nova_compute[225632]: virtio Feb 1 04:23:53 localhost nova_compute[225632]: usb Feb 1 04:23:53 localhost nova_compute[225632]: sata Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: virtio Feb 1 04:23:53 localhost nova_compute[225632]: virtio-transitional Feb 1 04:23:53 localhost nova_compute[225632]: virtio-non-transitional Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: vnc Feb 1 04:23:53 localhost nova_compute[225632]: egl-headless Feb 1 04:23:53 localhost nova_compute[225632]: dbus Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: subsystem Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: default Feb 1 04:23:53 localhost nova_compute[225632]: mandatory Feb 1 04:23:53 localhost nova_compute[225632]: requisite Feb 1 04:23:53 localhost nova_compute[225632]: optional Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: usb Feb 1 04:23:53 localhost nova_compute[225632]: pci Feb 1 04:23:53 localhost nova_compute[225632]: scsi 
Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: virtio Feb 1 04:23:53 localhost nova_compute[225632]: virtio-transitional Feb 1 04:23:53 localhost nova_compute[225632]: virtio-non-transitional Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: random Feb 1 04:23:53 localhost nova_compute[225632]: egd Feb 1 04:23:53 localhost nova_compute[225632]: builtin Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: path Feb 1 04:23:53 localhost nova_compute[225632]: handle Feb 1 04:23:53 localhost nova_compute[225632]: virtiofs Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: tpm-tis Feb 1 04:23:53 localhost nova_compute[225632]: tpm-crb Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: emulator Feb 1 04:23:53 localhost nova_compute[225632]: external Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 2.0 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: usb Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: pty Feb 1 04:23:53 localhost nova_compute[225632]: unix Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: qemu Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: builtin Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: default Feb 1 04:23:53 localhost nova_compute[225632]: passt Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: isa Feb 1 04:23:53 localhost nova_compute[225632]: hyperv Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: null Feb 1 04:23:53 localhost 
Feb 1 04:23:53 localhost nova_compute[225632]: character device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 1 04:23:53 localhost nova_compute[225632]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input; additional values: 4095 on off off Linux KVM Hv
Feb 1 04:23:53 localhost nova_compute[225632]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.165 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 1 04:23:53 localhost nova_compute[225632]: [domainCapabilities XML; markup again lost in capture, text values condensed below]
Feb 1 04:23:53 localhost nova_compute[225632]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: i686
Feb 1 04:23:53 localhost nova_compute[225632]: os loader value: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom pflash; readonly: yes no; secure: no
Feb 1 04:23:53 localhost nova_compute[225632]: cpu mode enums: on off; on off; host-model: EPYC-Rome, vendor AMD
Feb 1 04:23:53 localhost nova_compute[225632]: guest CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 ClearwaterForest ClearwaterForest-v1 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Denverton-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Denverton-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Denverton-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Dhyana Feb 1 04:23:53 localhost nova_compute[225632]: Dhyana-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Dhyana-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Genoa Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Genoa-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Genoa-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-IBPB Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Milan Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Milan-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Milan-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Milan-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Rome Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Rome-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Rome-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Rome-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Rome-v4 Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Rome-v5 Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Turin Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-Turin-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-v1 Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-v2 Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: EPYC-v5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: GraniteRapids Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: GraniteRapids-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: GraniteRapids-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: GraniteRapids-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-noTSX Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-noTSX Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v6 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v7 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 
Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: KnightsMill Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: KnightsMill-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G1-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G2 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G2-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G3 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G3-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G4-v1 Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G5-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Penryn Feb 1 04:23:53 localhost nova_compute[225632]: Penryn-v1 Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: SapphireRapids-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: [libvirt domain capabilities XML continued; markup lost in capture, element text only] custom CPU models (cont.): SierraForest, SierraForest-v1, SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1
Feb 1 04:23:53 localhost nova_compute[225632]: custom CPU models (cont.): core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Feb 1 04:23:53 localhost nova_compute[225632]: [device/feature enum values, in document order; grouping inferred from the libvirt domcapabilities schema] memory backing source: file, anonymous, memfd; disk device: disk, cdrom, floppy, lun; disk bus: fdc, scsi, virtio, usb, sata; disk model: virtio, virtio-transitional, virtio-non-transitional; graphics: vnc, egl-headless, dbus; hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; hostdev subsystem type: usb, pci, scsi; rng model: virtio, virtio-transitional, virtio-non-transitional; rng backend: random, egd, builtin; filesystem driver: path, handle, virtiofs; tpm model: tpm-tis, tpm-crb; tpm backend: emulator, external; tpm version: 2.0; redirdev bus: usb; channel type: pty, unix; crypto: qemu, builtin; interface backend: default, passt; panic model: isa, hyperv; character device type: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Feb 1 04:23:53 localhost nova_compute[225632]: Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; followed by: 4095, on, off, off, Linux KVM Hv
Feb 1 04:23:53 localhost nova_compute[225632]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.230 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.236 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 1 04:23:53 localhost nova_compute[225632]: [domain capabilities for machine_type=pc; markup lost in capture, element text only] path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: x86_64; loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader type: rom, pflash; readonly: yes, no; secure: no; host-passthrough migratable: on, off; maximum migratable: on, off
Feb 1 04:23:53 localhost nova_compute[225632]: host-model CPU: EPYC-Rome, vendor: AMD
Feb 1 04:23:53 localhost nova_compute[225632]: custom CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3 (list continues)
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-noTSX Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Haswell-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-noTSX Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 
Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v6 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Icelake-Server-v7 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: IvyBridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: KnightsMill Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: KnightsMill-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Nehalem-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G1-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G2 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G2-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G3 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G3-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G4-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Opteron_G5-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Penryn Feb 1 04:23:53 localhost nova_compute[225632]: Penryn-v1 Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: SandyBridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 
1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 
localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Westmere Feb 1 04:23:53 localhost nova_compute[225632]: Westmere-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Westmere-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Westmere-v2 Feb 1 04:23:53 localhost nova_compute[225632]: athlon Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: athlon-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: core2duo Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: core2duo-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: coreduo Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: coreduo-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: kvm32 Feb 1 04:23:53 localhost nova_compute[225632]: kvm32-v1 Feb 1 04:23:53 localhost nova_compute[225632]: kvm64 Feb 1 04:23:53 localhost nova_compute[225632]: kvm64-v1 Feb 1 04:23:53 localhost nova_compute[225632]: n270 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: n270-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: pentium Feb 1 04:23:53 localhost nova_compute[225632]: pentium-v1 Feb 1 04:23:53 localhost nova_compute[225632]: 
Feb 1 04:23:53 localhost nova_compute[225632]: pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 1 04:23:53 localhost nova_compute[225632]: [device and backend enums from the same dump, in order; group labels inferred from the libvirt domainCapabilities schema:]
Feb 1 04:23:53 localhost nova_compute[225632]: memoryBacking sourceType: file anonymous memfd
Feb 1 04:23:53 localhost nova_compute[225632]: disk diskDevice: disk cdrom floppy lun; bus: ide fdc scsi virtio usb sata; model: virtio virtio-transitional virtio-non-transitional
Feb 1 04:23:53 localhost nova_compute[225632]: graphics type: vnc egl-headless dbus
Feb 1 04:23:53 localhost nova_compute[225632]: hostdev mode: subsystem; startupPolicy: default mandatory requisite optional; subsysType: usb pci scsi
Feb 1 04:23:53 localhost nova_compute[225632]: rng model: virtio virtio-transitional virtio-non-transitional; backendModel: random egd builtin
Feb 1 04:23:53 localhost nova_compute[225632]: filesystem driverType: path handle virtiofs
Feb 1 04:23:53 localhost nova_compute[225632]: tpm model: tpm-tis tpm-crb; backendModel: emulator external; backendVersion: 2.0
Feb 1 04:23:53 localhost nova_compute[225632]: redirdev bus: usb; channel type: pty unix; crypto: qemu builtin; interface backendType: default passt; panic model: isa hyperv
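The custom CPU model names above are the text content of the <model> children of the custom CPU mode in an intact domainCapabilities document. As a minimal sketch, assuming an unstripped copy of the XML is at hand (the helper name is hypothetical, not nova code), the same list can be recovered with Python's ElementTree:

    import xml.etree.ElementTree as ET

    def custom_cpu_models(domcaps_xml: str) -> list[str]:
        # Collect <cpu><mode name='custom'><model ...>NAME</model> entries
        # in document order, matching the flat list logged above.
        root = ET.fromstring(domcaps_xml)
        return [m.text for m in root.findall("./cpu/mode[@name='custom']/model")]

Run against the document nova received here, this would return the model list reproduced above.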
null Feb 1 04:23:53 localhost nova_compute[225632]: vc Feb 1 04:23:53 localhost nova_compute[225632]: pty Feb 1 04:23:53 localhost nova_compute[225632]: dev Feb 1 04:23:53 localhost nova_compute[225632]: file Feb 1 04:23:53 localhost nova_compute[225632]: pipe Feb 1 04:23:53 localhost nova_compute[225632]: stdio Feb 1 04:23:53 localhost nova_compute[225632]: udp Feb 1 04:23:53 localhost nova_compute[225632]: tcp Feb 1 04:23:53 localhost nova_compute[225632]: unix Feb 1 04:23:53 localhost nova_compute[225632]: qemu-vdagent Feb 1 04:23:53 localhost nova_compute[225632]: dbus Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: relaxed Feb 1 04:23:53 localhost nova_compute[225632]: vapic Feb 1 04:23:53 localhost nova_compute[225632]: spinlocks Feb 1 04:23:53 localhost nova_compute[225632]: vpindex Feb 1 04:23:53 localhost nova_compute[225632]: runtime Feb 1 04:23:53 localhost nova_compute[225632]: synic Feb 1 04:23:53 localhost nova_compute[225632]: stimer Feb 1 04:23:53 localhost nova_compute[225632]: reset Feb 1 04:23:53 localhost nova_compute[225632]: vendor_id Feb 1 04:23:53 localhost nova_compute[225632]: frequencies Feb 1 04:23:53 localhost nova_compute[225632]: reenlightenment Feb 1 04:23:53 localhost nova_compute[225632]: tlbflush Feb 1 04:23:53 localhost nova_compute[225632]: ipi Feb 1 04:23:53 localhost nova_compute[225632]: avic Feb 1 04:23:53 localhost nova_compute[225632]: emsr_bitmap Feb 1 04:23:53 localhost nova_compute[225632]: xmm_input Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 4095 Feb 1 04:23:53 localhost nova_compute[225632]: on Feb 1 04:23:53 localhost nova_compute[225632]: off Feb 1 04:23:53 localhost nova_compute[225632]: off Feb 1 04:23:53 localhost nova_compute[225632]: Linux KVM Hv Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.296 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: /usr/libexec/qemu-kvm Feb 1 04:23:53 localhost nova_compute[225632]: kvm Feb 1 04:23:53 localhost nova_compute[225632]: pc-q35-rhel9.8.0 Feb 1 04:23:53 localhost nova_compute[225632]: x86_64 Feb 1 04:23:53 
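The dump above is the libvirt domainCapabilities document that nova-compute requests per emulator/arch/machine-type pair at startup. A minimal sketch of fetching the same XML by hand, assuming the libvirt-python bindings are installed and a local qemu:///system socket is reachable (the emulator path, arch, and machine type are taken from the log lines here; everything else is an assumption, not nova's exact code):

    import libvirt  # libvirt-python bindings (assumed installed)

    # Connect read-only to the local system libvirtd; the URI is an assumption.
    conn = libvirt.openReadOnly("qemu:///system")

    # Roughly the call that nova's _get_domain_capabilities() wraps:
    # ask libvirt for the domain capabilities of this emulator binary
    # for the x86_64/q35 machine type under KVM.
    xml = conn.getDomainCapabilities(
        "/usr/libexec/qemu-kvm",  # emulatorbin, path from the log above
        "x86_64",                 # arch
        "q35",                    # machine type
        "kvm",                    # virttype
        0,                        # flags (none defined)
    )
    print(xml)
    conn.close()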
Feb 1 04:23:53 localhost nova_compute[225632]: 2026-02-01 09:23:53.296 225636 DEBUG nova.virt.libvirt.host [None req-116d0e0e-b197-4045-9ad6-620b47300366 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 1 04:23:53 localhost nova_compute[225632]: [domainCapabilities dump; XML markup lost in capture, recoverable element values only]
Feb 1 04:23:53 localhost nova_compute[225632]:   emulator: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64
Feb 1 04:23:53 localhost nova_compute[225632]:   firmware: efi; loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd
Feb 1 04:23:53 localhost nova_compute[225632]:   loader types: rom, pflash; readonly: yes, no; secure: yes, no; other toggles as captured: on, off; on, off
Feb 1 04:23:53 localhost nova_compute[225632]:   host CPU model: EPYC-Rome, vendor AMD
Feb 1 04:23:53 localhost nova_compute[225632]:   named CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4,
Feb 1 04:23:53 localhost nova_compute[225632]:     Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5,
Feb 1 04:23:53 localhost nova_compute[225632]:     ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3,
Feb 1 04:23:53 localhost nova_compute[225632]:     Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3,
Feb 1 04:23:53 localhost nova_compute[225632]:     EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5,
Feb 1 04:23:53 localhost nova_compute[225632]:     GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4,
Feb 1 04:23:53 localhost nova_compute[225632]:     Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7,
Feb 1 04:23:53 localhost nova_compute[225632]:     IvyBridge, IvyBridge-IBRS, IvyBridge-v1, IvyBridge-v2, KnightsMill, KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2,
Feb 1 04:23:53 localhost nova_compute[225632]:     Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1,
Feb 1 04:23:53 localhost nova_compute[225632]:     Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids [capture truncated mid-list]
localhost nova_compute[225632]: SapphireRapids-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SapphireRapids-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 
Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 
Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: SierraForest-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost 
nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Client-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: 
Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-noTSX-IBRS Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v3 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v4 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 
04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Skylake-Server-v5 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v1 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Snowridge-v2 Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:23:53 localhost nova_compute[225632]: Feb 1 04:28:47 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:28:48 localhost nova_compute[225632]: 2026-02-01 09:28:48.101 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:28:48 localhost rsyslogd[758]: imjournal: 5026 messages lost due to rate-limiting (20000 allowed within 600 seconds) Feb 1 04:28:48 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:28:49 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:28:49 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:28:49 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
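[editor's note] The collapsed nova_compute block above (04:23:53) is nova echoing the CPU models libvirt/QEMU reports for this host. A minimal sketch of retrieving the same list through the libvirt-python binding — assuming libvirt-python is installed and qemu:///system is reachable; this is the underlying API, not nova's exact code path:

    import libvirt

    conn = libvirt.open('qemu:///system')  # connection to the local QEMU driver
    try:
        # virConnectGetCPUModelNames: every CPU model name libvirt knows for the arch
        for name in sorted(conn.getCPUModelNames('x86_64')):
            print(name)  # e.g. IvyBridge, Skylake-Server-v4, Snowridge-v2
    finally:
        conn.close()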
Feb 1 04:28:49 localhost nova_compute[225632]: 2026-02-01 09:28:49.763 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:28:50 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 1 04:28:50 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 1 04:28:50 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 1 04:28:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64136 DF PROTO=TCP SPT=41972 DPT=9100 SEQ=3121266865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D019DB80000000001030307)
Feb 1 04:28:52 localhost nova_compute[225632]: 2026-02-01 09:28:52.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:52 localhost nova_compute[225632]: 2026-02-01 09:28:52.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb 1 04:28:52 localhost nova_compute[225632]: 2026-02-01 09:28:52.429 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb 1 04:28:52 localhost nova_compute[225632]: 2026-02-01 09:28:52.432 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:52 localhost nova_compute[225632]: 2026-02-01 09:28:52.432 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb 1 04:28:52 localhost nova_compute[225632]: 2026-02-01 09:28:52.450 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
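[editor's note] The kernel "DROPPING:" entries are netfilter LOG output on the br-ex bridge; the "DROPPING:" prefix is site-specific rule configuration. A hedged Python helper for pulling the connection tuple out of such lines (field names SRC/DST/PROTO/SPT/DPT are taken directly from the log format; the sample line is abbreviated from the entry above):

    import re

    DROP_RE = re.compile(
        r'DROPPING:.*?SRC=(?P<src>\S+) DST=(?P<dst>\S+).*?'
        r'PROTO=(?P<proto>\S+) SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+)'
    )

    line = ('DROPPING: IN=br-ex OUT= SRC=192.168.122.10 DST=192.168.122.106 '
            'LEN=60 TTL=62 ID=64136 DF PROTO=TCP SPT=41972 DPT=9100 SYN')
    m = DROP_RE.search(line)
    if m:
        # prints: 192.168.122.10 -> 192.168.122.106/TCP dport 9100
        print('%s -> %s/%s dport %s' % (m.group('src'), m.group('dst'),
                                        m.group('proto'), m.group('dpt')))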
Feb 1 04:28:52 localhost podman[243593]: 2026-02-01 09:28:52.762584967 +0000 UTC m=+0.129584520 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:28:52 localhost podman[243593]: 2026-02-01 09:28:52.771405191 +0000 UTC m=+0.138404754 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:28:52 localhost podman[243593]: unhealthy
Feb 1 04:28:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57842 DF PROTO=TCP SPT=50016 DPT=9102 SEQ=1113152743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D019FB90000000001030307)
Feb 1 04:28:52 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 1 04:28:53 localhost systemd[1]: var-lib-containers-storage-overlay-a7b403e25c18c4fc322532dcaf847548439fdb3d0df10999e0b3f91fea4ca5cb-merged.mount: Deactivated successfully.
Feb 1 04:28:53 localhost nova_compute[225632]: 2026-02-01 09:28:53.155 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:28:53 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Main process exited, code=exited, status=1/FAILURE
Feb 1 04:28:53 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Failed with result 'exit-code'.
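[editor's note] Above, the podman_exporter healthcheck reports unhealthy and the transient systemd healthcheck unit exits 1. A hedged way to query the same state by hand via podman's inspect JSON — the health key is 'Health' on current podman and 'Healthcheck' on some older releases, hence the fallback; verify against your podman version:

    import json
    import subprocess

    out = subprocess.run(
        ['podman', 'inspect', 'podman_exporter'],  # container name from the log
        capture_output=True, text=True, check=True,
    ).stdout
    state = json.loads(out)[0]['State']
    health = state.get('Health') or state.get('Healthcheck') or {}
    print(health.get('Status'), 'failing streak:', health.get('FailingStreak'))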
Feb 1 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 1 04:28:54 localhost nova_compute[225632]: 2026-02-01 09:28:54.464 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 1 04:28:54 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 1 04:28:54 localhost nova_compute[225632]: 2026-02-01 09:28:54.804 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:28:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65373 DF PROTO=TCP SPT=39592 DPT=9882 SEQ=1227837583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01A7B80000000001030307)
Feb 1 04:28:55 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 1 04:28:55 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 1 04:28:55 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 1 04:28:56 localhost nova_compute[225632]: 2026-02-01 09:28:56.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:56 localhost nova_compute[225632]: 2026-02-01 09:28:56.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:56 localhost nova_compute[225632]: 2026-02-01 09:28:56.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:56 localhost nova_compute[225632]: 2026-02-01 09:28:56.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
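[editor's note] The "Running periodic task ComputeManager._*" lines are driven by oslo.service's periodic task machinery. A minimal, hedged sketch of that pattern — the Manager class and task body here are illustrative, not nova's code; only the base class and decorator are the real oslo_service API:

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        # nova's ComputeManager builds on the same base class
        def __init__(self):
            super().__init__(cfg.CONF)

        @periodic_task.periodic_task(spacing=60)
        def _poll_something(self, context):
            # nova's tasks bail out the same way when their interval option is
            # disabled, cf. "CONF.reclaim_instance_interval <= 0, skipping..."
            pass

    mgr = Manager()
    mgr.run_periodic_tasks(context=None)  # normally invoked from a timer loop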
Feb 1 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 1 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 1 04:28:56 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 1 04:28:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65374 DF PROTO=TCP SPT=39592 DPT=9882 SEQ=1227837583 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01AFB80000000001030307)
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.403 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.405 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.406 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.406 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.556 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.556 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.557 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:28:57 localhost nova_compute[225632]: 2026-02-01 09:28:57.557 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:28:57 localhost systemd[1]: session-55.scope: Deactivated successfully.
Feb 1 04:28:57 localhost systemd[1]: session-55.scope: Consumed 1min 25.582s CPU time.
Feb 1 04:28:57 localhost systemd-logind[759]: Session 55 logged out. Waiting for processes to exit.
Feb 1 04:28:57 localhost systemd-logind[759]: Removed session 55.
Feb 1 04:28:57 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 1 04:28:58 localhost systemd[1]: var-lib-containers-storage-overlay-e75586bb4ab2bd9f0af5c6046e55c6950ec71393a8ae3185df7c4d9365a6d82a-merged.mount: Deactivated successfully.
Feb 1 04:28:58 localhost nova_compute[225632]: 2026-02-01 09:28:58.047 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 04:28:58 localhost nova_compute[225632]: 2026-02-01 09:28:58.070 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:28:58 localhost nova_compute[225632]: 2026-02-01 09:28:58.071 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 04:28:58 localhost nova_compute[225632]: 2026-02-01 09:28:58.071 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:58 localhost nova_compute[225632]: 2026-02-01 09:28:58.191 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:28:58 localhost systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully.
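[editor's note] The Acquiring/Acquired lines earlier and the Releasing line above around the cache refresh come from oslo.concurrency. A hedged sketch of the same pattern — the UUID is the instance from the log; the work inside the lock is a placeholder:

    from oslo_concurrency import lockutils

    uuid = '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02'
    # lockutils.lock() is a context manager; oslo logs "Acquiring"/"Acquired"
    # on entry and "Releasing" on exit, exactly as in the entries above.
    with lockutils.lock('refresh_cache-%s' % uuid):
        pass  # refresh the instance's network info cache while holding it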
Feb 1 04:28:58 localhost systemd[1]: var-lib-containers-storage-overlay-ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2-merged.mount: Deactivated successfully.
Feb 1 04:28:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:28:58 localhost systemd[1]: var-lib-containers-storage-overlay-ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2-merged.mount: Deactivated successfully.
Feb 1 04:28:58 localhost podman[243619]: 2026-02-01 09:28:58.908179651 +0000 UTC m=+0.077137133 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 1 04:28:58 localhost podman[243619]: 2026-02-01 09:28:58.917315974 +0000 UTC m=+0.086273406 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.424 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.424 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.425 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.425 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.425 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:28:59 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
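[editor's note] The resource audit above shells out to ceph for pool capacity. A hedged reproduction of that subprocess step — same command line as the log; the 'stats'/'total_avail_bytes' keys are standard `ceph df --format=json` output but worth verifying on your cluster version:

    import json
    import subprocess

    raw = subprocess.run(
        ['ceph', 'df', '--format=json', '--id', 'openstack',
         '--conf', '/etc/ceph/ceph.conf'],
        capture_output=True, text=True, check=True,
    ).stdout
    stats = json.loads(raw)['stats']
    # cluster-wide free space, as the resource tracker would see it
    print('free_disk=%.2fGB' % (stats['total_avail_bytes'] / 1024 ** 3))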
Feb 1 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully.
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.840 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 1 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.885 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:28:59 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.964 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:28:59 localhost nova_compute[225632]: 2026-02-01 09:28:59.964 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:29:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26152 DF PROTO=TCP SPT=56634 DPT=9101 SEQ=3334032069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01BB9A0000000001030307)
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.182 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.184 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12471MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.185 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.185 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.353 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.353 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.354 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.439 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.537 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.537 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.564 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.614 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb 1 04:29:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:29:00 localhost nova_compute[225632]: 2026-02-01 09:29:00.669 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:29:00 localhost podman[243659]: 2026-02-01 09:29:00.723741649 +0000 UTC m=+0.084018298 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb
1 04:29:00 localhost podman[243659]: 2026-02-01 09:29:00.733281965 +0000 UTC m=+0.093558624 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:29:01 localhost nova_compute[225632]: 2026-02-01 09:29:01.141 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:29:01 localhost nova_compute[225632]: 2026-02-01 09:29:01.145 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:29:01 localhost nova_compute[225632]: 2026-02-01 09:29:01.177 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:29:01 localhost nova_compute[225632]: 2026-02-01 09:29:01.179 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:29:01 localhost nova_compute[225632]: 2026-02-01 09:29:01.179 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.994s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 1 04:29:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 1 04:29:01 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:29:01 localhost podman[243702]: 2026-02-01 09:29:01.446730095 +0000 UTC m=+0.090512138 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible) Feb 1 04:29:01 localhost podman[243702]: 2026-02-01 09:29:01.487406788 +0000 UTC m=+0.131188831 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:01 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:02 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:02 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:29:02 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully. Feb 1 04:29:03 localhost systemd[1]: var-lib-containers-storage-overlay-cddf7a814a1cba1a77861bcad7ee6ec0fe12286db1bf4a70c953226de72c826e-merged.mount: Deactivated successfully. Feb 1 04:29:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35800 DF PROTO=TCP SPT=54960 DPT=9105 SEQ=4100145686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01C7B80000000001030307) Feb 1 04:29:03 localhost nova_compute[225632]: 2026-02-01 09:29:03.193 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.520 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.551 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 197023361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.551 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 24174444 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b99296e2-b3d1-42c9-b6e5-5df6bac9a29e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197023361, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.521321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72b67a1a-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '5706c9f59a823f908956bfd068b5142a0388457fdd0c5ce9093a77dd9c0c33dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24174444, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.521321', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72b68eec-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '3ce527bba46548dd11f9d859ad1f73e67dcab970f81635971cbfaafd9eec45a2'}]}, 'timestamp': '2026-02-01 09:29:03.552159', '_unique_id': 'c17edfd1f3cc44668363f5ab927eaf23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.553 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.555 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.561 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5f603406-df7e-4651-a5b7-42285ec593dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.555197', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72b81b5e-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': 'bca23d0fbc9f80c350800a7c83286dd3eb59ea2b25f4b8eae674755375d7b94f'}]}, 'timestamp': '2026-02-01 09:29:03.562276', '_unique_id': '8bfa746bf75f484384ec98c7f3e2ab9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.564 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
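[Annotation] With these payload dumps repeating for every metric in the cycle, one quick way to size the impact is to scan the raw journal for the notifier errors and tally the affected counter_name values. A rough sketch, assuming the journal is available as a flat file at a hypothetical path:

import re
from collections import Counter

counts = Counter()
with open('/var/log/messages') as fh:  # path is an assumption
    for line in fh:
        # each failed notification logs "Could not send notification" plus
        # the full sample payload on one journal line
        if 'Could not send notification' in line:
            counts.update(re.findall(r"'counter_name': '([^']+)'", line))

for name, n in counts.most_common():
    print(f'{n:4d}  {name}')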
Payload={'message_id': 'cec55fa3-174b-4c71-ae4e-7f1433503bcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.564593', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72b88940-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': '8e3ddbf991c572764d9e0b6ed4cba378f2794dc1378902e19344224f02f6f65b'}]}, 'timestamp': '2026-02-01 09:29:03.565116', '_unique_id': 'a3050db24ef84dc18098624674343fee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.566 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.567 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.567 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.567 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.567 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
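[Annotation] The two kernel DROPPING records in this window (SYNs from 192.168.122.10 to DPT=9101 at 04:29:00 and DPT=9105 at 04:29:03, both on br-ex) are netfilter LOG output in KEY=VALUE form. A small field parser, fed one of those lines verbatim:

import re

SAMPLE = ('DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a '
          'MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 '
          'DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35800 DF '
          'PROTO=TCP SPT=54960 DPT=9105 SEQ=4100145686 ACK=0 WINDOW=32640 '
          'RES=0x00 SYN URGP=0')

def parse_drop(line):
    """Split a netfilter LOG record into its KEY=VALUE fields.

    Bare flags such as DF and SYN carry no '=' and are skipped here.
    """
    return dict(m.groups() for m in re.finditer(r'(\w+)=(\S*)', line))

fields = parse_drop(SAMPLE)
print(fields['SRC'], '->', f"{fields['DST']}:{fields['DPT']}", fields['PROTO'])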
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c040807-63a4-4d8f-8d80-a714bd07c549', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.567729', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72b903e8-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': '00c6cfb7fe1af190836b6525f9a6be91e677cb1c413306354a9bbcd692ca95f3'}]}, 'timestamp': '2026-02-01 09:29:03.568254', '_unique_id': 'c077d2715a6944a2aa43898fa755fcb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.569 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.571 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.571 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 572 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.571 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73eb6288-f1aa-4027-b919-f06d1634c777', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 572, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.571217', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72b98cdc-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '0fab621179a4f5f84e65a2d78edfef86202f28f46d82d4f1e0e571388612a252'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.571217', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72b9a2c6-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '300faebe0459a008a0b15358efa01af0868be414b6a9442bd20379200099dc9f'}]}, 'timestamp': '2026-02-01 09:29:03.572262', '_unique_id': '4f73be3bd0ad461ab3b01b6f7ce7c502'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.573 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.574 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.574 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.575 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4448f42b-e085-4337-a34f-5bfcd1760b5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.574607', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72ba1012-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': 'a1522137556114d3bc501073e4de838895ec808b6e8fac93b0ad648c6c52b1ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.574607', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72ba228c-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '5566faf0dea513e8f9fc306c139de5b2f83fc2380a3311e8a46d99b2f78df6ed'}]}, 'timestamp': '2026-02-01 09:29:03.575523', '_unique_id': '4ccf10fb16b7497986f3fc440e1c0514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.576 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.578 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d8c1e5a-1b62-472c-a180-b8271cb0345b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.577964', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72ba9578-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': '0fc379a4bcd7bc90a61047b0bf5fab2017a8648829ada677e2306dbf4f8a749f'}]}, 'timestamp': '2026-02-01 09:29:03.579945', '_unique_id': 'e5cb50e01d4940c1af1a2e4e6c0429a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.581 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9be49b8a-06d4-40a4-83f8-e8411f9cd4ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.584204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72bb88ac-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '61af1d83846c6fea2b3a47b64c6b700c79fb6e44e5bc3cd84528f5384869d672'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.584204', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72bb99fa-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '08aaf734cbaefaaee1284ec662447c86cb68f5970c51fc1f9f90b389f345af85'}]}, 'timestamp': '2026-02-01 09:29:03.585168', '_unique_id': 'c227495351fa4b6b812e80a08424c00c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.586 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.598 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.598 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
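The pair of stacks above is ordinary Python exception chaining: the low-level ConnectionRefusedError raised by self.sock.connect(sa) inside amqp is re-raised by kombu's _reraise_as_library_errors context manager as kombu.exceptions.OperationalError, which is why every failure logs two tracebacks joined by "The above exception was the direct cause of the following exception". A minimal sketch of the same pattern, with stand-in names rather than kombu's actual code:

import socket
from contextlib import contextmanager

class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

@contextmanager
def reraise_as_library_errors():
    # Same idea as kombu's _reraise_as_library_errors: translate low-level
    # socket errors into the library's own exception type, chaining the
    # original with `raise ... from exc` so both tracebacks survive.
    try:
        yield
    except OSError as exc:
        raise OperationalError(str(exc)) from exc

def connect(host="127.0.0.1", port=5672):
    with reraise_as_library_errors():
        # With nothing listening on the port this raises
        # ConnectionRefusedError: [Errno 111], exactly as in the log.
        socket.create_connection((host, port), timeout=1)

try:
    connect()
except OperationalError as exc:
    print("caught:", exc, "| chained from:", repr(exc.__cause__))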
Payload={'message_id': '04ccd97d-e104-4cba-bb87-6c41c377e73d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.587455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72bdb06e-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.80691201, 'message_signature': 'c05cf963fa26b8794a5e3bc3c97772ea64275ae4c4d9b2084e7acac7c7cb8663'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.587455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72bdc2d4-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.80691201, 'message_signature': 'ac52c9196eb6da4be46aeee130e6d041245cfeccbe27aad0f011544e79003381'}]}, 'timestamp': '2026-02-01 09:29:03.599289', '_unique_id': '5e7012580034408da3e6272ae4671b8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.600 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.601 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
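Every pollster in the cycle fails with the same [Errno 111]: the TCP connection attempt is being actively refused, meaning the host answered but nothing is listening on the AMQP port, so the problem sits with the broker (or whatever proxies it), not with the agent. A quick way to confirm from the compute node; the hostname below is a placeholder, since the log never prints the configured transport_url:

import errno
import socket

# Placeholder endpoint; substitute the host/port from transport_url in
# ceilometer.conf (RabbitMQ listens on 5672 by default).
broker = ("rabbitmq.example.com", 5672)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(2)
rc = sock.connect_ex(broker)  # returns an errno value instead of raising
if rc == 0:
    print("broker port is reachable")
elif rc == errno.ECONNREFUSED:  # errno 111, matching the log
    print("connection refused: nothing is listening at", broker)
else:
    print("connect failed:", errno.errorcode.get(rc, rc))
sock.close()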
Payload={'message_id': '33fcb41a-9f10-4a06-9d09-05e06435e4fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.601672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72be3160-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': 'e707729b9a88b9542970431dcd4d7a64919aaf92a2addc9081047141ecc939e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.601672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72be43c6-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '99897568a6ecf2782aee3b14e383d601826b06bf22e02d16c2c143bb1ddf4252'}]}, 'timestamp': '2026-02-01 09:29:03.602585', '_unique_id': '458d4ef7eca0473992409010978c131d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.603 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.604 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 52.3984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
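The records themselves come from oslo.messaging's notification path: ceilometer's publisher hands each batch of samples to a Notifier, which emits a telemetry.polling event at priority SAMPLE on the notifications topic; send_notification then needs a pooled RabbitMQ connection, and that is where the OperationalError surfaces. Roughly, and only as a sketch against the public oslo.messaging API (the URL and payload are placeholders, and whether the call raises or keeps retrying depends on the configured retry policy):

from oslo_config import cfg
import oslo_messaging

# Placeholder transport_url; the agent reads the real one from
# ceilometer.conf. With the broker down, emitting the notification
# can surface kombu.exceptions.OperationalError as in the log.
transport = oslo_messaging.get_notification_transport(
    cfg.CONF, url="rabbit://guest:guest@rabbitmq.example.com:5672/")

notifier = oslo_messaging.Notifier(
    transport,
    publisher_id="ceilometer.polling",   # matches the Payload above
    driver="messagingv2",
    topics=["notifications"],
    retry=2)                             # bound the reconnect attempts

# priority SAMPLE, event_type telemetry.polling, as in the log records
notifier.sample({}, "telemetry.polling", {"samples": []})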
Payload={'message_id': 'c1436cde-7e65-4b4e-9bc8-811837ac1ab3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3984375, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:29:03.605023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '72c16452-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.841886304, 'message_signature': 'a293632b831f53eef402ff6b427b667aef11dcf85478bbe653470d06a28b81c2'}]}, 'timestamp': '2026-02-01 09:29:03.623154', '_unique_id': '99070f5098fa4c859f489667057411af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:29:03.624 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.625 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.625 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '154202bb-1c0f-456d-a500-7b385f1df3e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.625370', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72c1cfa0-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': '66a0ad56d0e24a1ae3c0dc126c9c8088a556a8541a14d909a455ea1e8f4f3f8a'}]}, 'timestamp': '2026-02-01 09:29:03.625858', '_unique_id': '24e8285ce810487fa00944b5d1f13d50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:29:03.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.628 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
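One salvage note: the Payload=... bodies in these records are Python dict reprs (single quotes, None), not JSON, so json.loads rejects them, but ast.literal_eval parses them safely. A sketch for recovering the dropped samples from a captured record fragment, using an abbreviated stand-in for the real multi-kilobyte line:

import ast

# Abbreviated stand-in for one logged record fragment; real lines carry
# the full sample list between 'Payload=' and ': kombu.exceptions...'.
record = ("Payload={'event_type': 'telemetry.polling', 'priority': 'SAMPLE', "
          "'payload': {'samples': [{'counter_name': 'memory.usage', "
          "'counter_type': 'gauge', 'counter_unit': 'MB', "
          "'counter_volume': 52.3984375, 'project_name': None}]}}"
          ": kombu.exceptions.OperationalError: [Errno 111] Connection refused")

# The dict literal sits between the 'Payload=' prefix and the exception
# suffix; slice it out and evaluate it as a Python literal.
literal = record[len("Payload="):record.rindex(": kombu.exceptions")]
payload = ast.literal_eval(literal)

for sample in payload["payload"]["samples"]:
    print(sample["counter_name"], sample["counter_volume"], sample["counter_unit"])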
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd9e79aa-de43-4744-88e4-5e3b9eee4d0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.628188', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72c23db4-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': '0a7e3ca1700c31fc9d6414b47d729337651140c45f49a28c0442d340236224e2'}]}, 'timestamp': '2026-02-01 09:29:03.628676', '_unique_id': '31b2e1baa09b4b60bf74b4f183e0fad8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.629 12 ERROR oslo_messaging.notify.messaging
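The chained traceback above bottoms out in a plain TCP connect that the peer rejects: amqp's transport._connect() calls self.sock.connect(sa), gets ECONNREFUSED, and kombu re-raises it as OperationalError. A quick way to confirm whether the broker endpoint is reachable at all is a bare socket probe. This is a minimal sketch only; localhost:5672 is an assumed endpoint (the AMQP default), and the real host and port come from the transport_url in the agent's configuration, not from this log:

    import socket

    # Hypothetical endpoint: substitute the host/port from the configured
    # oslo.messaging transport_url.
    HOST, PORT = "localhost", 5672

    try:
        # create_connection() performs the same connect() the amqp transport does.
        with socket.create_connection((HOST, PORT), timeout=5):
            print(f"TCP connect to {HOST}:{PORT} succeeded; broker port is open")
    except ConnectionRefusedError as exc:
        # The same [Errno 111] raised by self.sock.connect(sa) above.
        print(f"connection refused: {exc}")
    except OSError as exc:
        print(f"other socket error: {exc}")

A refusal here points at the broker process or its listener (down, not yet started, or firewalled) rather than at anything in the ceilometer agent itself.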
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.631 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.631 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcdb4bc8-ce05-4930-a715-1cfa3604d70b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.631245', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72c2b582-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': 'c55ef8bfca4e6085d4a5a8aba879aab8bd04fb53f2b55bc73ed5be8c7571d733'}]}, 'timestamp': '2026-02-01 09:29:03.631752', '_unique_id': '8887a1e07aef48cd8c6f91610c3a5049'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.634 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.634 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.634 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a79a9d9-3182-4df9-b5c0-6047dec784df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.634164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72c327ec-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.80691201, 'message_signature': '01507a3adeae96135ee13885687755cf4bc46771fa176fadb93faf3ba6279108'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.634164', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72c33d54-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.80691201, 'message_signature': 'f6c169c68aa72a1de59523b64ef954cc4f2b5d4b896304e46a155d6876d963ea'}]}, 'timestamp': '2026-02-01 09:29:03.635232', '_unique_id': '96913248a14a47738843a8fb1ee22baf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
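The retry path visible in the traceback above (Connection.ensure_connection into _ensure_connection into retry_over_time) can be exercised directly with kombu to watch each attempt fail. A small sketch under stated assumptions: the broker URL below is hypothetical, standing in for the configured transport_url, and the retry counts are illustrative rather than what oslo.messaging uses:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Hypothetical broker URL; substitute the deployment's transport_url.
    conn = Connection("amqp://guest:guest@localhost:5672//")

    def on_retry(exc, interval):
        # Invoked by retry_over_time() before each sleep between attempts.
        print(f"connect failed ({exc!r}); retrying in {interval}s")

    try:
        # Mirrors the ensure_connection() call seen in impl_rabbit.py, but with
        # a bounded number of retries so the failure surfaces quickly.
        conn.ensure_connection(errback=on_retry, max_retries=3,
                               interval_start=1, interval_step=2)
    except OperationalError as exc:
        # kombu wraps the underlying ConnectionRefusedError, exactly as logged.
        print(f"broker unreachable: {exc}")

The wrapping explains why the records show kombu.exceptions.OperationalError at the top even though the root cause is the socket-level ConnectionRefusedError.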
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.637 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b8e9293-3086-451c-b2a3-989b6b5af463', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.637625', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72c3adfc-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.80691201, 'message_signature': '7ba7d524488166f4d81361239919405f4b7af5c6134203906c8a51ad66315ca4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.637625', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72c3c01c-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.80691201, 'message_signature': 'de9210e171d43f1b3c9d36d06d75199856709942e9148834d7d429c23af23a9e'}]}, 'timestamp': '2026-02-01 09:29:03.638537', '_unique_id': '040b496785dc430ea870984a6e30b75c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
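Because the broker is unreachable, every polled sample in these records is dropped after being serialized into the Payload= blob, so the journal is the only remaining copy of the data. The payload is printed as a Python literal, which makes it recoverable with ast.literal_eval. A best-effort sketch; recover_samples is a hypothetical helper written for this log format, not a ceilometer or oslo.messaging API:

    import ast

    MARKER = "Payload="

    def recover_samples(record: str):
        """Pull the dropped telemetry samples back out of one
        'Could not send notification' journal record."""
        start = record.find(MARKER)
        if start == -1:
            return []
        blob = record[start + len(MARKER):]
        # The record ends with ': kombu.exceptions.OperationalError: ...'; cut it off.
        blob = blob.rsplit(": kombu.exceptions.", 1)[0]
        notification = ast.literal_eval(blob)  # payload is a printed Python dict
        return notification["payload"]["samples"]

    # Usage: for each recovered sample, e.g.
    #   for s in recover_samples(line):
    #       print(s["counter_name"], s["counter_volume"], s["resource_id"])

This only salvages what the agent managed to log; samples from any cycle where logging itself failed are gone.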
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.640 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19d96a55-2c66-491b-be28-f195ef04025d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.640719', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72c42a02-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': 'de0939b5728a3ea06c06750efd4e50c7e75bcca36a3396e5387710d6a2ec677d'}]}, 'timestamp': '2026-02-01 09:29:03.641286', '_unique_id': 'bdc9e193ecd842d2925776532f0e72b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.642 12 ERROR oslo_messaging.notify.messaging Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.643 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.643 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1227122553 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.643 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 165637656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
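The traceback above records the full send path (oslo.messaging notifier -> connection pool -> kombu -> py-amqp socket connect) ending in errno 111, i.e. nothing is listening on the broker's TCP port. A minimal sketch of probing the same kombu code path out-of-band, assuming kombu is installed; the broker URL is a placeholder, since the agent's real transport_url is in its configuration and does not appear in this log:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    BROKER_URL = 'amqp://guest:guest@localhost:5672//'  # hypothetical; use transport_url from ceilometer.conf

    try:
        # ensure_connection(max_retries=1) exercises the same retry path
        # (_ensure_connection -> retry_over_time) shown in the traceback.
        with Connection(BROKER_URL, connect_timeout=2) as conn:
            conn.ensure_connection(max_retries=1)
            print('broker reachable')
    except OperationalError as exc:
        # kombu wraps the socket-level ConnectionRefusedError (errno 111)
        # in OperationalError, exactly as logged above.
        print(f'broker unreachable: {exc}')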
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '407f1634-c5aa-4776-b501-855295117177', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1227122553, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:29:03.643487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '72c49352-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': 'fb0bda1ff7277456eeacbc0c5ff4f42699229c557443acf1f8da3a621ac4b0e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165637656, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:29:03.643487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '72c4a612-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.740763518, 'message_signature': '65746523e3f224828150a014eab41be76b8616cc93c87098156b7c0c40fc04c3'}]}, 'timestamp': '2026-02-01 09:29:03.644452', '_unique_id': 'a966aef4333f4ce7a19bc8fa980968bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.646 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95352174-86a8-4143-b86b-a004f245a40c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.646827', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72c517f0-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': 'e0123930864e41dc57534ad888ee0782f3f42f1e021080bb7ff4c373b5fcc687'}]}, 'timestamp': '2026-02-01 09:29:03.647375', '_unique_id': 'd3a17534a0e7489a8dfa2ad7ad427f48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.649 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f267c9db-47b1-4aa5-949b-6761016571a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:29:03.649548', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '72c57fec-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.774644629, 'message_signature': 'c2a302f10960758989ccb29261aea0f5f6ae9ff885c686f230606b227dc95a74'}]}, 'timestamp': '2026-02-01 09:29:03.650067', '_unique_id': '7efc6bba69874cd59f09bcfca2500eee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.652 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 57110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed9d5d41-9275-43ad-935a-17744876f766', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57110000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:29:03.652228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '72c5e856-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10237.841886304, 'message_signature': '8e2770e3e0cbf1cfc15a76e002ae624478c3880a36d1db737baa6230d572db15'}]}, 'timestamp': '2026-02-01 09:29:03.652692', '_unique_id': '438cea7c294e4a379fdd5c7a8941765e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:29:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:29:03.654 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:29:04 localhost nova_compute[225632]: 2026-02-01 09:29:04.862 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:29:05 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 1 04:29:05 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 1 04:29:05 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 1 04:29:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26155 DF PROTO=TCP SPT=56634 DPT=9101 SEQ=3334032069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01D7780000000001030307)
Feb 1 04:29:07 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 1 04:29:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:29:07 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
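The kernel "DROPPING:" record above is netfilter LOG output (the "DROPPING" prefix comes from whatever log rule is configured on this host; the rule itself does not appear in this log). It shows a dropped inbound TCP SYN on br-ex toward port 9101. Extracting fields from such KEY=VALUE records is mechanical; a small sketch, using an abbreviated copy of the record:

    import re

    # Abbreviated copy of the netfilter LOG record above (KEY=VALUE pairs).
    line = ('DROPPING: IN=br-ex OUT= SRC=192.168.122.10 DST=192.168.122.106 '
            'LEN=60 TTL=62 PROTO=TCP SPT=56634 DPT=9101 SYN')

    fields = dict(re.findall(r'([A-Z]+)=(\S+)', line))
    # -> a dropped TCP SYN from 192.168.122.10 to 192.168.122.106:9101
    print(f"{fields['SRC']} -> {fields['DST']}:{fields['DPT']} proto {fields['PROTO']}")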
Feb 1 04:29:07 localhost podman[243726]: 2026-02-01 09:29:07.713380003 +0000 UTC m=+0.079510027 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, release=1769056855, version=9.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:29:07 localhost podman[243726]: 2026-02-01 09:29:07.730400191 +0000 UTC m=+0.096530205 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, release=1769056855, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, architecture=x86_64) Feb 1 04:29:07 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:07 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:29:08 localhost nova_compute[225632]: 2026-02-01 09:29:08.233 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:08 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:08 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:09 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 1 04:29:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57843 DF PROTO=TCP SPT=50016 DPT=9102 SEQ=1113152743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01DFB80000000001030307) Feb 1 04:29:09 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:09 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:09 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:09 localhost nova_compute[225632]: 2026-02-01 09:29:09.884 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:29:11 localhost podman[243744]: 2026-02-01 09:29:11.747775148 +0000 UTC m=+0.107112573 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 1 04:29:11 localhost podman[243744]: 2026-02-01 09:29:11.784527919 +0000 UTC m=+0.143865334 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 1 04:29:12 localhost systemd[1]: var-lib-containers-storage-overlay-7482678b01acdac3c11b06ce351fa229c86390ea5e0f34b89b4352735f8e55c8-merged.mount: Deactivated successfully. Feb 1 04:29:13 localhost systemd[1]: var-lib-containers-storage-overlay-7482678b01acdac3c11b06ce351fa229c86390ea5e0f34b89b4352735f8e55c8-merged.mount: Deactivated successfully. Feb 1 04:29:13 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:29:13 localhost nova_compute[225632]: 2026-02-01 09:29:13.234 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38621 DF PROTO=TCP SPT=57680 DPT=9100 SEQ=3184778915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01F3790000000001030307) Feb 1 04:29:14 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 1 04:29:14 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. Feb 1 04:29:14 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. 
Feb 1 04:29:14 localhost nova_compute[225632]: 2026-02-01 09:29:14.928 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26156 DF PROTO=TCP SPT=56634 DPT=9101 SEQ=3334032069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D01F7B90000000001030307) Feb 1 04:29:16 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:16 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 1 04:29:16 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 1 04:29:17 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:17 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:17 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:18 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:18 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:18 localhost nova_compute[225632]: 2026-02-01 09:29:18.236 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:19 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:19 localhost nova_compute[225632]: 2026-02-01 09:29:19.955 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:20 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. Feb 1 04:29:20 localhost systemd[1]: var-lib-containers-storage-overlay-81d35b92007b3f6ce5557fdf3066e145410fe8d50cd18a17188fe6e802a41d49-merged.mount: Deactivated successfully. Feb 1 04:29:20 localhost systemd[1]: var-lib-containers-storage-overlay-81d35b92007b3f6ce5557fdf3066e145410fe8d50cd18a17188fe6e802a41d49-merged.mount: Deactivated successfully. 
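The recurring kernel: DROPPING: records are netfilter log output emitted just before the firewall drops a packet (the DROPPING: prefix comes from the logging rule itself); each record is a flat run of KEY=value fields plus bare flag tokens such as DF and SYN. Note the 04:29:15 entry repeats SEQ=3334032069 from the 04:29:07 one on the same 192.168.122.10:56634 -> 192.168.122.106:9101 flow: the remote scraper keeps retransmitting a SYN that keeps being dropped. A small parsing sketch over one abridged record from these lines (the field layout is taken from the log above, not from any standard parser):

import re

# One dropped-SYN record as logged above, abridged (OPT bytes omitted).
line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
        "MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 "
        "DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26156 DF "
        "PROTO=TCP SPT=56634 DPT=9101 SEQ=3334032069 ACK=0 WINDOW=32640 "
        "RES=0x00 SYN URGP=0")

fields = dict(re.findall(r"(\w+)=(\S*)", line))  # KEY=value pairs
flags = [t for t in line.split() if "=" not in t and t != "DROPPING:"]

print(f"{fields['SRC']}:{fields['SPT']} -> {fields['DST']}:{fields['DPT']} "
      f"proto={fields['PROTO']} flags={flags}")
# 192.168.122.10:56634 -> 192.168.122.106:9101 proto=TCP flags=['DF', 'SYN']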
Feb 1 04:29:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38622 DF PROTO=TCP SPT=57680 DPT=9100 SEQ=3184778915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0213B90000000001030307) Feb 1 04:29:22 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:22 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 1 04:29:23 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 1 04:29:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19654 DF PROTO=TCP SPT=52224 DPT=9102 SEQ=1820219784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0215B80000000001030307) Feb 1 04:29:23 localhost nova_compute[225632]: 2026-02-01 09:29:23.242 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:29:23 localhost podman[243763]: 2026-02-01 09:29:23.722029943 +0000 UTC m=+0.084836202 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:29:23 localhost podman[243763]: 2026-02-01 09:29:23.761485237 +0000 UTC m=+0.124291526 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:29:23 localhost podman[243763]: unhealthy Feb 1 04:29:24 localhost sshd[243785]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:29:24 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:24 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55299 DF PROTO=TCP SPT=41700 DPT=9882 SEQ=3275112225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D021CF80000000001030307) Feb 1 04:29:24 localhost nova_compute[225632]: 2026-02-01 09:29:24.994 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:25 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:25 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:29:25 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Failed with result 'exit-code'. Feb 1 04:29:26 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:26 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:26 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:27 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55300 DF PROTO=TCP SPT=41700 DPT=9882 SEQ=3275112225 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0224F80000000001030307) Feb 1 04:29:27 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:27 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:28 localhost nova_compute[225632]: 2026-02-01 09:29:28.244 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:29 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. 
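The Started /usr/bin/podman healthcheck run <id> units are the timer-driven probes behind the health_status= fields: podman healthcheck run executes the container's configured healthcheck test and exits 0 when healthy, non-zero otherwise, which is why podman_exporter reporting unhealthy above is immediately followed by its transient unit failing with status=1/FAILURE. A sketch of querying the same state by hand; the inspect template path shown is the Docker-compatible spelling, and older podman releases spell it .State.Healthcheck.Status:

import subprocess

def health(container: str) -> str:
    # Re-run the container's own healthcheck; exit 0 = healthy, 1 = unhealthy,
    # mirroring the systemd unit result seen above.
    probe = subprocess.run(["podman", "healthcheck", "run", container],
                           capture_output=True, text=True)
    # Read the status podman records after each probe.
    status = subprocess.run(
        ["podman", "inspect", "--format", "{{.State.Health.Status}}", container],
        capture_output=True, text=True, check=True).stdout.strip()
    return f"exit={probe.returncode} status={status}"

print(health("podman_exporter"))  # e.g. exit=1 status=unhealthy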
Feb 1 04:29:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:29:29 localhost systemd[1]: tmp-crun.DFr6kD.mount: Deactivated successfully. Feb 1 04:29:29 localhost podman[243787]: 2026-02-01 09:29:29.629431238 +0000 UTC m=+0.092859052 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:29:29 localhost systemd[1]: var-lib-containers-storage-overlay-a3725e54853595614926889eb99d8b5ab03502eb966cd4ee026013d34265250f-merged.mount: Deactivated successfully. 
Feb 1 04:29:29 localhost podman[243787]: 2026-02-01 09:29:29.667716255 +0000 UTC m=+0.131144069 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:29:29 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:29:30 localhost nova_compute[225632]: 2026-02-01 09:29:30.032 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46413 DF PROTO=TCP SPT=35360 DPT=9101 SEQ=236788637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0230CA0000000001030307) Feb 1 04:29:30 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. Feb 1 04:29:30 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 1 04:29:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:29:31 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:31 localhost systemd[1]: tmp-crun.2Bwaj1.mount: Deactivated successfully. 
Feb 1 04:29:31 localhost podman[243805]: 2026-02-01 09:29:31.624508414 +0000 UTC m=+0.091473449 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:29:31 localhost podman[243805]: 2026-02-01 09:29:31.658511099 +0000 UTC m=+0.125476184 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:29:31 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:29:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
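node_exporter above publishes on host port 9100 with the systemd collector enabled (--collector.systemd in its command line) — the same port 9100 whose inbound SYNs from 192.168.122.10 the DROPPING rules are rejecting, so remote scrapes fail while a local one would still work. A quick local-scrape sketch; only the port comes from the config above, the metric filter is illustrative:

import urllib.request

# Prometheus text exposition endpoint, reachable locally in the host netns.
with urllib.request.urlopen("http://localhost:9100/metrics", timeout=5) as resp:
    body = resp.read().decode()

# node_systemd_unit_state is produced by the --collector.systemd flag
# visible in the container's command above.
for sample in body.splitlines():
    if sample.startswith("node_systemd_unit_state"):
        print(sample)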
Feb 1 04:29:32 localhost podman[243828]: 2026-02-01 09:29:32.483048375 +0000 UTC m=+0.091753547 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:29:32 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. Feb 1 04:29:32 localhost podman[243828]: 2026-02-01 09:29:32.548373961 +0000 UTC m=+0.157079113 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:29:32 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 1 04:29:32 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:32 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:29:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46415 DF PROTO=TCP SPT=35360 DPT=9101 SEQ=236788637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D023CB80000000001030307) Feb 1 04:29:33 localhost nova_compute[225632]: 2026-02-01 09:29:33.245 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:33 localhost sshd[243851]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:29:33 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 1 04:29:33 localhost systemd[1]: var-lib-containers-storage-overlay-dd1aa08156f9d09a864094f6714c2c9a2978317562a3f35bb8262e99f62ead42-merged.mount: Deactivated successfully. Feb 1 04:29:33 localhost systemd-logind[759]: New session 56 of user zuul. Feb 1 04:29:33 localhost systemd[1]: Started Session 56 of User zuul. Feb 1 04:29:34 localhost python3.9[243947]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. Feb 1 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 1 04:29:34 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 1 04:29:35 localhost python3.9[244070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:35 localhost nova_compute[225632]: 2026-02-01 09:29:35.053 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:35 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully. 
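The ansible-ansible.builtin.file invocations here are the edpm firewall role materializing /var/lib/edpm-config/firewall/ (state=directory, mode=0750, owner and group root) before the rule files land in it. Reduced to its effective filesystem change, that one task is roughly the following sketch; the real module also handles check mode, diffs, and SELinux contexts:

import os
import shutil

path = "/var/lib/edpm-config/firewall"

# state=directory: create if missing; makedirs' mode argument is filtered
# by the umask, so enforce mode=0750 explicitly afterwards.
os.makedirs(path, exist_ok=True)
os.chmod(path, 0o750)

# owner=root group=root (requires running as root, as ansible does here).
shutil.chown(path, user="root", group="root")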
Feb 1 04:29:36 localhost python3.9[244158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938174.5483673-3714-130660365387421/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:36 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:36 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46416 DF PROTO=TCP SPT=35360 DPT=9101 SEQ=236788637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D024C780000000001030307) Feb 1 04:29:37 localhost python3.9[244268]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:37 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 1 04:29:37 localhost systemd[1]: var-lib-containers-storage-overlay-412ecd0722770303c70661441b3031022487ea82d3402d0d51ca72c2eaf9a882-merged.mount: Deactivated successfully. Feb 1 04:29:37 localhost python3.9[244378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:38 localhost nova_compute[225632]: 2026-02-01 09:29:38.273 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:29:38 localhost podman[244436]: 2026-02-01 09:29:38.400499912 +0000 UTC m=+0.091381946 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, release=1769056855, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:29:38 localhost podman[244436]: 2026-02-01 09:29:38.418475749 +0000 UTC m=+0.109357743 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1769056855) Feb 1 04:29:38 localhost python3.9[244435]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38623 DF PROTO=TCP SPT=57680 DPT=9100 SEQ=3184778915 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0253B90000000001030307) Feb 1 04:29:39 localhost python3.9[244565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:39 localhost python3.9[244622]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.blfrfqhy recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:39 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:39 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 1 04:29:40 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 1 04:29:40 localhost nova_compute[225632]: 2026-02-01 09:29:40.085 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:40 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:29:40 localhost python3.9[244732]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:41 localhost python3.9[244789]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:29:41.687 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:29:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:29:41.688 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:29:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:29:41.691 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:29:41 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:41 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:41 localhost python3.9[244899]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:29:42 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:29:42 localhost python3[245010]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 1 04:29:43 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:29:43 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
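nft -j list ruleset, which ansible runs just before feeding /var/lib/edpm-config/firewall into edpm_nftables_from_files, dumps the live ruleset as libnftables JSON: a top-level "nftables" array of single-key objects (metainfo, table, chain, rule, ...). A sketch of reading it the way a rule-reconciliation step might; it requires root, and the JSON shape is standard libnftables output:

import json
import subprocess

# Same command ansible invokes above; -j selects JSON output.
out = subprocess.run(["nft", "-j", "list", "ruleset"],
                     capture_output=True, text=True, check=True)
ruleset = json.loads(out.stdout)["nftables"]

# Every element names its kind with a single key; pick out the chains.
for obj in ruleset:
    if "chain" in obj:
        chain = obj["chain"]
        print(chain["family"], chain["table"], chain["name"])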
Feb 1 04:29:43 localhost nova_compute[225632]: 2026-02-01 09:29:43.299 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:43 localhost podman[245082]: 2026-02-01 09:29:43.348670701 +0000 UTC m=+0.107210006 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:29:43 localhost podman[245082]: 2026-02-01 09:29:43.361230661 +0000 UTC m=+0.119769956 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 
'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Feb 1 04:29:43 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:29:43 localhost python3.9[245138]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:44 localhost python3.9[245195]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:44 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:44 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22081 DF PROTO=TCP SPT=39476 DPT=9100 SEQ=1550065515 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0268780000000001030307) Feb 1 04:29:44 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:44 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
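The kernel `DROPPING:` records (port 9100 here; 9101, 9102 and 9882 in later entries) are netfilter log lines for SYN packets arriving on br-ex. Their payload is plain KEY=VALUE pairs, so a few lines of Python suffice to pull them apart; the sample line is copied from the entry above (options field trimmed) and the regex is the only added ingredient:

    import re

    # Record copied from the log entry above (OPT field trimmed).
    LINE = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
            "MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 "
            "DST=192.168.122.106 LEN=60 TTL=62 ID=22081 DF PROTO=TCP "
            "SPT=39476 DPT=9100 SEQ=1550065515 ACK=0 WINDOW=32640 SYN URGP=0")

    # KEY=VALUE pairs; empty values (OUT=) and bare flags (DF, SYN) drop out.
    fields = dict(re.findall(r"(\w+)=(\S+)", LINE))
    print(f"{fields['SRC']} -> {fields['DST']}:{fields['DPT']} ({fields['PROTO']})")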
Feb 1 04:29:44 localhost python3.9[245305]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:45 localhost nova_compute[225632]: 2026-02-01 09:29:45.119 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:45 localhost python3.9[245362]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46417 DF PROTO=TCP SPT=35360 DPT=9101 SEQ=236788637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D026DB90000000001030307) Feb 1 04:29:46 localhost python3.9[245472]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:47 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 1 04:29:47 localhost systemd[1]: var-lib-containers-storage-overlay-1a64c0d311ac08180312bae5177c3bf7c3bcda01d21be994fcede7c05d63d280-merged.mount: Deactivated successfully. Feb 1 04:29:47 localhost systemd[1]: var-lib-containers-storage-overlay-1a64c0d311ac08180312bae5177c3bf7c3bcda01d21be994fcede7c05d63d280-merged.mount: Deactivated successfully. Feb 1 04:29:48 localhost nova_compute[225632]: 2026-02-01 09:29:48.301 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:48 localhost python3.9[245529]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:49 localhost python3.9[245639]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:49 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 1 04:29:49 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. 
Feb 1 04:29:49 localhost python3.9[245696]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:50 localhost nova_compute[225632]: 2026-02-01 09:29:50.162 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:50 localhost python3.9[245806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:29:51 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:51 localhost python3.9[245896]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769938190.1456401-4089-115297227058277/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:51 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 1 04:29:51 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 1 04:29:52 localhost python3.9[246006]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3660 DF PROTO=TCP SPT=36744 DPT=9102 SEQ=1852379497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0289B80000000001030307) Feb 1 04:29:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22082 DF PROTO=TCP SPT=39476 DPT=9100 SEQ=1550065515 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0289B80000000001030307) Feb 1 04:29:52 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:52 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
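The stat/copy pairs throughout this run are Ansible's idempotency handshake: `ansible.legacy.stat` hashes the destination with SHA-1 (`checksum_algorithm=sha1`) and the copy only rewrites the file when that digest differs from the rendered source. A rough stand-alone equivalent, assuming the same SHA-1 convention; the helper names are hypothetical:

    import hashlib
    import shutil
    from pathlib import Path
    from typing import Optional

    def sha1_of(path: Path) -> Optional[str]:
        """Return the file's SHA-1 hex digest, or None when it does not exist."""
        return hashlib.sha1(path.read_bytes()).hexdigest() if path.exists() else None

    def copy_if_changed(src: Path, dest: Path) -> bool:
        """Copy src over dest only when their digests differ; True if copied."""
        if sha1_of(src) == sha1_of(dest):
            return False
        shutil.copy2(src, dest)
        return True

    # e.g. copy_if_changed(Path("rendered.nft"), Path("/etc/nftables/edpm-rules.nft"))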
Feb 1 04:29:53 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:29:53 localhost python3.9[246116]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:29:53 localhost nova_compute[225632]: 2026-02-01 09:29:53.304 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:53 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:53 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:54 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:54 localhost python3.9[246229]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:54 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:54 localhost python3.9[246339]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:29:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15288 DF PROTO=TCP SPT=32824 DPT=9882 SEQ=1700382325 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0292380000000001030307) Feb 1 04:29:55 localhost nova_compute[225632]: 2026-02-01 09:29:55.166 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
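Before anything touches the kernel, the play concatenates the five EDPM snippets in include order and pipes them through `nft -c -f -`; the -c flag parses and validates the whole ruleset without committing it, and the same check reappears as the blockinfile validator (`nft -c -f %s`). The same dry run in Python, with the file list copied from the command above:

    import subprocess
    from pathlib import Path

    # Same concatenation order as the `cat ... | nft -c -f -` task in the log.
    SNIPPETS = [
        "/etc/nftables/edpm-chains.nft",
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
        "/etc/nftables/edpm-jumps.nft",
    ]

    combined = "\n".join(Path(p).read_text() for p in SNIPPETS)

    # -c: check only; parse and validate the ruleset without applying it.
    check = subprocess.run(["nft", "-c", "-f", "-"], input=combined, text=True)
    if check.returncode != 0:
        raise SystemExit("EDPM nftables ruleset failed validation")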
Feb 1 04:29:55 localhost podman[246450]: 2026-02-01 09:29:55.674037138 +0000 UTC m=+0.087533137 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:29:55 localhost podman[246450]: 2026-02-01 09:29:55.683848142 +0000 UTC m=+0.097344111 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:29:55 localhost podman[246450]: unhealthy Feb 1 04:29:55 localhost python3.9[246451]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:29:56 localhost nova_compute[225632]: 2026-02-01 09:29:56.180 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:56 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 1 04:29:56 localhost systemd[1]: var-lib-containers-storage-overlay-4a853575c374606d160c17f2d5aa41b97e721ba5a6a30f8d45cba4a87628c37d-merged.mount: Deactivated successfully. 
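The `unhealthy` verdict for podman_exporter above comes from the transient systemd unit wrapping `podman healthcheck run <id>`; that command exits non-zero when the configured check fails, which is why the unit is marked failed with status=1/FAILURE just below. A sketch of polling the same interface, with the container name taken from the log:

    import subprocess

    def is_healthy(container: str) -> bool:
        """Run the container's own healthcheck; exit code 0 means healthy."""
        proc = subprocess.run(
            ["podman", "healthcheck", "run", container],
            capture_output=True, text=True,
        )
        return proc.returncode == 0

    print("podman_exporter healthy?", is_healthy("podman_exporter"))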
Feb 1 04:29:56 localhost nova_compute[225632]: 2026-02-01 09:29:56.402 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:56 localhost nova_compute[225632]: 2026-02-01 09:29:56.434 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:56 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Main process exited, code=exited, status=1/FAILURE Feb 1 04:29:56 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Failed with result 'exit-code'. Feb 1 04:29:56 localhost python3.9[246585]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:29:57 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 1 04:29:57 localhost nova_compute[225632]: 2026-02-01 09:29:57.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:57 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:29:57 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:57 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:29:58 localhost python3.9[246698]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.305 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:29:58 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
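The last three firewall tasks form a change gate: edpm-rules.nft.changed is touched when the rules file is rewritten, its presence is tested with stat, the flushes/rules/update-jumps bundle is applied with `nft -f -` only in that case, and the marker is then deleted. A compact sketch of that lifecycle; the paths and ordering come from the log, the wrapper function is illustrative:

    import subprocess
    from pathlib import Path

    MARKER = Path("/etc/nftables/edpm-rules.nft.changed")
    APPLY_ORDER = [
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
    ]

    def apply_rules_if_changed() -> bool:
        """Apply the rules bundle only when the change marker is present."""
        if not MARKER.exists():
            return False
        combined = "\n".join(Path(p).read_text() for p in APPLY_ORDER)
        subprocess.run(["nft", "-f", "-"], input=combined, text=True, check=True)
        MARKER.unlink()  # consume the marker so the next run is a no-op
        return True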
Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:29:58 localhost systemd[1]: session-56.scope: Deactivated successfully. Feb 1 04:29:58 localhost systemd[1]: session-56.scope: Consumed 13.786s CPU time. Feb 1 04:29:58 localhost systemd-logind[759]: Session 56 logged out. Waiting for processes to exit. Feb 1 04:29:58 localhost systemd-logind[759]: Removed session 56. Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.652 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.652 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.653 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:29:58 localhost nova_compute[225632]: 2026-02-01 09:29:58.653 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:29:59 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 1 04:29:59 localhost systemd[1]: var-lib-containers-storage-overlay-f37f2c5aca5d6b7cf023a2735a5fd14989ee6decd2a2c44bb6d47fd78dfeef3e-merged.mount: Deactivated successfully. 
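The Acquiring/Acquired/Releasing triplet around `refresh_cache-<uuid>` is oslo.concurrency's named-lock pattern, which serializes concurrent refreshes of a single instance's network info cache. A minimal sketch of the same primitive; the lock name mirrors the log, the body is a placeholder:

    from oslo_concurrency import lockutils

    instance_uuid = "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02"  # from the entries above

    # lockutils.lock() yields a named lock; everything inside the `with`
    # block is serialized against other holders of the same name.
    with lockutils.lock(f"refresh_cache-{instance_uuid}"):
        print("refreshing network info cache for", instance_uuid)  # placeholder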
Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.588 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.618 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.619 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.619 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.620 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.620 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.620 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.621 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.646 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.647 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.647 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.647 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:29:59 localhost nova_compute[225632]: 2026-02-01 09:29:59.648 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.105 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.198 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.247 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.248 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:30:00 localhost 
nova_compute[225632]: 2026-02-01 09:30:00.491 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.493 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12438MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.493 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.493 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.573 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 
actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.574 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.574 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:30:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:30:00 localhost nova_compute[225632]: 2026-02-01 09:30:00.669 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:30:00 localhost podman[246738]: 2026-02-01 09:30:00.742658564 +0000 UTC m=+0.095550044 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent) Feb 1 04:30:00 localhost podman[246738]: 2026-02-01 09:30:00.747291669 +0000 UTC 
m=+0.100183149 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:30:01 localhost nova_compute[225632]: 2026-02-01 09:30:01.117 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:30:01 localhost nova_compute[225632]: 2026-02-01 09:30:01.126 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:30:01 localhost nova_compute[225632]: 2026-02-01 09:30:01.149 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:30:01 localhost nova_compute[225632]: 2026-02-01 09:30:01.152 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain 
_update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:30:01 localhost nova_compute[225632]: 2026-02-01 09:30:01.152 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:30:01 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:30:01 localhost openstack_network_exporter[239441]: ERROR 09:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:30:01 localhost openstack_network_exporter[239441]: Feb 1 04:30:01 localhost openstack_network_exporter[239441]: ERROR 09:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:30:01 localhost openstack_network_exporter[239441]: Feb 1 04:30:01 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 1 04:30:01 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 1 04:30:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:30:01 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:30:01 localhost systemd[1]: tmp-crun.jOaX0y.mount: Deactivated successfully. 
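The openstack_network_exporter errors above look benign on this host: `dpif-netdev/pmd-perf-show` and `dpif-netdev/pmd-rxq-show` only answer for a userspace (netdev/DPDK) datapath, and this node appears to run the kernel datapath. A hedged guard before polling PMD stats; `dpctl/dump-dps` is a standard ovs-appctl command, but the `netdev@` prefix check is an assumption about its output format:

    import subprocess

    # List datapaths known to ovs-vswitchd, e.g. "system@ovs-system".
    dps = subprocess.run(
        ["ovs-appctl", "dpctl/dump-dps"],
        capture_output=True, text=True, check=True,
    ).stdout.split()

    # PMD counters exist only for the userspace datapath (assumed "netdev@..." here).
    if any(dp.startswith("netdev@") for dp in dps):
        subprocess.run(["ovs-appctl", "dpif-netdev/pmd-perf-show"], check=True)
    else:
        print("kernel datapath only:", dps, "- skipping PMD queries")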
Feb 1 04:30:01 localhost podman[246783]: 2026-02-01 09:30:01.933212655 +0000 UTC m=+0.096358411 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:30:01 localhost podman[246783]: 2026-02-01 09:30:01.966088255 +0000 UTC m=+0.129233971 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:30:02 localhost nova_compute[225632]: 2026-02-01 09:30:02.149 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:30:03 localhost nova_compute[225632]: 2026-02-01 09:30:03.307 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:30:03 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:30:03 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:30:03 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 1 04:30:04 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:30:04 localhost podman[246805]: 2026-02-01 09:30:04.064187297 +0000 UTC m=+0.423082965 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:30:04 localhost podman[246805]: 2026-02-01 09:30:04.14232042 +0000 UTC m=+0.501216108 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:30:04 localhost sshd[246830]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:30:04 localhost systemd-logind[759]: New session 57 of user zuul. Feb 1 04:30:04 localhost systemd[1]: Started Session 57 of User zuul. Feb 1 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 1 04:30:05 localhost nova_compute[225632]: 2026-02-01 09:30:05.241 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:05 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:30:05 localhost python3.9[246943]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config/container-startup-config/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 1 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 1 04:30:05 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
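The file task above (and those that follow) creates the neutron-sriov-agent directories with setype=container_file_t, the SELinux type that podman bind mounts need. A rough non-Ansible equivalent using chcon instead of Ansible's selinux bindings; path, mode and ownership are taken from the task parameters:

    import os
    import shutil
    import subprocess

    path = "/var/lib/edpm-config/container-startup-config/neutron-sriov-agent"

    os.makedirs(path, mode=0o755, exist_ok=True)
    shutil.chown(path, user="zuul", group="zuul")
    # Label the directory so containers may access it under SELinux.
    subprocess.run(["chcon", "-t", "container_file_t", path], check=True)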
Feb 1 04:30:06 localhost python3.9[247053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:30:06 localhost python3.9[247163]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:30:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62675 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02C3920000000001030307) Feb 1 04:30:07 localhost python3.9[247271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:30:08 localhost nova_compute[225632]: 2026-02-01 09:30:08.310 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:08 localhost python3.9[247357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938207.213904-99-84584726839135/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:30:08 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 1 04:30:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62676 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02C7B90000000001030307) Feb 1 04:30:08 localhost systemd[1]: var-lib-containers-storage-overlay-a27148de7468c07eed6e32a3dde6476496248367a0cf1cb0c4fa7a063e687a70-merged.mount: Deactivated successfully. Feb 1 04:30:08 localhost systemd[1]: var-lib-containers-storage-overlay-a27148de7468c07eed6e32a3dde6476496248367a0cf1cb0c4fa7a063e687a70-merged.mount: Deactivated successfully. 
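The `_original_basename=neutron.conf.j2` below shows these config fragments are Jinja2 templates rendered on the Ansible controller (note the src under /home/zuul/.ansible/tmp) and then shipped with copy. A stand-alone sketch of that render-and-write step; the template body and variables are hypothetical stand-ins, only the destination path and hostname come from the log:

    from pathlib import Path
    from jinja2 import Template

    # Hypothetical minimal template standing in for neutron.conf.j2.
    template = Template("[DEFAULT]\ndebug = {{ debug }}\nhost = {{ host }}\n")

    rendered = template.render(debug=True, host="np0005604212.localdomain")
    Path("/var/lib/openstack/neutron-sriov-agent/01-neutron.conf").write_text(rendered)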
Feb 1 04:30:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3661 DF PROTO=TCP SPT=36744 DPT=9102 SEQ=1852379497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02C9B80000000001030307)
Feb 1 04:30:09 localhost python3.9[247465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:10 localhost nova_compute[225632]: 2026-02-01 09:30:10.295 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:10 localhost python3.9[247551]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938209.5460546-99-250279634828325/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:30:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62677 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02CFB80000000001030307)
Feb 1 04:30:10 localhost podman[247558]: 2026-02-01 09:30:10.743839086 +0000 UTC m=+0.096930429 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.7, build-date=2026-01-22T05:09:47Z, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public)
Feb 1 04:30:10 localhost podman[247558]: 2026-02-01 09:30:10.785484738 +0000 UTC m=+0.138576051 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z)
Feb 1 04:30:10 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 1 04:30:10 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 1 04:30:11 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 1 04:30:11 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:30:11 localhost python3.9[247679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19656 DF PROTO=TCP SPT=52224 DPT=9102 SEQ=1820219784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02D3B80000000001030307)
Feb 1 04:30:12 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 1 04:30:12 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 1 04:30:12 localhost sshd[247766]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:30:12 localhost systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 1 04:30:12 localhost python3.9[247765]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938210.7449872-99-2388257714326/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=1ed6ed4cf55266b0f1337092c3760ba3bb053253 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:13 localhost nova_compute[225632]: 2026-02-01 09:30:13.313 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:13 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 1 04:30:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:30:13 localhost systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 1 04:30:13 localhost podman[247785]: 2026-02-01 09:30:13.668534708 +0000 UTC m=+0.082552862 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:30:13 localhost podman[247785]: 2026-02-01 09:30:13.708486407 +0000 UTC m=+0.122504601 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:30:13 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:30:14 localhost systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 1 04:30:14 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 1 04:30:14 localhost systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 1 04:30:14 localhost python3.9[247894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62678 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02DF780000000001030307)
Feb 1 04:30:14 localhost python3.9[247980]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938213.7912602-273-70108558974004/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=a74956efcd0a6873aac81fb89a0017e3332e5948 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:15 localhost nova_compute[225632]: 2026-02-01 09:30:15.326 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:15 localhost python3.9[248088]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:30:16 localhost systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 1 04:30:16 localhost python3.9[248200]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:17 localhost python3.9[248310]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:17 localhost systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 1 04:30:17 localhost systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 1 04:30:17 localhost systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 1 04:30:17 localhost systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 1 04:30:17 localhost python3.9[248367]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:18 localhost python3.9[248477]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:18 localhost nova_compute[225632]: 2026-02-01 09:30:18.315 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:18 localhost systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 1 04:30:18 localhost systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 1 04:30:18 localhost python3.9[248534]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:19 localhost systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 1 04:30:19 localhost systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 1 04:30:19 localhost systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 1 04:30:19 localhost python3.9[248644]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:19 localhost systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 1 04:30:20 localhost python3.9[248754]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:20 localhost nova_compute[225632]: 2026-02-01 09:30:20.336 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:20 localhost systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 1 04:30:20 localhost python3.9[248811]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:20 localhost systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 1 04:30:20 localhost podman[236886]: @ - - [01/Feb/2026:09:25:43 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 142836 "" "Go-http-client/1.1"
Feb 1 04:30:20 localhost podman_exporter[236875]: ts=2026-02-01T09:30:20.996Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 1 04:30:20 localhost podman_exporter[236875]: ts=2026-02-01T09:30:20.996Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 1 04:30:20 localhost podman_exporter[236875]: ts=2026-02-01T09:30:20.996Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Feb 1 04:30:21 localhost podman[248930]: 2026-02-01 09:30:21.290967092 +0000 UTC m=+0.091446958 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, name=rhceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 04:30:21 localhost podman[248930]: 2026-02-01 09:30:21.386546267 +0000 UTC m=+0.187026083 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1764794109, vcs-type=git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z)
Feb 1 04:30:21 localhost python3.9[248972]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62679 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D02FFB80000000001030307)
Feb 1 04:30:23 localhost nova_compute[225632]: 2026-02-01 09:30:23.317 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:23 localhost python3.9[249160]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:23 localhost podman[236886]: time="2026-02-01T09:30:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:30:23 localhost podman[236886]: @ - - [01/Feb/2026:09:30:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144786 "" "Go-http-client/1.1"
Feb 1 04:30:23 localhost podman[236886]: @ - - [01/Feb/2026:09:30:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15908 "" "Go-http-client/1.1"
Feb 1 04:30:24 localhost python3.9[249272]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:30:24 localhost systemd[1]: Reloading.
Feb 1 04:30:24 localhost systemd-rc-local-generator[249294]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:30:24 localhost systemd-sysv-generator[249301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:25 localhost nova_compute[225632]: 2026-02-01 09:30:25.375 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:25 localhost python3.9[249419]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:26 localhost python3.9[249476]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:30:26 localhost systemd[1]: tmp-crun.LdA4ZJ.mount: Deactivated successfully.
Feb 1 04:30:26 localhost podman[249587]: 2026-02-01 09:30:26.697296473 +0000 UTC m=+0.100802418 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:30:26 localhost podman[249587]: 2026-02-01 09:30:26.710359088 +0000 UTC m=+0.113865033 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:30:26 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:30:26 localhost python3.9[249586]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:27 localhost python3.9[249667]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:28 localhost python3.9[249777]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:30:28 localhost systemd[1]: Reloading.
Feb 1 04:30:28 localhost nova_compute[225632]: 2026-02-01 09:30:28.319 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:28 localhost systemd-sysv-generator[249803]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:30:28 localhost systemd-rc-local-generator[249799]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:28 localhost systemd[1]: Starting Create netns directory...
Feb 1 04:30:28 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 04:30:28 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 04:30:28 localhost systemd[1]: Finished Create netns directory.
Feb 1 04:30:29 localhost python3.9[249930]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:30 localhost nova_compute[225632]: 2026-02-01 09:30:30.418 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:30 localhost python3.9[250040]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:30:31 localhost python3.9[250150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:31 localhost openstack_network_exporter[239441]: ERROR 09:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:30:31 localhost openstack_network_exporter[239441]:
Feb 1 04:30:31 localhost openstack_network_exporter[239441]: ERROR 09:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:30:31 localhost openstack_network_exporter[239441]:
Feb 1 04:30:31 localhost auditd[725]: Audit daemon rotating log files
Feb 1 04:30:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:30:31 localhost systemd[1]: tmp-crun.qMQqO7.mount: Deactivated successfully.
Feb 1 04:30:32 localhost podman[250239]: 2026-02-01 09:30:32.002902185 +0000 UTC m=+0.092149232 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:30:32 localhost podman[250239]: 2026-02-01 09:30:32.03520861 +0000 UTC m=+0.124455647 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:30:32 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:30:32 localhost python3.9[250238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938230.9177392-708-193378558280067/.source.json _original_basename=.bb4eixvf follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:32 localhost python3.9[250364]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:33 localhost nova_compute[225632]: 2026-02-01 09:30:33.321 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:30:34 localhost podman[250630]: 2026-02-01 09:30:34.732536524 +0000 UTC m=+0.089385048 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 04:30:34 localhost podman[250630]: 2026-02-01 09:30:34.743398115 +0000 UTC m=+0.100246579 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:30:34 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:30:35 localhost python3.9[250691]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Feb 1 04:30:35 localhost nova_compute[225632]: 2026-02-01 09:30:35.453 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:30:35 localhost systemd[1]: tmp-crun.yWLtyT.mount: Deactivated successfully.
Feb 1 04:30:35 localhost podman[250692]: 2026-02-01 09:30:35.719700114 +0000 UTC m=+0.080582908 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 1 04:30:35 localhost podman[250692]: 2026-02-01 09:30:35.782418808 +0000 UTC m=+0.143301602 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Feb 1 04:30:35 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:30:36 localhost python3.9[250826]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 1 04:30:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30380 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0338C20000000001030307)
Feb 1 04:30:38 localhost nova_compute[225632]: 2026-02-01 09:30:38.323 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30381 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D033CB80000000001030307)
Feb 1 04:30:38 localhost python3[250936]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json containers=['neutron_sriov_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 1 04:30:39 localhost podman[250972]:
Feb 1 04:30:39 localhost podman[250972]: 2026-02-01 09:30:39.243060724 +0000 UTC m=+0.083123946 container create 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:30:39 localhost podman[250972]: 2026-02-01 09:30:39.20159998 +0000 UTC m=+0.041663202 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 1 04:30:39 localhost python3[250936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 1 04:30:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62680 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D033FB80000000001030307)
Feb 1 04:30:40 localhost python3.9[251120]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:30:40 localhost nova_compute[225632]: 2026-02-01 09:30:40.493 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30382 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0344B80000000001030307)
Feb 1 04:30:41 localhost python3.9[251232]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:30:41 localhost podman[251288]: 2026-02-01 09:30:41.4135813 +0000 UTC m=+0.091396559 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.7, release=1769056855, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2026-01-22T05:09:47Z)
Feb 1 04:30:41 localhost podman[251288]: 2026-02-01 09:30:41.42898065 +0000 UTC m=+0.106795939 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.7, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Feb 1 04:30:41 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:30:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3662 DF PROTO=TCP SPT=36744 DPT=9102 SEQ=1852379497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0347B90000000001030307)
Feb 1 04:30:41 localhost python3.9[251287]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:30:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:30:41.689 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:30:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:30:41.689 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:30:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:30:41.691 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:30:42 localhost python3.9[251416]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769938241.578926-942-178339222057139/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:42 localhost python3.9[251471]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 1 04:30:42 localhost systemd[1]: Reloading.
Feb 1 04:30:42 localhost systemd-rc-local-generator[251496]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:30:42 localhost systemd-sysv-generator[251501]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
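The three oslo_concurrency.lockutils DEBUG lines above (Acquiring, acquired after a wait, released after a hold time) are emitted by the lock wrapper around ProcessMonitor._check_child_processes. A minimal sketch that produces the same pattern, assuming oslo.concurrency is installed; the function body is a placeholder:

    import logging
    from oslo_concurrency import lockutils

    # DEBUG logging makes lockutils print the Acquiring/acquired/released lines.
    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('_check_child_processes')
    def check_child_processes():
        # Placeholder for the monitored work; the lock serializes concurrent callers.
        pass

    check_child_processes()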
Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:43 localhost nova_compute[225632]: 2026-02-01 09:30:43.328 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:43 localhost python3.9[251562]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:30:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:30:43 localhost systemd[1]: Reloading. Feb 1 04:30:43 localhost podman[251564]: 2026-02-01 09:30:43.899786325 +0000 UTC m=+0.086965634 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:30:43 localhost podman[251564]: 2026-02-01 09:30:43.911620527 +0000 UTC m=+0.098799836 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:30:43 localhost systemd-rc-local-generator[251607]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:30:43 localhost systemd-sysv-generator[251615]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:30:44 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
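The reload warnings repeated above are compatibility noise: unit files shipping Type=notify-reload, which this systemd does not yet parse, and the deprecated MemoryLimit= directive. A small illustrative scan that lists the offending units without grepping by hand; the paths and directives come from the log, the script itself is an assumption about how one might check:

    import glob
    import re

    # Directives flagged by systemd in the log above.
    PATTERNS = [re.compile(r"^Type=notify-reload"), re.compile(r"^MemoryLimit=")]

    for path in sorted(glob.glob("/usr/lib/systemd/system/*.service")):
        try:
            with open(path) as f:
                for lineno, line in enumerate(f, start=1):
                    for pat in PATTERNS:
                        if pat.match(line.strip()):
                            print(f"{path}:{lineno}: {line.strip()}")
        except OSError:
            # Skip unreadable unit files.
            continue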
Feb 1 04:30:44 localhost systemd[1]: Starting neutron_sriov_agent container... Feb 1 04:30:44 localhost systemd[1]: Started libcrun container. Feb 1 04:30:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/191d24994a22f446d2784d4764d8c3c519e4f3438b79f524864d71cf490cf19d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 1 04:30:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/191d24994a22f446d2784d4764d8c3c519e4f3438b79f524864d71cf490cf19d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:30:44 localhost podman[251623]: 2026-02-01 09:30:44.340933732 +0000 UTC m=+0.131540004 container init 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=neutron_sriov_agent) Feb 1 04:30:44 localhost podman[251623]: 2026-02-01 09:30:44.348430041 +0000 UTC m=+0.139036313 container start 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, tcib_managed=true)
Feb 1 04:30:44 localhost podman[251623]: neutron_sriov_agent
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + sudo -E kolla_set_configs
Feb 1 04:30:44 localhost systemd[1]: Started neutron_sriov_agent container.
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Validating config file
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Copying service configuration files
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Writing out command to execute
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: ++ cat /run_command
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + ARGS=
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + sudo kolla_copy_cacerts
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + [[ ! -n '' ]]
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + . kolla_extend_start
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + umask 0022
Feb 1 04:30:44 localhost neutron_sriov_agent[251638]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 1 04:30:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30383 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0354780000000001030307)
Feb 1 04:30:45 localhost python3.9[251760]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 1 04:30:45 localhost nova_compute[225632]: 2026-02-01 09:30:45.498 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 04:30:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 5701 writes, 25K keys, 5701 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5701 writes, 740 syncs, 7.70 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:45.999 2 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:45.999 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.000 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.000 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.000 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.000 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.000 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005604212.localdomain'}#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.001 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-39d78afb-701a-429e-889d-12d7eb45a752 - - - - - -] RPC agent_id: nic-switch-agent.np0005604212.localdomain#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.005 2 INFO neutron.agent.agent_extensions_manager [None req-39d78afb-701a-429e-889d-12d7eb45a752 - - - - - -] Loaded agent extensions: ['qos']#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.005 2 INFO neutron.agent.agent_extensions_manager [None req-39d78afb-701a-429e-889d-12d7eb45a752 - - - - - -] Initializing agent extension 'qos'#033[00m
Feb 1 04:30:46 localhost python3.9[251871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.447 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-39d78afb-701a-429e-889d-12d7eb45a752 - - - - - -] Agent initialized successfully, now running... #033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.448 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-39d78afb-701a-429e-889d-12d7eb45a752 - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m
Feb 1 04:30:46 localhost neutron_sriov_agent[251638]: 2026-02-01 09:30:46.448 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-39d78afb-701a-429e-889d-12d7eb45a752 - - - - - -] Agent out of sync with plugin!#033[00m
Feb 1 04:30:47 localhost python3.9[251961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938245.9385736-1077-266283572214234/.source.yaml _original_basename=._jhikekb follow=False checksum=b3cbbb2fba8ac1ae44c39a232429364988d5d801 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:30:47 localhost python3.9[252071]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 1 04:30:47 localhost systemd[1]: Stopping neutron_sriov_agent container...
Feb 1 04:30:48 localhost systemd[1]: libpod-32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd.scope: Deactivated successfully.
Feb 1 04:30:48 localhost systemd[1]: libpod-32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd.scope: Consumed 1.750s CPU time.
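The INFO lines from kolla_set_configs above show the COPY_ALWAYS strategy: load /var/lib/kolla/config_files/config.json, copy each source into place (deleting the destination first), fix permissions, then write out the command the container will exec. A rough sketch of that copy step, assuming a config.json with kolla's 'command' and 'config_files' layout; this illustrates the behaviour observed in the log, not kolla's actual code:

    import json
    import os
    import shutil

    CONFIG = "/var/lib/kolla/config_files/config.json"

    with open(CONFIG) as f:
        cfg = json.load(f)

    for entry in cfg.get("config_files", []):
        source, dest = entry["source"], entry["dest"]
        if os.path.exists(dest):
            print(f"Deleting {dest}")
            os.remove(dest)
        print(f"Copying {source} to {dest}")
        shutil.copy(source, dest)
        if "perm" in entry:
            print(f"Setting permission for {dest}")
            os.chmod(dest, int(entry["perm"], 8))

    # The container then execs the command recorded in the config.
    print("Command:", cfg.get("command"))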
Feb 1 04:30:48 localhost podman[252075]: 2026-02-01 09:30:48.081478557 +0000 UTC m=+0.105519869 container died 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 1 04:30:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd-userdata-shm.mount: Deactivated successfully.
Feb 1 04:30:48 localhost systemd[1]: var-lib-containers-storage-overlay-191d24994a22f446d2784d4764d8c3c519e4f3438b79f524864d71cf490cf19d-merged.mount: Deactivated successfully.
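Every podman event above carries the container's full label set (config_data, managed_by, tcib_build_tag, and so on). The same metadata can be read back with `podman inspect`; a short sketch, using the container name from the log and assuming the podman CLI is available:

    import json
    import subprocess

    def container_labels(name: str) -> dict:
        """Return the label map podman stores for a container."""
        out = subprocess.run(
            ["podman", "inspect", name],
            capture_output=True, text=True, check=True,
        ).stdout
        # `podman inspect` prints a JSON array with one object per container.
        return json.loads(out)[0]["Config"]["Labels"]

    labels = container_labels("neutron_sriov_agent")
    print(labels.get("managed_by"), labels.get("config_id"))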
Feb 1 04:30:48 localhost podman[252075]: 2026-02-01 09:30:48.134868445 +0000 UTC m=+0.158909677 container cleanup 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 1 04:30:48 localhost podman[252075]: neutron_sriov_agent Feb 1 04:30:48 localhost podman[252101]: 2026-02-01 09:30:48.216832265 +0000 UTC m=+0.049710657 container cleanup 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:30:48 localhost podman[252101]: neutron_sriov_agent Feb 1 04:30:48 localhost systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully. Feb 1 04:30:48 localhost systemd[1]: Stopped neutron_sriov_agent container. Feb 1 04:30:48 localhost systemd[1]: Starting neutron_sriov_agent container... 
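The EDPM_CONFIG_HASH value in config_data looks like a digest of the rendered service configuration, which would let the deployer recreate the container only when the configuration changes; the stop/start cycle above directly follows the config write to /etc/systemd/system. A hypothetical sketch of computing such a hash over a config directory; the helper name and the exact inputs edpm_ansible hashes are assumptions:

    import hashlib
    import os

    def config_hash(config_dir: str) -> str:
        """Hash all files under config_dir in a stable order (illustrative only)."""
        digest = hashlib.sha256()
        for root, _dirs, files in sorted(os.walk(config_dir)):
            for name in sorted(files):
                path = os.path.join(root, name)
                digest.update(path.encode())
                with open(path, "rb") as f:
                    digest.update(f.read())
        return digest.hexdigest()

    # A changed hash would signal that the container needs recreating.
    print(config_hash("/var/lib/openstack/neutron-sriov-agent"))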
Feb 1 04:30:48 localhost nova_compute[225632]: 2026-02-01 09:30:48.330 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:48 localhost systemd[1]: Started libcrun container. Feb 1 04:30:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/191d24994a22f446d2784d4764d8c3c519e4f3438b79f524864d71cf490cf19d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 1 04:30:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/191d24994a22f446d2784d4764d8c3c519e4f3438b79f524864d71cf490cf19d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:30:48 localhost podman[252112]: 2026-02-01 09:30:48.380120256 +0000 UTC m=+0.130024757 container init 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:30:48 localhost podman[252112]: 2026-02-01 09:30:48.38943655 +0000 UTC m=+0.139341041 container start 32e657149a91af46b3e03b1064d9ffb3db90c6dcf3a47a09d7da3a301e7bbcbd (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-487d4d568470de760827e68374a2d62d19f271183e2324fe8971cba1e28cfaa0'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:30:48 localhost podman[252112]: neutron_sriov_agent
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + sudo -E kolla_set_configs
Feb 1 04:30:48 localhost systemd[1]: Started neutron_sriov_agent container.
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Validating config file
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Copying service configuration files
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Writing out command to execute
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: ++ cat /run_command
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + ARGS=
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + sudo kolla_copy_cacerts
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + [[ ! -n '' ]]
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + . kolla_extend_start
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + umask 0022
Feb 1 04:30:48 localhost neutron_sriov_agent[252126]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 1 04:30:48 localhost systemd[1]: session-57.scope: Deactivated successfully.
Feb 1 04:30:48 localhost systemd[1]: session-57.scope: Consumed 23.185s CPU time.
Feb 1 04:30:48 localhost systemd-logind[759]: Session 57 logged out. Waiting for processes to exit.
Feb 1 04:30:48 localhost systemd-logind[759]: Removed session 57.
Feb 1 04:30:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 04:30:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 4896 writes, 22K keys, 4896 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4896 writes, 685 syncs, 7.15 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.065 2 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.065 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.065 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.066 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.066 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.066 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.066 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005604212.localdomain'}#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.066 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a8ebc3eb-1775-4692-9291-77e443c71c17 - - - - - -] RPC agent_id: nic-switch-agent.np0005604212.localdomain#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.071 2 INFO neutron.agent.agent_extensions_manager [None req-a8ebc3eb-1775-4692-9291-77e443c71c17 - - - - - -] Loaded agent extensions: ['qos']#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.071 2 INFO neutron.agent.agent_extensions_manager [None req-a8ebc3eb-1775-4692-9291-77e443c71c17 - - - - - -] Initializing agent extension 'qos'#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.199 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a8ebc3eb-1775-4692-9291-77e443c71c17 - - - - - -] Agent initialized successfully, now running... #033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.199 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a8ebc3eb-1775-4692-9291-77e443c71c17 - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m
Feb 1 04:30:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:30:50.199 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a8ebc3eb-1775-4692-9291-77e443c71c17 - - - - - -] Agent out of sync with plugin!#033[00m
Feb 1 04:30:50 localhost nova_compute[225632]: 2026-02-01 09:30:50.543 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30384 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0375B80000000001030307)
Feb 1 04:30:53 localhost nova_compute[225632]: 2026-02-01 09:30:53.331 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:53 localhost podman[236886]: time="2026-02-01T09:30:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:30:53 localhost podman[236886]: @ - - [01/Feb/2026:09:30:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146787 "" "Go-http-client/1.1"
Feb 1 04:30:53 localhost podman[236886]: @ - - [01/Feb/2026:09:30:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16354 "" "Go-http-client/1.1"
Feb 1 04:30:54 localhost sshd[252159]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:30:54 localhost nova_compute[225632]: 2026-02-01 09:30:54.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:30:54 localhost systemd-logind[759]: New session 58 of user zuul.
Feb 1 04:30:54 localhost systemd[1]: Started Session 58 of User zuul.
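The kernel DROPPING lines are netfilter LOG output (log-prefix "DROPPING: ") recording a SYN to port 9102 on br-ex being dropped; the repeated SEQ and SPT values show the same sender retransmitting. A small parser for the key=value fields of such a line; the sample is copied from the log above:

    import re

    SAMPLE = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
              "MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 "
              "DST=192.168.122.106 LEN=60 PROTO=TCP SPT=38370 DPT=9102")

    def parse_netfilter(line: str) -> dict:
        """Extract KEY=VALUE pairs from a netfilter LOG line (empty values are skipped)."""
        return dict(re.findall(r"(\w+)=(\S+)", line))

    fields = parse_netfilter(SAMPLE)
    print(fields["SRC"], "->", fields["DST"], "dport", fields["DPT"])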
Feb 1 04:30:55 localhost python3.9[252270]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:30:55 localhost nova_compute[225632]: 2026-02-01 09:30:55.547 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:30:56 localhost python3.9[252384]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:30:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:30:57 localhost systemd[1]: tmp-crun.SGW9Iw.mount: Deactivated successfully. Feb 1 04:30:57 localhost podman[252393]: 2026-02-01 09:30:57.742414779 +0000 UTC m=+0.097005250 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:30:57 localhost podman[252393]: 2026-02-01 09:30:57.754299351 +0000 UTC m=+0.108889862 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:30:57 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
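podman_exporter above publishes Prometheus metrics for the local podman socket on port 9882 (the ports entry in its config_data). A quick scrape of that endpoint, assuming the exporter is reachable on localhost as configured:

    import urllib.request

    URL = "http://localhost:9882/metrics"

    with urllib.request.urlopen(URL, timeout=5) as resp:
        body = resp.read().decode()

    # Print only the podman_* samples, skipping comments and other metric families.
    for line in body.splitlines():
        if line.startswith("podman_"):
            print(line)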
Feb 1 04:30:58 localhost nova_compute[225632]: 2026-02-01 09:30:58.334 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:30:58 localhost nova_compute[225632]: 2026-02-01 09:30:58.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:30:58 localhost nova_compute[225632]: 2026-02-01 09:30:58.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:30:58 localhost python3.9[252470]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 1 04:30:59 localhost nova_compute[225632]: 2026-02-01 09:30:59.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:30:59 localhost nova_compute[225632]: 2026-02-01 09:30:59.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:30:59 localhost nova_compute[225632]: 2026-02-01 09:30:59.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:31:00 localhost nova_compute[225632]: 2026-02-01 09:31:00.479 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:31:00 localhost nova_compute[225632]: 2026-02-01 09:31:00.480 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:31:00 localhost nova_compute[225632]: 2026-02-01 09:31:00.480 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:31:00 localhost nova_compute[225632]: 2026-02-01 09:31:00.480 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -]
Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:31:00 localhost nova_compute[225632]: 2026-02-01 09:31:00.599 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.288 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.306 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.307 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.307 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.307 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.308 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.308 225636 DEBUG 
nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.308 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.327 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.327 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.328 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.328 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.329 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:31:01 localhost openstack_network_exporter[239441]: ERROR 09:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:31:01 localhost openstack_network_exporter[239441]: Feb 1 04:31:01 localhost openstack_network_exporter[239441]: ERROR 09:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:31:01 localhost openstack_network_exporter[239441]: Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.796 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.870 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 
04:31:01 localhost nova_compute[225632]: 2026-02-01 09:31:01.870 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.093 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.094 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12082MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.095 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.095 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.191 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.191 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.192 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.239 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:31:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.649 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.654 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.673 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.674 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:31:02 localhost nova_compute[225632]: 2026-02-01 09:31:02.674 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:31:02 localhost podman[252587]: 2026-02-01 09:31:02.700380289 +0000 UTC m=+0.056757103 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:31:02 localhost podman[252587]: 2026-02-01 09:31:02.729587939 +0000 UTC m=+0.085964723 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true) Feb 1 04:31:02 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
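Everything ceilometer_agent_compute logs from here on is a single failure repeated once per polled sample: the notifier cannot open an AMQP connection (ECONNREFUSED, errno 111), kombu re-raises the socket error as its library-level OperationalError, and oslo.messaging drops the notification. The bottom of each traceback can be reproduced with kombu alone; a minimal sketch, assuming kombu is installed and that nothing is listening at the broker URL below (the URL is hypothetical, chosen only so the connection is refused):

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # connect() hits ECONNREFUSED; _reraise_as_library_errors() converts it
    # with `raise ConnectionError(str(exc)) from exc`, where ConnectionError
    # is bound to kombu.exceptions.OperationalError -- hence the final
    # "kombu.exceptions.OperationalError: [Errno 111] Connection refused"
    # line that terminates each traceback below.
    conn = Connection('amqp://guest:guest@127.0.0.1:5672//', connect_timeout=2)
    try:
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print('sample would be dropped:', exc)
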
Feb 1 04:31:03 localhost python3.9[252644]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 1 04:31:03 localhost nova_compute[225632]: 2026-02-01 09:31:03.337 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.520 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.521 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.545 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 58200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0baaed10-93cb-4cf2-825f-09e024b80585', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58200000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:31:03.522025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ba3c388e-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.764732209, 'message_signature': '2ab861f0e3682920a18551efcd2ce950b6a868d4da2694b709f68a28daaa1a6d'}]}, 'timestamp': '2026-02-01 09:31:03.546363', '_unique_id': 'c6efc241b0ac4b189312256b48684cd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.547 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.552 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e10fdb17-2c6d-4db2-9ce0-1a2214dab20f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.549208', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba3d3748-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '5d88b80faecfb1ef2c3e1800b6b31370b7ff9f5756ba226bacef841f11cd88ad'}]}, 'timestamp': '2026-02-01 09:31:03.552916', '_unique_id': 'f922425623654cdab30e79e28b53214d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.554 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.555 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.555 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 8782 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3f319ab6-3213-4115-ab5a-dfb49e4bf50b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8782, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.555510', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba3db218-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': 'a1823b85b7ba63c6f4f9ccb7fc20293a9ffd1434cd9341ae50a44fda2c90f914'}]}, 'timestamp': '2026-02-01 09:31:03.556023', '_unique_id': '54785b1223754c6dac5af781946a26f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.557 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.558 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.583 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 197023361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 24174444 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '844f3652-8044-46e3-8ea2-47867247d20a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197023361, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.558277', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba4211b4-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': 'dab435ffc0af08a7b8ac1228b094f28e034d8e53d2710ffeaa4d3d935e8dd156'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24174444, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.558277', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba422654-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '747371f32f4c2ace5a3c292ecd8ca67a604659627620e0f117f9c1adff925ba5'}]}, 'timestamp': '2026-02-01 09:31:03.585203', '_unique_id': '08472516605a49ffa42ec6d59ee56411'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.586 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.586 12 ERROR oslo_messaging.notify.messaging
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
  ConnectionRefusedError: [Errno 111] Connection refused

  The above exception was the direct cause of the following exception:

  Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
  kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.588 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.588 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '505623df-42aa-4455-a7aa-958d21d35b85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.588330', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba42b4fc-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '969323bcb000b37fff4344b83855f8bd67df53f88f29c55bb7688320502e6c7a'}]}, 'timestamp': '2026-02-01 09:31:03.588865', '_unique_id': 'f536f024b0594bd9a34b010e329d8bb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging
  Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.590 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.591 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd0ad85b7-f7f8-42e7-813b-d1535cb46830', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.591055', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba431e60-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '80ae7cf75b72d63135ae6251ca9b025c3333804fd5badda96544deda357d8067'}]}, 'timestamp': '2026-02-01 09:31:03.591520', '_unique_id': '31d1a4083542485db39f8fbe5e41d36d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.592 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.593 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.593 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 52.3984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a89d9937-9c94-4e97-9290-256a95f7ddfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3984375, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:31:03.593729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ba4386ac-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.764732209, 'message_signature': 'bf4104826e5ad6f08111dec039aa4da206f53069612fa50c9dc9968d953236a7'}]}, 'timestamp': '2026-02-01 09:31:03.594206', '_unique_id': 'b5d9a931ff8248b0bc124be3c762a804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.596 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.598 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52ab80c2-bf6d-4e24-b60d-65717972c36f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.596786', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba43ff92-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': 'ab25123c4cc9f792f5ff06a459e37ed2425835e29672c4f7dc5f8bf003346e7b'}]}, 'timestamp': '2026-02-01 09:31:03.597290', '_unique_id': 'aa5a45abccd041e5a44074db03177850'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
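Every failed notification in this window bottoms out in the same call: kombu's ensure_connection() (kombu/connection.py:381) cannot open a TCP socket to the AMQP broker and re-raises the socket error as kombu.exceptions.OperationalError. A minimal reproduction sketch, not part of the log, that probes the same code path; the broker URL below is an assumption, since the agent's real transport_url lives in its oslo.messaging configuration:

    # Hypothetical probe of the broker endpoint the agent keeps failing to reach.
    from kombu import Connection
    from kombu.exceptions import OperationalError

    AMQP_URL = "amqp://guest:guest@localhost:5672//"  # assumed, not from the log

    try:
        with Connection(AMQP_URL, connect_timeout=5) as conn:
            # Same call the traceback shows failing (kombu/connection.py:381);
            # socket errors are wrapped as kombu.exceptions.OperationalError.
            conn.ensure_connection(max_retries=1)
            print("broker reachable")
    except OperationalError as exc:
        print(f"broker unreachable: {exc}")  # e.g. [Errno 111] Connection refused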
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '580bd3fb-0c1d-4e06-b920-22f818256cab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.599424', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba446540-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '040a4d99d9e3d344b0f6666f503e1d2351320baa75ab2c7745c8bd0e2fda1b57'}]}, 'timestamp': '2026-02-01 09:31:03.599887', '_unique_id': '7abe7a03694543028578881fee534cd3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging
  Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
  ConnectionRefusedError: [Errno 111] Connection refused

  The above exception was the direct cause of the following exception:

  Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.600 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 572 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
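The chained traceback above is the whole anatomy of each failed publish: amqp's socket connect raises ConnectionRefusedError, and kombu's _reraise_as_library_errors context manager re-raises it as kombu.exceptions.OperationalError via raise ConnectionError(str(exc)) from exc (connection.py:450), which is why the log shows two tracebacks joined by "The above exception was the direct cause of the following exception". A minimal sketch that reproduces the same wrapping, assuming kombu is installed and that nothing is listening at the placeholder broker URL below (the agent's actual transport_url is not shown in this excerpt):

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Placeholder broker URL; the point is that nothing listens on the port.
    conn = Connection('amqp://guest:guest@localhost:5672//', connect_timeout=2)
    try:
        # Same entry point the traceback shows impl_rabbit.py calling.
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        # The original socket error survives as __cause__ because kombu
        # raises the library-level error with "from exc".
        print(type(exc).__name__, '<-', type(exc.__cause__).__name__)
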
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2d04edc-86f5-46f3-8a7e-697ca3c86d46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 572, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.602033', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba44cb02-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '38f30510b7aaeed660ebeb276cac3f97a0a95f1bce77376a8d8b66bd75fc3aeb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.602033', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba44daf2-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': 'a7684ae1e55f30fa4ec3c6fe72da3c5693df2d827cde7889e863aebcf439ba55'}]}, 'timestamp': '2026-02-01 09:31:03.602866', '_unique_id': 'e62e3dbf38dd444fbf6efa030a9733c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.604 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c23eb1d9-3eb2-499f-8b4e-929342a079ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.605195', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba47521e-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.824650228, 'message_signature': 'deeabf9b6d90213c913e94572530a32634657f16d41c52eca600f7495db210f8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.605195', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba47659c-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.824650228, 'message_signature': 'aa46852cfe26a73d87c3efd6d89f64a077cd7970eb1ed2c52746f321c69465cb'}]}, 'timestamp': '2026-02-01 09:31:03.619538', '_unique_id': '9ab58c0118ba4ae0ad12a6dfa44cba68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.622 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1227122553 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 165637656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad8511e7-ebb9-4b12-8d79-4141dded5ba8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1227122553, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.622172', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba47de64-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '562e2f4ffb623330c54de61b273d70b00a099cd791198808b64262b6cbaf4b7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165637656, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.622172', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba47eefe-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '72b114eddc939bb9cbab8c4fee19013dff7e704a82880aafef97cad039574322'}]}, 'timestamp': '2026-02-01 09:31:03.623075', '_unique_id': 'adf328cd7e8f4c4ab72ab339c0b42d33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.625 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.625 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e871dc2-f05c-4264-8e16-dcac0e1566f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.625884', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba48711c-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '730c356b08ac880a0994313b9044cedf98cdf9f3e410e2f851275ef5acdb8762'}]}, 'timestamp': '2026-02-01 09:31:03.626415', '_unique_id': '00e104f0a0a84f9aa3f42ca477c11301'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.628 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e65a565c-9251-47f0-be06-d9713abd1596', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.628669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba48dd46-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '17835e3d31cd2701d8953dcbfe8cf55f4990b8af1d08bb43bbbe686b76e03d09'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.628669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba48ef3e-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '7c0d43029dbd7c264be069b7bba85bc6917187d512993638f61e1f1ea2c2639d'}]}, 'timestamp': '2026-02-01 09:31:03.629645', '_unique_id': '675fa0108ec0410aa267530862017518'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.631 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a75b4c4c-4b4c-4993-97a2-44eaf7c4fedc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.632085', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba496176-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.824650228, 'message_signature': 'db42da490804cdbbb6ba44c1776949aff4cc93120c90bd72ff1811d9bf4d38bc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.632085', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba49730a-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.824650228, 'message_signature': '3159e5b16eac789ec941a3f0dfffdee7a3f81d45cb970f4f27e40c5053ee1fd3'}]}, 'timestamp': '2026-02-01 09:31:03.632980', '_unique_id': '12b26b2446d744e193b00981cdadf77e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.633 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e60010c5-207d-4fbd-ad9d-3c2f9fae1306', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.635210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba49dce6-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '5ada57bccce630bbd8dbfd0f274f24ec0f2cddc0284d8085ddabb8c5a1525ffc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.635210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba49ed80-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '23c770f3d9334fbce4a6fd03a2fba3e00c859ebb7e20b7ebbcedd0b3176e1a37'}]}, 'timestamp': '2026-02-01 09:31:03.636148', '_unique_id': '4121f9d5da5b48b6bc5088e755562b95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '626eabf6-605f-49b4-94f4-5128d5a55ee5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.638338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba4a5504-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': 'b727a5a5bd59d6ea5992f1343bc7486c03190da257a056a44c8a8fc52cbcd3ce'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.638338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba4a64d6-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.777756857, 'message_signature': '06bdc3a5c93018ca262e91252f1dad42f1d8a90ee0f163d5db3181b7f10d5e00'}]}, 'timestamp': '2026-02-01 09:31:03.639193', '_unique_id': '43a98c7bcb5a4aa9b019adb3451b0cc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.640 12 ERROR oslo_messaging.notify.messaging Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f15f6791-b25d-4758-9d2c-62d777934f11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.641369', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba4acb9c-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '976ca9cb2235755bb7634f38e221c84866c48c61caa7991ade2739e4b2148b7f'}]}, 'timestamp': '2026-02-01 09:31:03.641825', '_unique_id': 'a69c6189628348bf8bf3b6923f74a4d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.642 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.643 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.643 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce287df8-dd5f-4a52-8fdc-d0745ed4dc9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.643878', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba4b2eca-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': 'f7886f39dd560a233e5c51713870ec70cd2f1ba94ac9cc149394efa8863639e6'}]}, 'timestamp': '2026-02-01 09:31:03.644365', '_unique_id': 'fde9fb3360a344a693fa2be287ae2fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.645 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.646 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.646 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.647 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef19be3c-ad10-4c39-941b-cc42a104d3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:31:03.646689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ba4b9bb2-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.824650228, 'message_signature': '3b5fef0c28908d46a5afed55125d4cfcc289f79f89f72998a984f9cfbe93370d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:31:03.646689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ba4bacec-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.824650228, 'message_signature': 'e3aafdc42602336c79aad699a24f5e286f146e31dd761c536d5f3c8e7b9ed248'}]}, 'timestamp': '2026-02-01 09:31:03.647567', '_unique_id': '237e1f868c9a42b3800e1a80e06def01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.648 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.649 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '219fd7b8-1065-4aaa-a793-71bc87577b88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:31:03.649723', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'ba4c11b4-ff50-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10357.76866327, 'message_signature': '537acab866254ad217648b14ac758fff42a795e16a398c9391867c25a80407f2'}]}, 'timestamp': '2026-02-01 09:31:03.650199', '_unique_id': '021dfff7d3694b268707b00dcabcfe5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.651 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:31:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:31:03.652 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:31:04 localhost nova_compute[225632]: 2026-02-01 09:31:04.670 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:31:05 localhost podman[252758]: 2026-02-01 09:31:05.094221446 +0000 UTC m=+0.084554010 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:31:05 localhost podman[252758]: 2026-02-01 09:31:05.104062267 +0000 UTC m=+0.094394851 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:31:05 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:31:05 localhost python3.9[252757]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/container-startup-config setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:05 localhost nova_compute[225632]: 2026-02-01 09:31:05.631 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:05 localhost python3.9[252890]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:31:06 localhost podman[253000]: 2026-02-01 09:31:06.418975064 +0000 UTC m=+0.080076754 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:31:06 localhost podman[253000]: 2026-02-01 09:31:06.461581303 +0000 UTC m=+0.122683013 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 1 04:31:06 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:31:06 localhost python3.9[253001]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:07 localhost python3.9[253135]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52785 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03ADF10000000001030307)
Feb 1 04:31:07 localhost python3.9[253245]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:08 localhost nova_compute[225632]: 2026-02-01 09:31:08.340 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:08 localhost python3.9[253355]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52786 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03B1F80000000001030307)
Feb 1 04:31:09 localhost python3.9[253465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30385 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03B5B80000000001030307)
Feb 1 04:31:10 localhost python3.9[253575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:10 localhost nova_compute[225632]: 2026-02-01 09:31:10.668 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52787 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03B9F80000000001030307)
Feb 1 04:31:10 localhost python3.9[253663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938269.4832323-273-194536792210161/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=472c5e922ae22c8bdcaef73d1ca73ce5597b440e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:11 localhost python3.9[253771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:31:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62681 DF PROTO=TCP SPT=58900 DPT=9102 SEQ=2297373858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03BDB80000000001030307)
Feb 1 04:31:11 localhost systemd[1]: tmp-crun.vE5k6u.mount: Deactivated successfully.
Feb 1 04:31:11 localhost podman[253788]: 2026-02-01 09:31:11.711812297 +0000 UTC m=+0.076383591 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7)
Feb 1 04:31:11 localhost podman[253788]: 2026-02-01 09:31:11.749523328 +0000 UTC m=+0.114094642 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:31:11 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:31:12 localhost python3.9[253876]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938271.0811822-318-259689374900287/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:13 localhost nova_compute[225632]: 2026-02-01 09:31:13.343 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:13 localhost python3.9[253984]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:14 localhost python3.9[254070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938272.255356-318-11251225514443/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:31:14 localhost podman[254175]: 2026-02-01 09:31:14.730185535 +0000 UTC m=+0.088492901 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 1 04:31:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52788 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03C9B80000000001030307)
Feb 1 04:31:14 localhost podman[254175]: 2026-02-01 09:31:14.765159321 +0000 UTC m=+0.123466717 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 1 04:31:14 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:31:14 localhost python3.9[254184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:15 localhost nova_compute[225632]: 2026-02-01 09:31:15.670 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:16 localhost python3.9[254283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938274.392514-318-171088833973134/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=67f34d725bddf2d21c25504a47441e09a04b8660 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:17 localhost python3.9[254391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:17 localhost python3.9[254477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938276.9572651-492-91612403062778/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=a74956efcd0a6873aac81fb89a0017e3332e5948 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:18 localhost nova_compute[225632]: 2026-02-01 09:31:18.344 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:18 localhost python3.9[254585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:19 localhost python3.9[254671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938278.2578065-537-80856039575960/.source follow=False _original_basename=haproxy.j2 checksum=eddfecb822bb60e7241db0fd719c7552d2d25452 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:19 localhost python3.9[254779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:20 localhost python3.9[254865]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938279.3992722-537-6082274045691/.source follow=False _original_basename=dnsmasq.j2 checksum=a6b8b2fb47e7419d250eaee9e3565b13fff8f42e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:20 localhost nova_compute[225632]: 2026-02-01 09:31:20.673 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:21 localhost python3.9[254973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:21 localhost python3.9[255028]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:22 localhost python3.9[255136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:22 localhost python3.9[255222]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938281.7450502-624-231231362679051/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52789 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D03E9B90000000001030307)
Feb 1 04:31:23 localhost nova_compute[225632]: 2026-02-01 09:31:23.345 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:23 localhost python3.9[255366]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:31:23 localhost podman[236886]: time="2026-02-01T09:31:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:31:23 localhost podman[236886]: @ - - [01/Feb/2026:09:31:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146787 "" "Go-http-client/1.1"
Feb 1 04:31:24 localhost podman[236886]: @ - - [01/Feb/2026:09:31:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16355 "" "Go-http-client/1.1"
Feb 1 04:31:24 localhost python3.9[255510]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:25 localhost python3.9[255638]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:25 localhost python3.9[255695]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:25 localhost nova_compute[225632]: 2026-02-01 09:31:25.727 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:26 localhost python3.9[255805]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:26 localhost python3.9[255862]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 1 04:31:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:31:27 localhost systemd[1]: tmp-crun.Fd5QWy.mount: Deactivated successfully.
Feb 1 04:31:27 localhost podman[255972]: 2026-02-01 09:31:27.989468903 +0000 UTC m=+0.094158584 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:31:28 localhost podman[255972]: 2026-02-01 09:31:28.005967686 +0000 UTC m=+0.110657327 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:31:28 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:31:28 localhost python3.9[255973]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:31:28 localhost nova_compute[225632]: 2026-02-01 09:31:28.348 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:28 localhost python3.9[256105]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:29 localhost python3.9[256162]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:31:30 localhost python3.9[256272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:30 localhost nova_compute[225632]: 2026-02-01 09:31:30.766 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:30 localhost python3.9[256329]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:31:31 localhost openstack_network_exporter[239441]: ERROR 09:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:31:31 localhost openstack_network_exporter[239441]:
Feb 1 04:31:31 localhost openstack_network_exporter[239441]: ERROR 09:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:31:31 localhost openstack_network_exporter[239441]:
Feb 1 04:31:32 localhost python3.9[256439]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:31:32 localhost systemd[1]: Reloading.
Feb 1 04:31:32 localhost systemd-rc-local-generator[256463]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:31:32 localhost systemd-sysv-generator[256467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:32 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:33 localhost nova_compute[225632]: 2026-02-01 09:31:33.366 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:31:33 localhost podman[256495]: 2026-02-01 09:31:33.717917412 +0000 UTC m=+0.073215900 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true)
Feb 1 04:31:33 localhost podman[256495]: 2026-02-01 09:31:33.723299088 +0000 UTC m=+0.078597646 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 1 04:31:33 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:31:34 localhost python3.9[256605]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:35 localhost python3.9[256662]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:31:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:31:35 localhost podman[256751]: 2026-02-01 09:31:35.738092846 +0000 UTC m=+0.088014488 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 04:31:35 localhost nova_compute[225632]: 2026-02-01 09:31:35.769 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:31:35 localhost podman[256751]: 2026-02-01 09:31:35.773340133 +0000 UTC m=+0.123261835 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:31:35 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:31:35 localhost python3.9[256794]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 1 04:31:36 localhost python3.9[256851]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:31:36 localhost podman[256885]: 2026-02-01 09:31:36.724973246 +0000 UTC m=+0.082347311 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 1 04:31:36 localhost podman[256885]: 2026-02-01 09:31:36.768540241 +0000 UTC m=+0.125914316 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:31:36 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:31:37 localhost python3.9[256987]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:31:37 localhost systemd[1]: Reloading.
Feb 1 04:31:37 localhost systemd-sysv-generator[257019]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:31:37 localhost systemd-rc-local-generator[257016]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:31:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15564 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0423220000000001030307)
Feb 1 04:31:37 localhost systemd[1]: Starting Create netns directory...
Feb 1 04:31:37 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 1 04:31:37 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 1 04:31:37 localhost systemd[1]: Finished Create netns directory.
Feb 1 04:31:38 localhost sshd[257048]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:31:38 localhost nova_compute[225632]: 2026-02-01 09:31:38.401 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15565 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0427380000000001030307) Feb 1 04:31:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52790 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0429B80000000001030307) Feb 1 04:31:39 localhost python3.9[257142]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:31:40 localhost python3.9[257252]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:31:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15566 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D042F380000000001030307) Feb 1 04:31:40 localhost nova_compute[225632]: 2026-02-01 09:31:40.773 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:41 localhost python3.9[257362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:31:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:31:41.691 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:31:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:31:41.691 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:31:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:31:41.692 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:31:41 localhost python3.9[257450]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938300.7393012-1092-25673385625006/.source.json _original_basename=.zp7zhvu6 follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:31:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30386 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=2252740221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0433B80000000001030307) Feb 1 04:31:42 localhost python3.9[257558]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:31:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:31:42 localhost podman[257576]: 2026-02-01 09:31:42.708040258 +0000 UTC m=+0.070181737 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 1 04:31:42 localhost podman[257576]: 2026-02-01 09:31:42.724492296 +0000 UTC m=+0.086633815 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, name=ubi9/ubi-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, version=9.7, io.openshift.tags=minimal rhel9, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 04:31:42 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:31:43 localhost nova_compute[225632]: 2026-02-01 09:31:43.432 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15567 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D043EF80000000001030307) Feb 1 04:31:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
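[editor's note] The recurring "DROPPING:" kernel entries above are netfilter LOG-target output on br-ex: the same TCP SYN from 192.168.122.10 to port 9102 (SPT=54406, SEQ=4146690396) keeps reappearing with a fresh IP ID, i.e. the peer retransmits while the firewall drops it. A minimal sketch for pulling the key=value fields out of such a line, assuming only the standard LOG format visible in these entries:

    import re

    # key=value pairs as emitted by the netfilter LOG target; bare tokens
    # such as DF and SYN are flags and carry no '='.
    KV = re.compile(r'(\w+)=(\S*)')

    def parse_drop(line):
        """Extract SRC/DST/SPT/DPT/... from a 'DROPPING:' kernel log line."""
        rest = line.split('DROPPING:', 1)[1]
        fields = dict(KV.findall(rest))
        fields['flags'] = [t for t in rest.split()
                           if t in ('DF', 'SYN', 'ACK', 'RST', 'FIN')]
        return fields

    sample = ('DROPPING: IN=br-ex OUT= SRC=192.168.122.10 DST=192.168.122.106 '
              'LEN=60 TTL=62 ID=15565 DF PROTO=TCP SPT=54406 DPT=9102 '
              'SEQ=4146690396 ACK=0 WINDOW=32640 SYN URGP=0')
    print(parse_drop(sample)['DPT'])  # -> '9102'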
Feb 1 04:31:45 localhost podman[257806]: 2026-02-01 09:31:45.711813131 +0000 UTC m=+0.067807793 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 04:31:45 localhost podman[257806]: 2026-02-01 09:31:45.745627464 +0000 UTC m=+0.101622096 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2) Feb 1 04:31:45 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:31:45 localhost nova_compute[225632]: 2026-02-01 09:31:45.777 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:46 localhost python3.9[257900]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False Feb 1 04:31:47 localhost python3.9[258010]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 1 04:31:48 localhost python3[258120]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False Feb 1 04:31:48 localhost nova_compute[225632]: 2026-02-01 09:31:48.432 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:48 localhost podman[258156]: Feb 1 04:31:48 localhost podman[258156]: 2026-02-01 09:31:48.637092282 +0000 UTC m=+0.078047889 container create d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_id=neutron_dhcp) Feb 1 04:31:48 localhost podman[258156]: 2026-02-01 
09:31:48.594116326 +0000 UTC m=+0.035071943 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:31:48 localhost python3[258120]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:31:49 localhost python3.9[258303]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:31:50 localhost python3.9[258415]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:31:50 localhost python3.9[258470]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:31:50 localhost nova_compute[225632]: 2026-02-01 09:31:50.813 225636 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:51 localhost python3.9[258579]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769938310.690904-1326-100575413493628/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:31:51 localhost python3.9[258634]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:31:51 localhost systemd[1]: Reloading. Feb 1 04:31:51 localhost systemd-rc-local-generator[258657]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:31:51 localhost systemd-sysv-generator[258662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:52 localhost python3.9[258724]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:31:52 localhost systemd[1]: Reloading. Feb 1 04:31:52 localhost systemd-rc-local-generator[258747]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:31:52 localhost systemd-sysv-generator[258751]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
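[editor's note] The ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG entry above logs the full `podman create` command the module synthesizes from the container's config_data: 'net'/'pid'/'privileged'/'user' become --network/--pid/--privileged/--user, each 'environment' key becomes --env, each 'volumes' entry becomes --volume, and the identity metadata is attached as --label. A rough sketch of that mapping, assuming a config_data dict shaped like the one logged (build_podman_create is an illustrative name, not the module's actual helper):

    def build_podman_create(name, cfg):
        """Sketch of the config_data -> podman-create flag mapping shown in
        the PODMAN-CONTAINER-DEBUG entry above."""
        argv = ['podman', 'create', '--name', name,
                '--log-driver', 'journald', '--log-level', 'info']
        for key, val in cfg.get('environment', {}).items():
            argv += ['--env', f'{key}={val}']
        if 'net' in cfg:
            argv += ['--network', cfg['net']]
        if 'pid' in cfg:
            argv += ['--pid', cfg['pid']]
        if cfg.get('privileged'):
            argv.append('--privileged=True')
        if 'user' in cfg:
            argv += ['--user', cfg['user']]
        for vol in cfg.get('volumes', []):
            argv += ['--volume', vol]
        argv.append(cfg['image'])
        return argv

The real invocation additionally pins a --conmon-pidfile and serializes the whole config_data dict into a --label, and the EDPM_CONFIG_HASH environment value ties the container to the rendered configuration it was created from.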
Feb 1 04:31:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:31:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15568 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D045FB80000000001030307) Feb 1 04:31:53 localhost systemd[1]: Starting neutron_dhcp_agent container... Feb 1 04:31:53 localhost systemd[1]: Started libcrun container. 
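[editor's note] The two "Reloading." passes above come from ansible-systemd: first a bare daemon_reload=True after the unit file was copied into /etc/systemd/system, then state=restarted enabled=True on edpm_neutron_dhcp_agent.service. The virt*d "Failed to parse service type ... notify-reload" and insights MemoryLimit= lines are parse warnings about unit directives newer than this systemd build understands; the reload itself succeeds. By hand, the same sequence would be roughly:

    import subprocess

    def enable_and_restart(unit):
        # Mirrors the ansible-systemd calls above: pick up the freshly
        # installed unit file, enable it, then (re)start it.
        subprocess.run(['systemctl', 'daemon-reload'], check=True)
        subprocess.run(['systemctl', 'enable', unit], check=True)
        subprocess.run(['systemctl', 'restart', unit], check=True)

    enable_and_restart('edpm_neutron_dhcp_agent.service')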
Feb 1 04:31:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70a4679f6edec087cccb52bcc5cf662d133043e4f4212e633d98270f2700bea6/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 1 04:31:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70a4679f6edec087cccb52bcc5cf662d133043e4f4212e633d98270f2700bea6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:31:53 localhost podman[258764]: 2026-02-01 09:31:53.323915337 +0000 UTC m=+0.127422292 container init d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:31:53 localhost podman[258764]: 2026-02-01 09:31:53.333665298 +0000 UTC m=+0.137172253 container start d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=neutron_dhcp, container_name=neutron_dhcp_agent) Feb 1 04:31:53 localhost podman[258764]: neutron_dhcp_agent Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + sudo -E kolla_set_configs Feb 1 04:31:53 localhost systemd[1]: Started neutron_dhcp_agent container. Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Validating config file Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Copying service configuration files Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Writing out command to execute Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for 
/var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: ++ cat /run_command Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + CMD=/usr/bin/neutron-dhcp-agent Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + ARGS= Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + sudo kolla_copy_cacerts Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + [[ ! -n '' ]] Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + . kolla_extend_start Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: Running command: '/usr/bin/neutron-dhcp-agent' Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + umask 0022 Feb 1 04:31:53 localhost neutron_dhcp_agent[258779]: + exec /usr/bin/neutron-dhcp-agent Feb 1 04:31:53 localhost nova_compute[225632]: 2026-02-01 09:31:53.477 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:53 localhost podman[236886]: time="2026-02-01T09:31:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:31:53 localhost podman[236886]: @ - - [01/Feb/2026:09:31:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149139 "" "Go-http-client/1.1" Feb 1 04:31:53 localhost podman[236886]: @ - - [01/Feb/2026:09:31:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16782 "" "Go-http-client/1.1" Feb 1 04:31:54 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:54.647 258783 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 1 04:31:54 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:54.648 258783 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44#033[00m Feb 1 04:31:54 localhost python3.9[258902]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 1 04:31:55 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:55.030 258783 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Feb 1 04:31:55 localhost nova_compute[225632]: 2026-02-01 09:31:55.817 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:55 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:55.900 258783 INFO neutron.agent.dhcp.agent [None req-4093fe9c-96c1-4bc1-b5df-19baadfcb39c - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:31:55 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:55.901 258783 INFO neutron.agent.dhcp.agent [-] Starting network 8bdf8183-8467-40ac-933d-a37b0bd3539a dhcp configuration#033[00m Feb 1 04:31:56 localhost nova_compute[225632]: 2026-02-01 09:31:56.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - 
- - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:56 localhost python3.9[259013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:31:56 localhost nova_compute[225632]: 2026-02-01 09:31:56.433 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:31:56.433 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:31:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:31:56.435 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:31:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:31:56.436 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:31:56 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:56.783 258783 INFO oslo.privsep.daemon [None req-3a55bd88-e522-4832-8266-fdaf0f5d454c - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpkjndtm8i/privsep.sock']#033[00m Feb 1 04:31:57 localhost python3.9[259106]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938315.8171804-1461-78944774046573/.source.yaml _original_basename=.ais8bfg5 follow=False checksum=552a83c15bca59d2cd0078e31025ce01db8bbba5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:31:57 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:57.380 258783 INFO oslo.privsep.daemon [None req-3a55bd88-e522-4832-8266-fdaf0f5d454c - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:31:57 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:57.277 259108 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:31:57 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:57.281 259108 INFO oslo.privsep.daemon [-] privsep 
process running with uid/gid: 0/0#033[00m Feb 1 04:31:57 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:57.284 259108 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 1 04:31:57 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:57.285 259108 INFO oslo.privsep.daemon [-] privsep daemon running as pid 259108#033[00m Feb 1 04:31:57 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:57.866 258783 INFO oslo.privsep.daemon [None req-3a55bd88-e522-4832-8266-fdaf0f5d454c - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmphab7n5op/privsep.sock']#033[00m Feb 1 04:31:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:31:58 localhost podman[259226]: 2026-02-01 09:31:58.228311456 +0000 UTC m=+0.079841165 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:31:58 localhost podman[259226]: 2026-02-01 09:31:58.23656434 +0000 UTC m=+0.088094099 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:31:58 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
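[editor's note] The health_status/exec_died pairs wrapped in per-container transient units (the "<id>.service: Deactivated successfully." lines) are podman's systemd-driven healthchecks: a timer starts `/usr/bin/podman healthcheck run <id>`, podman executes the container's configured test command (here '/openstack/healthcheck podman_exporter') inside the container, records the verdict, and the one-shot unit exits. A single pass can be triggered by hand; exit status 0 means healthy:

    import subprocess

    # One healthcheck pass, the same command the transient systemd units
    # above run on a timer; the id is the podman_exporter container's.
    cid = '06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d'
    rc = subprocess.run(['podman', 'healthcheck', 'run', cid]).returncode
    print('healthy' if rc == 0 else 'unhealthy (rc=%d)' % rc)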
Feb 1 04:31:58 localhost nova_compute[225632]: 2026-02-01 09:31:58.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:58 localhost python3.9[259227]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 1 04:31:58 localhost nova_compute[225632]: 2026-02-01 09:31:58.504 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:31:58 localhost systemd[1]: Stopping neutron_dhcp_agent container... Feb 1 04:31:58 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:58.531 258783 INFO oslo.privsep.daemon [None req-3a55bd88-e522-4832-8266-fdaf0f5d454c - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:31:58 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:58.399 259249 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:31:58 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:58.406 259249 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:31:58 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:58.411 259249 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 1 04:31:58 localhost neutron_dhcp_agent[258779]: 2026-02-01 09:31:58.411 259249 INFO oslo.privsep.daemon [-] privsep daemon running as pid 259249#033[00m Feb 1 04:31:58 localhost systemd[1]: libpod-d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a.scope: Deactivated successfully. Feb 1 04:31:58 localhost systemd[1]: libpod-d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a.scope: Consumed 3.611s CPU time. 
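[editor's note] The privsep lines above show why the agent start looks chatty: each privsep context spawns its own root helper via `sudo neutron-rootwrap ... privsep-helper`, and each daemon keeps only the capabilities its context declares — neutron.privileged.default runs with CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE, while neutron.privileged.namespace_cmd is confined to CAP_SYS_ADMIN. In oslo.privsep terms that looks roughly like the following sketch (context and function names here are illustrative, not neutron's actual definitions):

    from oslo_privsep import capabilities, priv_context

    # A context declares up front which capabilities its helper daemon
    # retains; decorated entrypoints execute inside that daemon rather
    # than in the agent process.
    ns_cmd = priv_context.PrivContext(
        'example',
        cfg_section='example_privsep',
        pypath=__name__ + '.ns_cmd',
        capabilities=[capabilities.CAP_SYS_ADMIN],
    )

    @ns_cmd.entrypoint
    def create_namespace(name):
        # Runs with uid/gid 0/0 and only CAP_SYS_ADMIN, matching the
        # second daemon logged above.
        ...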
Feb 1 04:31:58 localhost podman[259253]: 2026-02-01 09:31:58.62187372 +0000 UTC m=+0.093105325 container died d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, tcib_managed=true, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:31:58 localhost systemd[1]: tmp-crun.LMxlCU.mount: Deactivated successfully. 
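[editor's note] The died/cleanup events above, and all the agent's stdout lines earlier, are in this transcript because the container was created with `--log-driver journald`: podman stamps every record with a CONTAINER_NAME journal field, so one field match recovers just this container's output from the same journal, e.g.:

    import subprocess

    # The journald log driver tags each record with CONTAINER_NAME, so a
    # single field match filters out just the DHCP agent's output.
    out = subprocess.run(
        ['journalctl', 'CONTAINER_NAME=neutron_dhcp_agent',
         '--no-pager', '-n', '50'],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out)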
Feb 1 04:31:58 localhost podman[259253]: 2026-02-01 09:31:58.677475125 +0000 UTC m=+0.148706710 container cleanup d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=neutron_dhcp, org.label-schema.schema-version=1.0) Feb 1 04:31:58 localhost podman[259253]: neutron_dhcp_agent Feb 1 04:31:58 localhost podman[259298]: error opening file `/run/crun/d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a/status`: No such file or directory Feb 1 04:31:58 localhost podman[259285]: 2026-02-01 09:31:58.788056317 +0000 UTC m=+0.073272832 container cleanup d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, 
container_name=neutron_dhcp_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:31:58 localhost podman[259285]: neutron_dhcp_agent Feb 1 04:31:58 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Feb 1 04:31:58 localhost systemd[1]: Stopped neutron_dhcp_agent container. Feb 1 04:31:58 localhost systemd[1]: Starting neutron_dhcp_agent container... Feb 1 04:31:58 localhost systemd[1]: Started libcrun container. Feb 1 04:31:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70a4679f6edec087cccb52bcc5cf662d133043e4f4212e633d98270f2700bea6/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 1 04:31:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70a4679f6edec087cccb52bcc5cf662d133043e4f4212e633d98270f2700bea6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:31:58 localhost podman[259300]: 2026-02-01 09:31:58.94014403 +0000 UTC m=+0.112212354 container init d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 1 04:31:58 localhost podman[259300]: 2026-02-01 09:31:58.949400016 +0000 UTC m=+0.121468340 container start d85d5627603405cf6c96552516925b9ac51a60e4b8707a4f5b05ff91f17e223a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-ce38c384efbc4d4804ea8e547466a9df29c1ae9ac947022806f2ce78e3dee1d4'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.vendor=CentOS) Feb 1 04:31:58 localhost podman[259300]: neutron_dhcp_agent Feb 1 04:31:58 localhost neutron_dhcp_agent[259316]: + sudo -E kolla_set_configs Feb 1 04:31:58 localhost systemd[1]: Started neutron_dhcp_agent container. Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Validating config file Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Copying service configuration files Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Writing out command to execute Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for 
/var/lib/neutron/kill_scripts/haproxy-kill Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp/8bdf8183-8467-40ac-933d-a37b0bd3539a Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: ++ cat /run_command Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + CMD=/usr/bin/neutron-dhcp-agent Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + ARGS= Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + sudo kolla_copy_cacerts Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + [[ ! -n '' ]] Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + . kolla_extend_start Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: Running command: '/usr/bin/neutron-dhcp-agent' Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + umask 0022 Feb 1 04:31:59 localhost neutron_dhcp_agent[259316]: + exec /usr/bin/neutron-dhcp-agent Feb 1 04:31:59 localhost nova_compute[225632]: 2026-02-01 09:31:59.403 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:59 localhost nova_compute[225632]: 2026-02-01 09:31:59.432 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:59 localhost nova_compute[225632]: 2026-02-01 09:31:59.433 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:59 localhost nova_compute[225632]: 2026-02-01 09:31:59.433 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:31:59 localhost nova_compute[225632]: 2026-02-01 09:31:59.434 225636 DEBUG nova.compute.manager [None 
req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:31:59 localhost systemd[1]: session-58.scope: Deactivated successfully. Feb 1 04:31:59 localhost systemd[1]: session-58.scope: Consumed 34.733s CPU time. Feb 1 04:31:59 localhost systemd-logind[759]: Session 58 logged out. Waiting for processes to exit. Feb 1 04:31:59 localhost systemd-logind[759]: Removed session 58. Feb 1 04:32:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:00.212 259320 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 1 04:32:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:00.213 259320 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.408 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.434 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.435 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.435 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.436 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.436 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:32:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:00.598 259320 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 
09:32:00.863 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.905 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.975 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:32:00 localhost nova_compute[225632]: 2026-02-01 09:32:00.976 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.193 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.195 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12202MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", 
"address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.195 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.196 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.260 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.260 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.261 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.304 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:32:01 localhost openstack_network_exporter[239441]: ERROR 09:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:32:01 localhost openstack_network_exporter[239441]: Feb 1 04:32:01 localhost openstack_network_exporter[239441]: ERROR 09:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:32:01 localhost openstack_network_exporter[239441]: Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.753 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.759 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] 
Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.879 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.882 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:32:01 localhost nova_compute[225632]: 2026-02-01 09:32:01.882 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.023 259320 INFO neutron.agent.dhcp.agent [None req-4c595a5e-7a75-4735-a4c4-4199652bf3fb - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.024 259320 INFO neutron.agent.dhcp.agent [-] Starting network 8bdf8183-8467-40ac-933d-a37b0bd3539a dhcp configuration#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.078 259320 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpsk1f24v1/privsep.sock']#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.704 259320 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.608 259397 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.613 259397 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.616 259397 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 1 04:32:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:02.617 259397 INFO oslo.privsep.daemon [-] privsep daemon running as pid 259397#033[00m Feb 1 04:32:02 localhost nova_compute[225632]: 2026-02-01 09:32:02.882 225636 DEBUG 
oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:02 localhost nova_compute[225632]: 2026-02-01 09:32:02.883 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:02 localhost nova_compute[225632]: 2026-02-01 09:32:02.884 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:32:02 localhost nova_compute[225632]: 2026-02-01 09:32:02.884 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:32:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:03.233 259320 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpjek8u4x2/privsep.sock']#033[00m Feb 1 04:32:03 localhost nova_compute[225632]: 2026-02-01 09:32:03.505 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:03 localhost nova_compute[225632]: 2026-02-01 09:32:03.558 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:32:03 localhost nova_compute[225632]: 2026-02-01 09:32:03.558 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:32:03 localhost nova_compute[225632]: 2026-02-01 09:32:03.558 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:32:03 localhost nova_compute[225632]: 2026-02-01 09:32:03.559 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:32:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:03.763 259320 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:32:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:03.677 259406 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:32:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:03.680 259406 INFO oslo.privsep.daemon [-] privsep 
process running with uid/gid: 0/0#033[00m Feb 1 04:32:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:03.681 259406 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 1 04:32:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:03.682 259406 INFO oslo.privsep.daemon [-] privsep daemon running as pid 259406#033[00m Feb 1 04:32:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:32:04 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:04.646 259320 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpehuo__2w/privsep.sock']#033[00m Feb 1 04:32:04 localhost nova_compute[225632]: 2026-02-01 09:32:04.674 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:32:04 localhost nova_compute[225632]: 2026-02-01 09:32:04.714 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:32:04 localhost nova_compute[225632]: 2026-02-01 09:32:04.715 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:32:04 localhost podman[259414]: 2026-02-01 09:32:04.719129594 +0000 UTC m=+0.074044835 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:32:04 localhost podman[259414]: 2026-02-01 09:32:04.723415557 +0000 UTC m=+0.078330788 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 1 04:32:04 localhost systemd[1]: 
728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:32:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:05.273 259320 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:32:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:05.184 259437 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:32:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:05.190 259437 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:32:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:05.193 259437 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Feb 1 04:32:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:05.193 259437 INFO oslo.privsep.daemon [-] privsep daemon running as pid 259437#033[00m Feb 1 04:32:05 localhost nova_compute[225632]: 2026-02-01 09:32:05.901 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:06.577 259320 INFO neutron.agent.linux.ip_lib [-] Device tap8591dd3a-60 cannot be used as it has no MAC address#033[00m Feb 1 04:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:32:06 localhost nova_compute[225632]: 2026-02-01 09:32:06.631 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost kernel: device tap8591dd3a-60 entered promiscuous mode Feb 1 04:32:06 localhost NetworkManager[5964]: <info>  [1769938326.6379] manager: (tap8591dd3a-60): new Generic device (/org/freedesktop/NetworkManager/Devices/15) Feb 1 04:32:06 localhost nova_compute[225632]: 2026-02-01 09:32:06.639 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost ovn_controller[152492]: 2026-02-01T09:32:06Z|00046|binding|INFO|Claiming lport 8591dd3a-606b-4363-9bf6-7002e09791c4 for this chassis. Feb 1 04:32:06 localhost ovn_controller[152492]: 2026-02-01T09:32:06Z|00047|binding|INFO|8591dd3a-606b-4363-9bf6-7002e09791c4: Claiming unknown Feb 1 04:32:06 localhost systemd-udevd[259458]: Network interface NamePolicy= disabled on kernel command line.
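[editor's note: the oslo.privsep lines above (helper spawned via sudo + neutron-rootwrap, daemon running as uid/gid 0/0 with a bounded capability set such as CAP_NET_ADMIN|CAP_SYS_ADMIN) are the standard one-daemon-per-privilege-context bootstrap. Below is a minimal sketch of how such a context is declared, with illustrative names; Neutron's real contexts are the ones named in the --privsep_context arguments (neutron.privileged.default, .namespace_cmd, .link_cmd).]

    from oslo_privsep import capabilities, priv_context

    # Illustrative context; the capability list and the entrypoint
    # decorator are the essential pattern. The spawned daemon keeps only
    # these capabilities (the eff/prm/inh line above) and drops the rest.
    link_cmd = priv_context.PrivContext(
        __name__,
        cfg_section='privsep',
        pypath=__name__ + '.link_cmd',
        capabilities=[capabilities.CAP_NET_ADMIN,
                      capabilities.CAP_SYS_ADMIN],
    )

    @link_cmd.entrypoint
    def set_link_up(ifname):
        # Executes inside the privsep daemon (e.g. pid 259437 above),
        # not in the unprivileged agent process that called it.
        pass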
Feb 1 04:32:06 localhost ovn_controller[152492]: 2026-02-01T09:32:06Z|00048|binding|INFO|Setting lport 8591dd3a-606b-4363-9bf6-7002e09791c4 ovn-installed in OVS Feb 1 04:32:06 localhost ovn_controller[152492]: 2026-02-01T09:32:06Z|00049|binding|INFO|Setting lport 8591dd3a-606b-4363-9bf6-7002e09791c4 up in Southbound Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.651 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.3/24', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79df39cba1c14309b68e8b61518619fd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5329260b-b0db-417b-bda6-9045427ce15d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=8591dd3a-606b-4363-9bf6-7002e09791c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:32:06 localhost nova_compute[225632]: 2026-02-01 09:32:06.651 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.656 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 8591dd3a-606b-4363-9bf6-7002e09791c4 in datapath 8bdf8183-8467-40ac-933d-a37b0bd3539a bound to our chassis#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.658 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6aaab6a9-3538-4fc9-b08e-a42b74cabd90 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.658 158365 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8bdf8183-8467-40ac-933d-a37b0bd3539a#033[00m Feb 1 04:32:06 localhost nova_compute[225632]: 2026-02-01 09:32:06.673 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.673 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[8077b33b-6830-419c-9aad-3384b8239b78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.701 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[e799941c-6fc8-4283-a14c-25f5655db97c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.707 
158695 DEBUG oslo.privsep.daemon [-] privsep: reply[8faf1b05-2337-4723-bcb0-40b5c40a427d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:06 localhost podman[259451]: 2026-02-01 09:32:06.710333064 +0000 UTC m=+0.071610241 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:32:06 localhost nova_compute[225632]: 2026-02-01 09:32:06.726 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.728 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[aab550f2-ed9f-4d88-938a-4f2e25ab8bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.741 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d918baf5-d0d0-40fc-9f89-da0478adcfe1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8bdf8183-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:29:7a:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 105, 'tx_packets': 73, 'rx_bytes': 9016, 'tx_bytes': 7516, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 
'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 105, 'tx_packets': 73, 'rx_bytes': 9016, 'tx_bytes': 7516, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635606, 'reachable_time': 26469, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 20, 'outoctets': 1412, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 20, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 1412, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 20, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259491, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:06 localhost podman[259451]: 2026-02-01 09:32:06.749312417 +0000 UTC m=+0.110589574 container 
exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.757 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9ff03f-0f10-450c-b569-f225ed054d22]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8bdf8183-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635617, 'tstamp': 635617}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259493, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap8bdf8183-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 635619, 'tstamp': 635619}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259493, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:06 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
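[editor's note: the RTM_NEWLINK/RTM_NEWADDR payloads above (the long 'attrs' lists with IFLA_IFNAME, IFLA_ADDRESS, IFA_ADDRESS, and header target 'ovnmeta-8bdf8183-...') are pyroute2 netlink messages the privsep daemon fetched inside the metadata namespace. A sketch of the same query, assuming that namespace exists on the host:]

    from pyroute2 import NetNS

    # Namespace name taken from the 'target' field in the dumps above.
    with NetNS('ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a') as ns:
        for link in ns.get_links():
            # get_attr() reads one entry of the key/value 'attrs' list,
            # e.g. IFLA_IFNAME -> 'tap8bdf8183-81'.
            print(link.get_attr('IFLA_IFNAME'), link.get_attr('IFLA_ADDRESS'))
        for addr in ns.get_addr():
            # The two RTM_NEWADDR records above: 169.254.169.254/32 and
            # 192.168.0.2/24 on tap8bdf8183-81.
            print(addr.get_attr('IFA_ADDRESS'), addr['prefixlen'])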
Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.760 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bdf8183-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:32:06 localhost nova_compute[225632]: 2026-02-01 09:32:06.761 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.762 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bdf8183-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.763 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.763 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8bdf8183-80, col_values=(('external_ids', {'iface-id': 'a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:32:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:06.763 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:32:07 localhost systemd[1]: tmp-crun.zaqs2C.mount: Deactivated successfully. 
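[editor's note: the DelPortCommand/AddPortCommand/DbSetCommand triple above is one ovsdbapp IDL transaction: remove the tap from br-ex if present, ensure it is on br-int, and set external_ids:iface-id so ovn-controller can bind it; "Transaction caused no change" means the OVS database already matched. A sketch of the same sequence; the socket path is the usual local default and an assumption here:]

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the local ovsdb-server and build the vsctl-style API.
    idl = connection.OvsdbIdl.from_server(
        'unix:/run/openvswitch/db.sock', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        txn.add(api.del_port('tap8bdf8183-80', bridge='br-ex', if_exists=True))
        txn.add(api.add_port('br-int', 'tap8bdf8183-80', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tap8bdf8183-80',
            ('external_ids',
             {'iface-id': 'a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5'})))
    # Commands that already match DB state commit as no-ops, which is
    # what the "Transaction caused no change" DEBUG lines report.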
Feb 1 04:32:07 localhost podman[259518]: 2026-02-01 09:32:07.344159891 +0000 UTC m=+0.082675511 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Feb 1 04:32:07 localhost podman[259518]: 2026-02-01 09:32:07.379271905 +0000 UTC m=+0.117787465 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:32:07 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
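[editor's note: the "Started /usr/bin/podman healthcheck run <id>" units and the paired health_status=healthy / exec_died events above are timer-driven; each run executes the container's configured test command ('/openstack/healthcheck' per config_data) and records the exit status. A sketch of the same probe by hand:]

    import subprocess

    # Equivalent of the transient systemd unit above; exit code 0 is
    # what podman records as health_status=healthy.
    cid = 'b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb'
    rc = subprocess.call(['podman', 'healthcheck', 'run', cid])
    print('healthy' if rc == 0 else 'unhealthy')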
Feb 1 04:32:07 localhost podman[259564]: Feb 1 04:32:07 localhost podman[259564]: 2026-02-01 09:32:07.596454176 +0000 UTC m=+0.095344043 container create 10bd14a3a74b4dbdba82a70dd4a0f001344a0ed6624450d870564cfbccb02e2b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8bdf8183-8467-40ac-933d-a37b0bd3539a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:32:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7701 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0498520000000001030307) Feb 1 04:32:07 localhost systemd[1]: Started libpod-conmon-10bd14a3a74b4dbdba82a70dd4a0f001344a0ed6624450d870564cfbccb02e2b.scope. Feb 1 04:32:07 localhost podman[259564]: 2026-02-01 09:32:07.548332641 +0000 UTC m=+0.047222548 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:32:07 localhost systemd[1]: Started libcrun container. Feb 1 04:32:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bf15e37c53f2f66ee4fb40c0253f91973ded696e1839ab3f72e5e6f294f1400/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:32:07 localhost podman[259564]: 2026-02-01 09:32:07.66983568 +0000 UTC m=+0.168725547 container init 10bd14a3a74b4dbdba82a70dd4a0f001344a0ed6624450d870564cfbccb02e2b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8bdf8183-8467-40ac-933d-a37b0bd3539a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:32:07 localhost podman[259564]: 2026-02-01 09:32:07.678480738 +0000 UTC m=+0.177370605 container start 10bd14a3a74b4dbdba82a70dd4a0f001344a0ed6624450d870564cfbccb02e2b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8bdf8183-8467-40ac-933d-a37b0bd3539a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 1 04:32:07 localhost dnsmasq[259582]: started, version 2.85 cachesize 150 Feb 1 04:32:07 localhost dnsmasq[259582]: DNS service limited to local subnets Feb 1 04:32:07 localhost dnsmasq[259582]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:32:07 localhost dnsmasq[259582]: warning: no upstream servers configured Feb 1 04:32:07 localhost dnsmasq-dhcp[259582]: DHCP, static leases only on 192.168.0.0, lease time 1d Feb 1 04:32:07 localhost 
dnsmasq[259582]: read /var/lib/neutron/dhcp/8bdf8183-8467-40ac-933d-a37b0bd3539a/addn_hosts - 2 addresses Feb 1 04:32:07 localhost dnsmasq-dhcp[259582]: read /var/lib/neutron/dhcp/8bdf8183-8467-40ac-933d-a37b0bd3539a/host Feb 1 04:32:07 localhost dnsmasq-dhcp[259582]: read /var/lib/neutron/dhcp/8bdf8183-8467-40ac-933d-a37b0bd3539a/opts Feb 1 04:32:07 localhost podman[259598]: 2026-02-01 09:32:07.905299745 +0000 UTC m=+0.061156367 container kill 0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 1 04:32:07 localhost systemd[1]: tmp-crun.OPzlHR.mount: Deactivated successfully. Feb 1 04:32:07 localhost systemd[1]: libpod-0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94.scope: Deactivated successfully. 
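[editor's note: the dnsmasq "read ..." lines above cover the three per-network files the DHCP agent maintains under /var/lib/neutron/dhcp/<network-id>/: addn_hosts (extra DNS records, the "2 addresses"), host (static DHCP leases, matching the "static leases only" mode) and opts (per-port DHCP options). A sketch that dumps them for this network:]

    from pathlib import Path

    # Network UUID from the dnsmasq lines above.
    net_dir = Path('/var/lib/neutron/dhcp/8bdf8183-8467-40ac-933d-a37b0bd3539a')
    for name in ('addn_hosts', 'host', 'opts'):
        path = net_dir / name
        print(f'--- {path} ---')
        print(path.read_text())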
Feb 1 04:32:07 localhost podman[259612]: 2026-02-01 09:32:07.986101209 +0000 UTC m=+0.056574727 container died 0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64) Feb 1 04:32:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94-userdata-shm.mount: Deactivated successfully. 
Feb 1 04:32:08 localhost podman[259612]: 2026-02-01 09:32:08.134016093 +0000 UTC m=+0.204489581 container remove 0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git) Feb 1 04:32:08 localhost systemd[1]: libpod-conmon-0620c672c1ac314e5f2fbd4f055ea66b15fc185b1c01b8a8380a3fa584600c94.scope: Deactivated successfully. 
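[editor's note: the kill, died, remove chain above walks one container ID (0620c672c1ac...) through its teardown as the agent replaces the per-network metadata haproxy. Sketched below as the equivalent explicit calls; that the agent's wrapper reduces to exactly these two commands is my assumption, not something the log states:]

    import subprocess

    # Container name from the kill/died/remove events above.
    name = 'neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a'
    subprocess.check_call(['podman', 'kill', name])  # -> 'container kill' / 'died'
    subprocess.check_call(['podman', 'rm', name])    # -> 'container remove'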
Feb 1 04:32:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:08.144 259320 INFO neutron.agent.dhcp.agent [None req-1a24bebe-975f-4576-a13e-49106c777c0b - - - - - -] Finished network 8bdf8183-8467-40ac-933d-a37b0bd3539a dhcp configuration#033[00m Feb 1 04:32:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:08.145 259320 INFO neutron.agent.dhcp.agent [None req-4c595a5e-7a75-4735-a4c4-4199652bf3fb - - - - - -] Synchronizing state complete#033[00m Feb 1 04:32:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:08.211 259320 INFO neutron.agent.dhcp.agent [None req-4c595a5e-7a75-4735-a4c4-4199652bf3fb - - - - - -] DHCP agent started#033[00m Feb 1 04:32:08 localhost nova_compute[225632]: 2026-02-01 09:32:08.540 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:32:08.539 259320 INFO neutron.agent.dhcp.agent [None req-ad0bbb19-7271-431d-8379-0aa65a7ae40e - - - - - -] DHCP configuration for ports {'8591dd3a-606b-4363-9bf6-7002e09791c4', 'a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5', '390ff798-310c-462d-88e6-ddb169b10079', '09cac1be-46e2-4a31-8306-e6f4f0401b19'} is completed#033[00m Feb 1 04:32:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7702 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D049C780000000001030307) Feb 1 04:32:08 localhost systemd[1]: var-lib-containers-storage-overlay-3f4519e1d60938f46fa5b10f3261da193174045579639569b6894dc5164c3378-merged.mount: Deactivated successfully. Feb 1 04:32:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15569 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D049FB80000000001030307) Feb 1 04:32:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7703 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D04A4780000000001030307) Feb 1 04:32:10 localhost nova_compute[225632]: 2026-02-01 09:32:10.951 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52791 DF PROTO=TCP SPT=59676 DPT=9102 SEQ=330608902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D04A7B80000000001030307) Feb 1 04:32:13 localhost nova_compute[225632]: 2026-02-01 09:32:13.575 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
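[editor's note: the kernel "DROPPING:" records interleaved above are netfilter LOG output (a drop rule logging with prefix "DROPPING: "); every one is a SYN from 192.168.122.10 to port 9102/tcp on br-ex, i.e. a blocked metrics scrape retrying with the same sequence number. A quick parser for the key=value fields, shortened to the ones used here:]

    # One record from the kernel lines above, shortened.
    line = ('DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a '
            'SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TTL=62 '
            'PROTO=TCP SPT=36000 DPT=9102 SYN')

    # Tokens without '=' (the prefix, flags like SYN) are skipped.
    fields = dict(tok.split('=', 1) for tok in line.split() if '=' in tok)
    print(fields['SRC'], '->', fields['DST'], fields['PROTO'], fields['DPT'])
    # 192.168.122.10 -> 192.168.122.106 TCP 9102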
Feb 1 04:32:13 localhost podman[259641]: 2026-02-01 09:32:13.715169942 +0000 UTC m=+0.074967383 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, release=1769056855, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, vcs-type=git, version=9.7, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:32:13 localhost podman[259641]: 2026-02-01 09:32:13.731444905 +0000 UTC m=+0.091242346 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, release=1769056855, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7) Feb 1 04:32:13 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:32:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7704 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D04B4380000000001030307) Feb 1 04:32:15 localhost nova_compute[225632]: 2026-02-01 09:32:15.990 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:32:16 localhost podman[259661]: 2026-02-01 09:32:16.719525993 +0000 UTC m=+0.083685413 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 1 04:32:16 localhost podman[259661]: 2026-02-01 09:32:16.73691124 +0000 UTC m=+0.101070690 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127) Feb 1 04:32:16 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:32:18 localhost nova_compute[225632]: 2026-02-01 09:32:18.610 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:21 localhost nova_compute[225632]: 2026-02-01 09:32:21.029 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7705 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D04D3B80000000001030307) Feb 1 04:32:23 localhost nova_compute[225632]: 2026-02-01 09:32:23.636 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:23 localhost podman[236886]: time="2026-02-01T09:32:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:32:23 localhost podman[236886]: @ - - [01/Feb/2026:09:32:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148627 "" "Go-http-client/1.1" Feb 1 04:32:24 localhost podman[236886]: @ - - [01/Feb/2026:09:32:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1" Feb 1 04:32:26 localhost nova_compute[225632]: 2026-02-01 09:32:26.068 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
Feb 1 04:32:28 localhost nova_compute[225632]: 2026-02-01 09:32:28.677 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:28 localhost podman[259767]: 2026-02-01 09:32:28.710845043 +0000 UTC m=+0.070678512 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:32:28 localhost podman[259767]: 2026-02-01 09:32:28.716776295 +0000 UTC m=+0.076609854 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:32:28 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:32:31 localhost nova_compute[225632]: 2026-02-01 09:32:31.114 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:31 localhost openstack_network_exporter[239441]: ERROR 09:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:32:31 localhost openstack_network_exporter[239441]: Feb 1 04:32:31 localhost openstack_network_exporter[239441]: ERROR 09:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:32:31 localhost openstack_network_exporter[239441]: Feb 1 04:32:33 localhost nova_compute[225632]: 2026-02-01 09:32:33.701 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:32:35 localhost systemd[1]: tmp-crun.xSk1Dy.mount: Deactivated successfully. Feb 1 04:32:35 localhost podman[259790]: 2026-02-01 09:32:35.727963268 +0000 UTC m=+0.087881792 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:32:35 localhost podman[259790]: 2026-02-01 09:32:35.764514086 +0000 UTC m=+0.124432660 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Feb 1 04:32:35 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:32:36 localhost nova_compute[225632]: 2026-02-01 09:32:36.165 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:36 localhost ovn_controller[152492]: 2026-02-01T09:32:36Z|00050|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory Feb 1 04:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:32:37 localhost podman[259809]: 2026-02-01 09:32:37.353753554 +0000 UTC m=+0.080591908 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:32:37 localhost podman[259809]: 2026-02-01 09:32:37.364297828 +0000 UTC m=+0.091136222 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:32:37 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:32:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33550 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D050D820000000001030307) Feb 1 04:32:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:32:37 localhost podman[259832]: 2026-02-01 09:32:37.727367861 +0000 UTC m=+0.078202304 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:32:37 localhost podman[259832]: 2026-02-01 09:32:37.790593592 +0000 UTC m=+0.141428045 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 1 04:32:37 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:32:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33551 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0511780000000001030307) Feb 1 04:32:38 localhost nova_compute[225632]: 2026-02-01 09:32:38.735 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7706 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0513B80000000001030307) Feb 1 04:32:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33552 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0519780000000001030307) Feb 1 04:32:41 localhost nova_compute[225632]: 2026-02-01 09:32:41.204 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.692 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.693 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.694 158365 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.694 158365 ERROR 
neutron.agent.linux.external_process [-] metadata-proxy for metadata with uuid 8bdf8183-8467-40ac-933d-a37b0bd3539a not found. The process should not have died#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.694 158365 WARNING neutron.agent.linux.external_process [-] Respawning metadata-proxy for uuid 8bdf8183-8467-40ac-933d-a37b0bd3539a#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.694 158365 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.699 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[967f3b35-e1da-49db-9348-2c2abba3e43a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:41.701 158365 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: global Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: log /dev/log local0 debug Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: log-tag haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: user root Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: group root Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: maxconn 1024 Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: pidfile /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: daemon Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: defaults Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: log global Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: mode http Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: option httplog Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: option dontlognull Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: option http-server-close Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: option forwardfor Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: retries 3 Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: timeout http-request 30s Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: timeout connect 30s Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: timeout client 32s Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: timeout server 32s Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: timeout http-keep-alive 30s Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: listen listener Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: bind 169.254.169.254:80 Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: server metadata /var/lib/neutron/metadata_proxy Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: http-request add-header X-OVN-Network-ID 8bdf8183-8467-40ac-933d-a37b0bd3539a Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Feb 1 04:32:41 localhost ovn_metadata_agent[158360]: 2026-02-01 
09:32:41.702 158365 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'env', 'PROCESS_TAG=haproxy-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Feb 1 04:32:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15570 DF PROTO=TCP SPT=54406 DPT=9102 SEQ=4146690396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D051DB80000000001030307) Feb 1 04:32:42 localhost podman[259882]: Feb 1 04:32:42 localhost podman[259882]: 2026-02-01 09:32:42.083426532 +0000 UTC m=+0.067984729 container create d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:32:42 localhost systemd[1]: Started libpod-conmon-d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16.scope. Feb 1 04:32:42 localhost systemd[1]: tmp-crun.vUjXB2.mount: Deactivated successfully. Feb 1 04:32:42 localhost systemd[1]: Started libcrun container. 
Feb 1 04:32:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33b86199362622eac4e2ad21e2d60cd14ef6b064ed2d582ad18255c0f15817dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:32:42 localhost podman[259882]: 2026-02-01 09:32:42.042963453 +0000 UTC m=+0.027521680 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 1 04:32:42 localhost podman[259882]: 2026-02-01 09:32:42.151223833 +0000 UTC m=+0.135782050 container init d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 1 04:32:42 localhost podman[259882]: 2026-02-01 09:32:42.161116119 +0000 UTC m=+0.145674346 container start d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:32:42 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [NOTICE] (259900) : New worker (259902) forked Feb 1 04:32:42 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [NOTICE] (259900) : Loading success. Feb 1 04:32:42 localhost ovn_metadata_agent[158360]: 2026-02-01 09:32:42.217 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:32:43 localhost nova_compute[225632]: 2026-02-01 09:32:43.774 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:32:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33553 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0529380000000001030307) Feb 1 04:32:44 localhost podman[259911]: 2026-02-01 09:32:44.737863906 +0000 UTC m=+0.088361207 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:32:44 localhost podman[259911]: 2026-02-01 09:32:44.777608593 +0000 UTC m=+0.128105894 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:32:44 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:32:46 localhost nova_compute[225632]: 2026-02-01 09:32:46.242 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:32:47 localhost podman[259931]: 2026-02-01 09:32:47.725803811 +0000 UTC m=+0.077017557 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:32:47 localhost podman[259931]: 2026-02-01 09:32:47.765489905 +0000 UTC m=+0.116703621 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 
'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible) Feb 1 04:32:47 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:32:48 localhost sshd[259951]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:32:48 localhost systemd-logind[759]: New session 59 of user zuul. Feb 1 04:32:48 localhost systemd[1]: Started Session 59 of User zuul. Feb 1 04:32:48 localhost nova_compute[225632]: 2026-02-01 09:32:48.805 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:49 localhost python3.9[260062]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 1 04:32:51 localhost nova_compute[225632]: 2026-02-01 09:32:51.247 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:51 localhost python3.9[260174]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:32:51 localhost network[260191]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:32:51 localhost network[260192]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:32:51 localhost network[260193]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:32:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33554 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0549B80000000001030307) Feb 1 04:32:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 1 04:32:53 localhost nova_compute[225632]: 2026-02-01 09:32:53.807 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:53 localhost podman[236886]: time="2026-02-01T09:32:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:32:53 localhost podman[236886]: @ - - [01/Feb/2026:09:32:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:32:54 localhost podman[236886]: @ - - [01/Feb/2026:09:32:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17265 "" "Go-http-client/1.1" Feb 1 04:32:56 localhost nova_compute[225632]: 2026-02-01 09:32:56.251 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:58 localhost nova_compute[225632]: 2026-02-01 09:32:58.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:58 localhost python3.9[260425]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 1 04:32:58 localhost nova_compute[225632]: 2026-02-01 09:32:58.838 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:32:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
Feb 1 04:32:59 localhost podman[260489]: 2026-02-01 09:32:59.349462975 +0000 UTC m=+0.088987787 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:32:59 localhost podman[260489]: 2026-02-01 09:32:59.359034701 +0000 UTC m=+0.098559513 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:32:59 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:32:59 localhost nova_compute[225632]: 2026-02-01 09:32:59.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:59 localhost nova_compute[225632]: 2026-02-01 09:32:59.408 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:32:59 localhost nova_compute[225632]: 2026-02-01 09:32:59.408 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:32:59 localhost python3.9[260488]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:33:00 localhost nova_compute[225632]: 2026-02-01 09:33:00.409 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.291 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.434 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.435 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.435 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.436 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.436 225636 DEBUG oslo_concurrency.processutils [None 
req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:33:01 localhost openstack_network_exporter[239441]: ERROR 09:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:33:01 localhost openstack_network_exporter[239441]: Feb 1 04:33:01 localhost openstack_network_exporter[239441]: ERROR 09:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:33:01 localhost openstack_network_exporter[239441]: Feb 1 04:33:01 localhost nova_compute[225632]: 2026-02-01 09:33:01.906 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.092 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.093 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.325 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. 
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.327 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11800MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.328 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.328 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.455 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.456 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.456 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.512 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.939 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.945 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.969 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.971 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:33:02 localhost nova_compute[225632]: 2026-02-01 09:33:02.971 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.520 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
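The inventory dict in the set_inventory_for_provider record above is what placement schedules against: for each resource class the schedulable capacity is (total - reserved) * allocation_ratio, which is why 8 physical vCPUs with a 16.0 ratio can back 128 allocated vCPUs. A quick check of the figures reported for this node:

    inventory = {  # values copied from the report.py:940 record above
        "VCPU":      {"total": 8,     "reserved": 0,   "allocation_ratio": 16.0},
        "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 41,    "reserved": 1,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        # Placement's capacity check: usage + requested <= (total - reserved) * ratio
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, int(capacity))  # VCPU 128, MEMORY_MB 15226, DISK_GB 40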
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.520 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.524 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92909972-b236-4a05-bee5-6918b362477c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.521164', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01bf81f2-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': 'cc2ad568287a2db66f6347373ae9606498a4388ab5595a72ac8e1046b0d96e7c'}]}, 'timestamp': '2026-02-01 09:33:03.525041', '_unique_id': '785200cecbb046b59309af5c95ee768c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
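Every failed notification in this capture bottoms out in the same two-level chain shown above: a raw socket ConnectionRefusedError that kombu re-raises as kombu.exceptions.OperationalError. A standalone probe of the broker; the URL is a placeholder (take the real one from transport_url in ceilometer.conf):

    import kombu

    conn = kombu.Connection("amqp://guest:guest@controller:5672//")  # placeholder URL
    try:
        # ensure_connection() is the same call visible in the impl_rabbit frame of
        # the traceback above; it re-raises socket errors as OperationalError.
        conn.ensure_connection(max_retries=1)
        print("broker reachable")
    except kombu.exceptions.OperationalError as exc:
        print("broker unreachable:", exc)  # e.g. [Errno 111] Connection refused
    finally:
        conn.release()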
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.528 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.529 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b58386f-1639-48d7-a539-42b2238073a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.528029', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01c00e60-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '8982b7298aebbe6096acc893b6ea866b4df480b507d5888abcdd4e6a2b4a6241'}]}, 'timestamp': '2026-02-01 09:33:03.528563', '_unique_id': 'ca268eae119545c8a074ac51d90ee419'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.574 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 572 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.575 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
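The openstack_network_exporter errors near the top of this capture ("please specify an existing datapath" from dpif-netdev/pmd-rxq-show and dpif-netdev/pmd-perf-show) are what ovs-appctl returns when no userspace (netdev/DPDK) datapath exists, as on a kernel-datapath compute node. A sketch reproducing the same query by hand; the command comes straight from the log, the interpretation is an assumption:

    import subprocess

    def pmd_rxq_show():
        # Same OVS appctl target the exporter calls; fails with
        # "please specify an existing datapath" when there are no PMD threads.
        try:
            return subprocess.check_output(
                ["ovs-appctl", "dpif-netdev/pmd-rxq-show"],
                stderr=subprocess.STDOUT, text=True)
        except subprocess.CalledProcessError as exc:
            return exc.output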
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33a5500d-7089-4c8f-9f45-65860f986cd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 572, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.530920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01c71f8e-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '0030db9ad6ad89efba11e557830eaae0c253484b668f2164c63208b368b562c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.530920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01c756ca-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '9e5557180729ba8a52eb9eef9081551beaad1284ab823c89e3ca4fa891fb7ad2'}]}, 'timestamp': '2026-02-01 09:33:03.576334', '_unique_id': '5b213e59196643479c47f618f7ec342d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.579 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.594 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.595 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
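The disk.device.* volumes in these samples (write requests 572 and 1, usage 1073741824 bytes per device) come from libvirt's per-device counters for the instance-00000002 domain. A minimal sketch with the libvirt Python bindings, assuming read access to qemu:///system on this host:

    import libvirt

    conn = libvirt.open("qemu:///system")
    try:
        dom = conn.lookupByName("instance-00000002")  # domain name from the samples
        for dev in ("vda", "vdb"):
            rd_req, rd_bytes, wr_req, wr_bytes, errs = dom.blockStats(dev)
            print(dev, "write requests:", wr_req)  # cf. volumes 572 and 1 above
    finally:
        conn.close()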
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c74b6cd-ef52-4b1a-87ec-e48a0df9c0fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.579344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01ca492a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.798843573, 'message_signature': '58cdca5c362286eed0bc241eb8a14cac75694e78f3fca6eb5a65d8129af3b933'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.579344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01ca6018-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.798843573, 'message_signature': '30470ae2ea8e93a2fdf69541689241494dc2239a3af018ac025593b41e382fa3'}]}, 'timestamp': '2026-02-01 09:33:03.596286', '_unique_id': '3b59cf4fcfc14d4e843e9b75d16426e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.600 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Payload={'message_id': 'c06a0cb3-1e01-473b-a851-2a67af907548', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.599406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01caf578-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.798843573, 'message_signature': '05fd4f8dce15c30c67d93c863b5bc7990d3e639ba8cac50e164631da905ea19d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.599406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01cb14f4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.798843573, 'message_signature': '2243574bc5c43353a3ad10b3e145653eeb85c7458d6793604b8608e4085d6337'}]}, 'timestamp': '2026-02-01 09:33:03.600847', '_unique_id': 'bd67f2559a724b729b1874788cf1bef8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.627 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 59340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '9bd5856e-e1fc-4e40-be78-ffd3cd1c1eef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59340000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:33:03.603648', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '01cf4920-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.846906856, 'message_signature': 'b8d825f2bd1a779e5ada37989a323553ce06871cd010043f77605d8673a04e77'}]}, 'timestamp': '2026-02-01 09:33:03.628424', '_unique_id': '6557f5cc94d843e6937a587a9102af83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.631 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.631 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 530 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'a0dabeb2-1bef-4725-952a-ce2f0f4b2749', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 530, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.631563', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01cfd9c6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '68aede1b6f705a37e3325ac7f6482e2f9d02a8c4ae624cb153424707c5b2f7dd'}]}, 'timestamp': '2026-02-01 09:33:03.632094', '_unique_id': '1f18c30bc7eb4b469707d243100d307f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:33:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.633 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost 
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.634 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.634 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.634 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 197023361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 24174444 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '58cceb82-6803-44a1-9934-376789ba1ef9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197023361, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.634769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01d05a4a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '24eb7f3519833b8d06335c9d3147da991d5d5f5ec202ad516ef31f6369bd6e78'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24174444, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.634769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01d06bb6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '8e19b7c02fe7c82d41570f321232bdbcb3d6bcb559e811e0ba71f314f2bd191b'}]}, 'timestamp': '2026-02-01 09:33:03.635763', '_unique_id': '939f562de85b4b3c90faaa7704ed7ada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '96b7823c-c090-4526-afbe-e36163d23f5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.638010', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01d0d54c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '38140f32f66a655a1d9da55625414c42a23ba6bef2cd99816110775776a417d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.638010', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01d0e6ae-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': 'f33d2685a36be50432edcebfe766c6af4ac6ed8ee1858925dd86d3f779ad6f20'}]}, 'timestamp': '2026-02-01 09:33:03.638914', '_unique_id': '8372ece7b85a46dc8b71db36917a4ef3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.639 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.641 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.643 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a75b8540-32b2-4134-b81a-5cc23840f249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.641373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01d15800-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '383aa47e3ea6d69b98112576dbdeebf9f664a9f061b26f757c7a0b57bee9f259'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.641373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01d16ad4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '181b13c196ce05f4900f9d7582be0c33b2431cf50fd5d0def91f03c43e148711'}]}, 'timestamp': '2026-02-01 09:33:03.642345', '_unique_id': '974974d1c48045268203e09c34246adb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.644 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 89 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
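The pair of chained tracebacks repeated throughout this window has one simple shape: amqp's transport fails at self.sock.connect(sa) with ConnectionRefusedError, and kombu's _reraise_as_library_errors context manager rethrows it as kombu.exceptions.OperationalError with the original kept as __cause__. A minimal illustrative sketch of that reraise pattern (a stand-in, not kombu's actual source; it assumes nothing is listening on local port 5672):

    # Illustrative only: mirrors the reraise pattern visible in the kombu
    # frames above (_reraise_as_library_errors); not kombu's real code.
    import socket
    from contextlib import contextmanager

    class OperationalError(Exception):
        """Stand-in for kombu.exceptions.OperationalError."""

    @contextmanager
    def reraise_as_library_errors(wrapper=OperationalError):
        # Any low-level OS/socket error raised inside the block is rethrown
        # as the library-level error, chaining the original as __cause__.
        try:
            yield
        except OSError as exc:  # ConnectionRefusedError subclasses OSError
            raise wrapper(str(exc)) from exc

    # Connecting to a closed port reproduces the same chained pair seen in
    # this log (assumes port 5672 is closed on this host).
    try:
        with reraise_as_library_errors():
            socket.create_connection(("127.0.0.1", 5672), timeout=1)
    except OperationalError as err:
        print(err, "| caused by:", repr(err.__cause__))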
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe603af9-48c0-4863-b029-43a42bee179c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 89, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.644723', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d1daf0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '86c8bd9cabc07da133395d8da66b60e47356bca9d9edb72582cfefb2205deca1'}]}, 'timestamp': '2026-02-01 09:33:03.645263', '_unique_id': '56190a24e20e4697a9021ee8c8a63d79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.647 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.647 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
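Each failed notification above goes through oslo.messaging's connection pool, which builds a fresh kombu Connection and calls ensure_connection(); while the broker is down, every attempt ends in the same [Errno 111]. The broker can be probed standalone with the same kombu entry points. The URL below is a hypothetical placeholder, since the real transport_url lives in the agent's oslo.messaging configuration and does not appear in this log:

    # Fail-fast connectivity probe for an AMQP broker using kombu.
    from kombu import Connection
    from kombu.exceptions import OperationalError

    url = "amqp://guest:guest@localhost:5672//"  # placeholder, not from this log
    try:
        with Connection(url, connect_timeout=2) as conn:
            # max_retries=1 fails immediately instead of retrying with
            # backoff the way the agent's connection pool does.
            conn.ensure_connection(max_retries=1)
        print("broker reachable")
    except OperationalError as exc:
        print("broker unreachable:", exc)  # e.g. [Errno 111] Connection refused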
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08ce7ff5-1f81-48c8-8d32-9c0e260250f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.647732', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d25246-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '7ccd9b19fda83f3f17ff5418a3a5a2efd4c41880e2aa9e821b5ea215a3186479'}]}, 'timestamp': '2026-02-01 09:33:03.648253', '_unique_id': '33bd5af7cd1a43f594ca0ada417d1f76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.650 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
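The dropped samples are not entirely lost to analysis: the Payload=... text in each "Could not send notification" record is a Python-literal repr of the notification, so the samples can be recovered offline from a captured record. A small sketch, assuming line holds one complete such record copied from this log:

    # Recover telemetry samples from a logged "Could not send notification"
    # record; the Payload dict is a Python literal, so ast.literal_eval
    # parses it safely (it contains only dicts, lists, strings, numbers, None).
    import ast

    def samples_from_log_line(line: str):
        # Text between "Payload=" and the trailing ": kombu.exceptions..."
        payload_repr = line.split("Payload=", 1)[1].rsplit(": kombu.exceptions", 1)[0]
        notification = ast.literal_eval(payload_repr)
        return notification["payload"]["samples"]

    # for s in samples_from_log_line(line):
    #     print(s["counter_name"], s["resource_id"], s["counter_volume"])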
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367b83c8-251f-4d84-ae1a-9716bad6c51f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.650440', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d2ba42-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '261ca2d74eb208e653e555ac55f20f1676d016ff5ed91a7bb327ff8d08539ba1'}]}, 'timestamp': '2026-02-01 09:33:03.651002', '_unique_id': '454ba6c84c584862bdbb828d8d444ea2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.653 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a8e7202-9057-4f09-93b9-a3928dc27b00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.653266', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d3287e-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': 'b44d25b55e9b292a7f09c3b77d076311eaf17f9f1ef495fd5799ba1b518aef30'}]}, 'timestamp': '2026-02-01 09:33:03.653730', '_unique_id': 'c4881675280f46eab810445710615191'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.654 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.655 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.656 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.656 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b55e4db-97fa-445d-b173-07e839192f7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.656017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01d39476-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.798843573, 'message_signature': '97ef4e3e49fdb5ab874e7fb503e0b204b5fd2782c4d4f07dee5fc101428ad9df'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.656017', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01d3a4c0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.798843573, 'message_signature': 'c7ecff41a651de0313034703d623ab75c64db6ed78e9a671e91e76399109b0a7'}]}, 'timestamp': '2026-02-01 09:33:03.656887', '_unique_id': '9581f7e57ce745f9ba2ed7e4470ccc1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.657 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.659 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 9312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd29e7bf8-27f8-4c29-a6ef-5afe65dccb0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9312, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.659225', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d41176-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '4c8232d4476385b4afc56962b7165f8a5a2bddb78e96b81fe5d556fb8ce11410'}]}, 'timestamp': '2026-02-01 09:33:03.659699', '_unique_id': '0c1f2910eade4a5f8317e32918a56930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.660 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.661 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.661 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a8050ac-112d-4ae5-889d-15879a48da2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.661837', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d478a0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '2831eeb9535fb0f4a0f253b0472e8ea0af46649e397c0baa115435dc10fc21a8'}]}, 'timestamp': '2026-02-01 09:33:03.662336', '_unique_id': '2ec02b78aa8340c08ec88e0e4e24e1b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.663 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.664 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.664 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1227122553 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.665 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 165637656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c1eebb7-31e1-4fca-829d-a30b24992bc2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1227122553, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.664547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01d4e10a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '10c6c0520b9881198389978c27c0cc80651f503214be8a56ac30293eb09c726d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165637656, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.664547', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01d4f410-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '0959f5c3c178696684ee0f3c6876ea154327641dd39a4b39e46a5344d6139bdd'}]}, 'timestamp': '2026-02-01 09:33:03.665483', '_unique_id': '5d23294b924f420cadc6d2c7263e5f54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py",
line 826, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.666 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.667 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.667 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 52.3984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7ee68ead-cae4-4979-bae3-e0ffe526aa33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3984375, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:33:03.667567', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '01d553d8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.846906856, 'message_signature': '6ffb4a36806c6f11f2215ee5f6332e110e5de9f5a57f824895acf144495fd861'}]}, 'timestamp': '2026-02-01 09:33:03.667846', '_unique_id': 'b955a109f5d94f6287c4b17944d945a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:33:03.668 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.668 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.669 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.669 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32873c0f-0514-44fa-8f93-fe5fdf04cb4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:33:03.669195', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '01d5937a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.740608966, 'message_signature': '29050729a9492edd0150b958624d284fea434ff331b525fc3f288ca0851dfb61'}]}, 'timestamp': '2026-02-01 09:33:03.669490', '_unique_id': '169546f8ec0a4d57930ff592fc91f44a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 
12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: 
[Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.670 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 
73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.671 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0d43c78-f9f9-45d7-a39d-2bea392a4b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:33:03.670935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '01d5d8d0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '740ace19cfc0b9d590ed166119609d99d87d51b841f7f70e1c802fa0c3602a8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:33:03.670935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '01d5e2b2-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10477.750418129, 'message_signature': '1830d7340e1f0aca96d86b0e3e9123d1641a73414fba7dc14b133c87acbcd41f'}]}, 'timestamp': '2026-02-01 09:33:03.671500', '_unique_id': '10d7a1cf8a464b179cc3217206c2fca7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:33:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:33:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:33:03.672 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:33:03 localhost nova_compute[225632]: 2026-02-01 09:33:03.843 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:33:03 localhost nova_compute[225632]: 2026-02-01 09:33:03.967 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:33:03 localhost nova_compute[225632]: 2026-02-01 09:33:03.968 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:33:03 localhost nova_compute[225632]: 2026-02-01 09:33:03.968 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:33:03 localhost nova_compute[225632]: 2026-02-01 09:33:03.969 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:33:04 localhost python3.9[260667]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:33:04 localhost nova_compute[225632]: 2026-02-01 09:33:04.544 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:33:04 localhost nova_compute[225632]: 2026-02-01 09:33:04.544 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:33:04 localhost nova_compute[225632]: 2026-02-01 09:33:04.544 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:33:04 localhost nova_compute[225632]: 2026-02-01 09:33:04.545 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:33:05 localhost python3.9[260777]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:33:05 localhost sshd[260796]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:33:05 localhost nova_compute[225632]: 2026-02-01 09:33:05.591 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:33:05 localhost nova_compute[225632]: 2026-02-01 09:33:05.612 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:33:05 localhost nova_compute[225632]: 2026-02-01 09:33:05.612 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:33:05 localhost nova_compute[225632]: 2026-02-01 09:33:05.613 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:33:05 localhost python3.9[260890]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:33:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:33:06 localhost podman[260910]: 2026-02-01 09:33:06.282795969 +0000 UTC m=+0.081412133 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 04:33:06 localhost podman[260910]: 2026-02-01 09:33:06.319544193 +0000 UTC m=+0.118160347 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:33:06 localhost nova_compute[225632]: 2026-02-01 09:33:06.325 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:06 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:33:07 localhost python3.9[261021]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20016 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0582B20000000001030307) Feb 1 04:33:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:33:07 localhost podman[261077]: 2026-02-01 09:33:07.740646993 +0000 UTC m=+0.095752147 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:33:07 localhost podman[261077]: 2026-02-01 09:33:07.782614008 +0000 UTC m=+0.137719052 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:33:07 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:33:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:33:08 localhost podman[261152]: 2026-02-01 09:33:08.203794973 +0000 UTC m=+0.086927893 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller) Feb 1 04:33:08 localhost podman[261152]: 2026-02-01 09:33:08.27728041 +0000 UTC m=+0.160413320 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:33:08 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:33:08 localhost python3.9[261153]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:33:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20017 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0586B80000000001030307) Feb 1 04:33:08 localhost nova_compute[225632]: 2026-02-01 09:33:08.868 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33555 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0589B80000000001030307) Feb 1 04:33:10 localhost python3.9[261288]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:33:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20018 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D058EB80000000001030307) Feb 1 04:33:11 localhost nova_compute[225632]: 2026-02-01 09:33:11.368 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7707 DF PROTO=TCP SPT=36000 DPT=9102 SEQ=3839871908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0591B80000000001030307) Feb 1 04:33:12 localhost python3.9[261398]: ansible-ansible.builtin.service_facts Invoked Feb 1 04:33:12 localhost network[261415]: You are 
using 'network' service provided by 'network-scripts', which are now deprecated. Feb 1 04:33:12 localhost network[261416]: 'network-scripts' will be removed from distribution in near future. Feb 1 04:33:12 localhost network[261417]: It is advised to switch to 'NetworkManager' instead for network management. Feb 1 04:33:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:33:13 localhost nova_compute[225632]: 2026-02-01 09:33:13.898 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20019 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D059E780000000001030307) Feb 1 04:33:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:33:14 localhost podman[261515]: 2026-02-01 09:33:14.927132936 +0000 UTC m=+0.083611301 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Feb 1 04:33:14 localhost podman[261515]: 2026-02-01 09:33:14.967624295 +0000 UTC m=+0.124102640 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 04:33:14 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:33:16 localhost nova_compute[225632]: 2026-02-01 09:33:16.406 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:33:18 localhost podman[261578]: 2026-02-01 09:33:18.723184704 +0000 UTC m=+0.084468767 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:33:18 localhost podman[261578]: 2026-02-01 09:33:18.740641653 +0000 UTC m=+0.101925726 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:33:18 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
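Each health_status / exec_died pair above (ovn_metadata_agent, node_exporter, ovn_controller, ceilometer_agent_compute) comes from a transient systemd unit that wraps one invocation of podman healthcheck run <container-id>, which runs the container's configured test (the '/openstack/healthcheck ...' entries in config_data) and exits 0 when it reports healthy. A minimal sketch of invoking the same check directly, assuming podman is on PATH; the container ID is copied from the ceilometer records above:

    # Minimal sketch of what each transient "<container-id>.service" unit
    # above executes. Assumption: podman is on PATH and the container exists.
    import subprocess

    CID = "2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691"

    def is_healthy(cid: str) -> bool:
        # `podman healthcheck run` returns exit code 0 for healthy,
        # non-zero for unhealthy or on error.
        result = subprocess.run(["podman", "healthcheck", "run", cid],
                                capture_output=True, text=True)
        return result.returncode == 0

    print("healthy" if is_healthy(CID) else "unhealthy")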
Feb 1 04:33:18 localhost nova_compute[225632]: 2026-02-01 09:33:18.941 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:20 localhost python3.9[261690]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:33:21 localhost nova_compute[225632]: 2026-02-01 09:33:21.437 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20020 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D05BFB80000000001030307) Feb 1 04:33:23 localhost podman[236886]: time="2026-02-01T09:33:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:33:23 localhost podman[236886]: @ - - [01/Feb/2026:09:33:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:33:23 localhost nova_compute[225632]: 2026-02-01 09:33:23.980 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:24 localhost podman[236886]: @ - - [01/Feb/2026:09:33:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17265 "" "Go-http-client/1.1" Feb 1 04:33:24 localhost python3.9[261802]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 1 04:33:26 localhost python3.9[261912]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Feb 1 04:33:26 localhost nova_compute[225632]: 2026-02-01 09:33:26.496 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:27 localhost python3.9[262022]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:33:27 localhost python3.9[262079]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:28 localhost python3.9[262225]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:29 localhost nova_compute[225632]: 2026-02-01 09:33:29.009 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:29 localhost python3.9[262368]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:33:29 localhost podman[262442]: 2026-02-01 09:33:29.737627952 +0000 UTC m=+0.088224834 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:33:29 localhost podman[262442]: 2026-02-01 09:33:29.752485 +0000 UTC m=+0.103081852 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:33:29 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
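The dm-multipath tasks above do two things: load the module immediately (community.general.modprobe with state=present) and persist it across reboots with a drop-in under /etc/modules-load.d, which systemd-modules-load reads at boot. A rough Python equivalent, assuming it runs as root; paths and the module name match the records above:

    # Rough equivalent of the dm-multipath module tasks above: load now,
    # persist via modules-load.d. Assumption: running as root.
    import pathlib
    import subprocess

    MODULE = "dm-multipath"

    subprocess.run(["modprobe", MODULE], check=True)  # load immediately

    conf = pathlib.Path(f"/etc/modules-load.d/{MODULE}.conf")
    conf.write_text(MODULE + "\n")  # read by systemd-modules-load at boot
    conf.chmod(0o644)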
Feb 1 04:33:29 localhost python3.9[262503]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:33:31 localhost python3.9[262632]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:33:31 localhost openstack_network_exporter[239441]: ERROR 09:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:33:31 localhost openstack_network_exporter[239441]: Feb 1 04:33:31 localhost openstack_network_exporter[239441]: ERROR 09:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:33:31 localhost openstack_network_exporter[239441]: Feb 1 04:33:31 localhost nova_compute[225632]: 2026-02-01 09:33:31.536 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:32 localhost python3.9[262744]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:33:33 localhost python3.9[262855]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:33 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Feb 1 04:33:33 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 04:33:33 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:33:33 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:33:34 localhost nova_compute[225632]: 2026-02-01 09:33:34.016 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:34 localhost python3.9[262966]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:34 localhost python3.9[263076]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:35 localhost python3.9[263186]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:36 localhost python3.9[263296]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:36 localhost nova_compute[225632]: 2026-02-01 09:33:36.581 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
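The four lineinfile tasks above (find_multipaths, recheck_wwid, skip_kpartx, user_friendly_names) all follow the same replace-or-insert-after pattern against /etc/multipath.conf: replace the first line matching regexp if present, otherwise insert the line after the first match of insertafter=^defaults. A simplified sketch of that behavior, not ansible's implementation; the indentation of the inserted lines is a guess at the play's whitespace:

    # Simplified approximation of ansible.builtin.lineinfile with
    # firstmatch=True and insertafter, as used by the four tasks above.
    import re
    from pathlib import Path

    CONF = Path("/etc/multipath.conf")

    def lineinfile(path: Path, regexp: str, line: str, insertafter: str) -> None:
        lines = path.read_text().splitlines()
        pat, anchor = re.compile(regexp), re.compile(insertafter)
        for i, text in enumerate(lines):
            if pat.search(text):
                lines[i] = line  # first existing match is replaced
                break
        else:
            for i, text in enumerate(lines):
                if anchor.search(text):
                    lines.insert(i + 1, line)  # else insert after the anchor
                    break
        path.write_text("\n".join(lines) + "\n")

    for opt, val in [("find_multipaths", "yes"), ("recheck_wwid", "yes"),
                     ("skip_kpartx", "yes"), ("user_friendly_names", "no")]:
        lineinfile(CONF, rf"^\s+{opt}", f"    {opt} {val}", r"^defaults")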
Feb 1 04:33:36 localhost podman[263297]: 2026-02-01 09:33:36.722646759 +0000 UTC m=+0.085271972 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 1 04:33:36 localhost podman[263297]: 2026-02-01 09:33:36.73239328 +0000 UTC m=+0.095018483 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:33:36 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:33:37 localhost python3.9[263425]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:33:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47577 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D05F7E10000000001030307) Feb 1 04:33:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:33:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:33:38 localhost podman[263538]: 2026-02-01 09:33:38.393886086 +0000 UTC m=+0.089077709 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:33:38 localhost podman[263538]: 2026-02-01 09:33:38.406479224 +0000 UTC m=+0.101670837 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:33:38 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:33:38 localhost podman[263539]: 2026-02-01 09:33:38.495731129 +0000 UTC m=+0.186466104 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127) Feb 1 04:33:38 localhost podman[263539]: 2026-02-01 09:33:38.55833925 +0000 UTC m=+0.249074225 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:33:38 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:33:38 localhost python3.9[263537]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:33:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47578 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D05FBF80000000001030307) Feb 1 04:33:39 localhost nova_compute[225632]: 2026-02-01 09:33:39.051 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20021 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D05FFB90000000001030307) Feb 1 04:33:40 localhost python3.9[263696]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:33:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47579 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0603F80000000001030307) Feb 1 04:33:41 localhost python3.9[263808]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 1 04:33:41 localhost nova_compute[225632]: 2026-02-01 09:33:41.624 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33556 DF PROTO=TCP SPT=36090 DPT=9102 SEQ=1830256711 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0607B80000000001030307) Feb 1 04:33:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:33:41.694 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:33:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:33:41.695 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:33:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:33:41.696 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:33:41 localhost python3.9[263918]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Feb 1 04:33:42 localhost python3.9[264028]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:33:43 localhost python3.9[264085]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:43 localhost python3.9[264195]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:33:44 localhost nova_compute[225632]: 2026-02-01 09:33:44.076 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:33:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47580 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0613B80000000001030307) Feb 1 04:33:44 localhost python3.9[264305]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 1 04:33:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:33:45 localhost podman[264308]: 2026-02-01 09:33:45.741209474 +0000 UTC m=+0.098315299 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, distribution-scope=public, architecture=x86_64, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:33:45 localhost podman[264308]: 2026-02-01 09:33:45.757345305 +0000 UTC m=+0.114451140 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter) Feb 1 04:33:45 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
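The health_status and exec_died records above embed each container's config_data as a Python dict literal inside the parenthesized label list. Since the dict is a plain literal, ast.literal_eval can recover it once the balanced-brace span is located. A minimal sketch under the assumption that no '{' or '}' appears inside string values, which holds for the records in this log; the sample record is a shortened, hypothetical fragment:

    # Minimal sketch for recovering config_data from a health_status record.
    import ast

    def extract_config_data(record: str) -> dict:
        start = record.index("config_data=") + len("config_data=")
        depth = 0
        # Scan for the matching closing brace, then parse the literal.
        for i, ch in enumerate(record[start:], start):
            if ch == "{":
                depth += 1
            elif ch == "}":
                depth -= 1
                if depth == 0:
                    return ast.literal_eval(record[start:i + 1])
        raise ValueError("unbalanced config_data braces")

    sample = ("(name=node_exporter, config_data={'net': 'host', "
              "'ports': ['9100:9100'], 'privileged': True}, config_id=node_exporter)")
    print(extract_config_data(sample)["ports"])  # ['9100:9100']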
Feb 1 04:33:46 localhost nova_compute[225632]: 2026-02-01 09:33:46.667 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:33:49 localhost python3.9[264434]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 1 04:33:49 localhost nova_compute[225632]: 2026-02-01 09:33:49.110 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:33:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:33:49 localhost systemd[1]: tmp-crun.Cqp6Nv.mount: Deactivated successfully.
Feb 1 04:33:49 localhost podman[264456]: 2026-02-01 09:33:49.734971462 +0000 UTC m=+0.091477392 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Feb 1 04:33:49 localhost podman[264456]: 2026-02-01 09:33:49.750545966 +0000 UTC m=+0.107051956 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 1 04:33:49 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:33:50 localhost python3.9[264565]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:33:51 localhost nova_compute[225632]: 2026-02-01 09:33:51.669 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:33:51 localhost python3.9[264675]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 1 04:33:51 localhost systemd[1]: Reloading.
Feb 1 04:33:52 localhost systemd-rc-local-generator[264702]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:33:52 localhost systemd-sysv-generator[264707]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
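The ansible.builtin.systemd_service task with daemon_reload=True is what triggers the "Reloading." record, and the reload is when systemd re-parses every unit file, which is why the generator notices and the "Failed to parse service type, ignoring: notify-reload" warnings surface here: the libvirt units ship Type=notify-reload, which this systemd does not recognize (notify-reload was introduced upstream around systemd 253; treat that threshold as an assumption). A small Python sketch that predicts whether a host will log these warnings:

    #!/usr/bin/env python3
    # Sketch: compare the running systemd version against the (assumed) v253
    # threshold where Type=notify-reload became a known service type.
    import subprocess

    def systemd_version() -> int:
        # First line of `systemctl --version` looks like: "systemd 252 (252-13.el9)"
        first = subprocess.run(
            ["systemctl", "--version"], check=True, capture_output=True, text=True,
        ).stdout.splitlines()[0]
        return int(first.split()[1])

    if __name__ == "__main__":
        v = systemd_version()
        if v >= 253:
            print(f"systemd {v}: Type=notify-reload is supported")
        else:
            print(f"systemd {v}: notify-reload units will log a parse warning")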
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:33:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47581 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0633B80000000001030307)
Feb 1 04:33:52 localhost python3.9[264819]: ansible-ansible.builtin.service_facts Invoked
Feb 1 04:33:53 localhost network[264836]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 1 04:33:53 localhost network[264837]: 'network-scripts' will be removed from distribution in near future.
Feb 1 04:33:53 localhost network[264838]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 1 04:33:53 localhost podman[236886]: time="2026-02-01T09:33:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:33:53 localhost podman[236886]: @ - - [01/Feb/2026:09:33:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1"
Feb 1 04:33:54 localhost podman[236886]: @ - - [01/Feb/2026:09:33:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17274 "" "Go-http-client/1.1"
Feb 1 04:33:54 localhost nova_compute[225632]: 2026-02-01 09:33:54.154 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:33:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
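The kernel "DROPPING:" records here and below are netfilter LOG output, evidently from a rule that logs (and presumably drops) inbound TCP SYNs to port 9102 on br-ex, using "DROPPING: " as the log prefix. The payload is fixed key=value pairs, so it is easy to pull apart; a short Python sketch against the exact line format above:

    #!/usr/bin/env python3
    # Sketch: extract interface, endpoints and protocol from a netfilter
    # "DROPPING:" log line. \bSRC= avoids matching inside MACSRC=.
    import re

    DROP_RE = re.compile(
        r"DROPPING: IN=(?P<dev>\S+) .*?"
        r"\bSRC=(?P<src>\S+) DST=(?P<dst>\S+) .*?"
        r"\bPROTO=(?P<proto>\S+) SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+)"
    )

    line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a "
            "MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 "
            "DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47581 DF "
            "PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640")

    m = DROP_RE.search(line)
    if m:
        # -> "br-ex 192.168.122.10:40422 -> 192.168.122.106:9102 (TCP)"
        print(f"{m['dev']} {m['src']}:{m['spt']} -> {m['dst']}:{m['dpt']} ({m['proto']})")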
Feb 1 04:33:54 localhost sshd[264896]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:33:56 localhost nova_compute[225632]: 2026-02-01 09:33:56.670 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:33:58 localhost nova_compute[225632]: 2026-02-01 09:33:58.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:33:58 localhost nova_compute[225632]: 2026-02-01 09:33:58.407 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb 1 04:33:58 localhost python3.9[265072]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:33:59 localhost nova_compute[225632]: 2026-02-01 09:33:59.190 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:33:59 localhost nova_compute[225632]: 2026-02-01 09:33:59.427 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:33:59 localhost nova_compute[225632]: 2026-02-01 09:33:59.428 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:33:59 localhost python3.9[265183]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:00 localhost nova_compute[225632]: 2026-02-01 09:34:00.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:00 localhost nova_compute[225632]: 2026-02-01 09:34:00.408 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:00 localhost nova_compute[225632]: 2026-02-01 09:34:00.408 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb 1 04:34:00 localhost nova_compute[225632]: 2026-02-01 09:34:00.431 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb 1 04:34:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:34:00 localhost podman[265185]: 2026-02-01 09:34:00.739326374 +0000 UTC m=+0.100301000 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:34:00 localhost podman[265185]: 2026-02-01 09:34:00.745139221 +0000 UTC m=+0.106113867 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:34:00 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.430 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.432 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.465 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.466 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.467 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.468 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.468 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:34:01 localhost openstack_network_exporter[239441]: ERROR 09:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:34:01 localhost openstack_network_exporter[239441]:
Feb 1 04:34:01 localhost openstack_network_exporter[239441]: ERROR 09:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:34:01 localhost openstack_network_exporter[239441]:
Feb 1 04:34:01 localhost python3.9[265318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.674 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:01 localhost nova_compute[225632]: 2026-02-01 09:34:01.964 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.046 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.047 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.283 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.284 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11897MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.285 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.285 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:34:02 localhost python3.9[265451]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.495 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.497 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.497 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.571 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.677 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.678 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.695 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.737 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_F16C,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_MMX,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_AMD_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE42,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AESNI,HW_CPU_X86_AVX2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb 1 04:34:02 localhost nova_compute[225632]: 2026-02-01 09:34:02.797 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:34:03 localhost python3.9[265582]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:03 localhost nova_compute[225632]: 2026-02-01 09:34:03.284 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:34:03 localhost nova_compute[225632]: 2026-02-01 09:34:03.290 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 1 04:34:03 localhost nova_compute[225632]: 2026-02-01 09:34:03.311 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 1 04:34:03 localhost nova_compute[225632]: 2026-02-01 09:34:03.313 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 1 04:34:03 localhost nova_compute[225632]: 2026-02-01 09:34:03.314 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.219 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.288 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.310 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.310 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.310 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.546 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.547 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.547 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.548 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.914 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.931 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.932 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.932 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.933 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.933 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:04 localhost nova_compute[225632]: 2026-02-01 09:34:04.933 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:05 localhost nova_compute[225632]: 2026-02-01 09:34:05.063 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:34:05 localhost python3.9[265695]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:06 localhost nova_compute[225632]: 2026-02-01 09:34:06.679 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
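Each update_available_resource pass above follows the same shape: take the "compute_resources" lock, shell out to `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` for RBD capacity, fold in libvirt and PCI data, and reconcile the result with placement (here: inventory unchanged, lock held 1.028s). A Python sketch of just the Ceph step, assuming the same CLI, keyring id and conf path as the logged command; note the top-level "stats" totals in `ceph df` JSON are cluster-wide, not per-pool:

    #!/usr/bin/env python3
    # Sketch: run the same `ceph df` the resource tracker logs and report
    # cluster-wide free space from the top-level "stats" object.
    import json
    import subprocess

    def ceph_free_gib() -> float:
        out = subprocess.run(
            ["ceph", "df", "--format=json",
             "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"],
            check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out)["stats"]["total_avail_bytes"] / 1024 ** 3

    if __name__ == "__main__":
        print(f"cluster free: {ceph_free_gib():.2f} GiB")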
Feb 1 04:34:07 localhost podman[265806]: 2026-02-01 09:34:07.023521799 +0000 UTC m=+0.083662624 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:34:07 localhost podman[265806]: 2026-02-01 09:34:07.032284846 +0000 UTC m=+0.092425691 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 1 04:34:07 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:34:07 localhost python3.9[265807]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51486 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D066D120000000001030307)
Feb 1 04:34:08 localhost python3.9[265936]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:34:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:34:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:34:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51487 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0671380000000001030307)
Feb 1 04:34:08 localhost podman[265987]: 2026-02-01 09:34:08.721253049 +0000 UTC m=+0.079986103 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 04:34:08 localhost podman[265987]: 2026-02-01 09:34:08.759438339 +0000 UTC m=+0.118171403 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 04:34:08 localhost podman[265991]: 2026-02-01 09:34:08.776169908 +0000 UTC m=+0.132304783 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:34:08 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
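node_exporter's healthcheck above runs /openstack/healthcheck inside the container, but since the container is on the host network with port 9100 published (per the config_data above), the same liveness question can be answered from the host by scraping the metrics endpoint. A sketch, assuming the default /metrics path:

    #!/usr/bin/env python3
    # Sketch: treat an HTTP 200 from the metrics endpoint as "exporter is up".
    from urllib.request import urlopen

    def exporter_up(url: str = "http://127.0.0.1:9100/metrics") -> bool:
        try:
            with urlopen(url, timeout=5) as resp:
                return resp.status == 200
        except OSError:
            return False

    if __name__ == "__main__":
        print("node_exporter up:", exporter_up())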
Feb 1 04:34:08 localhost podman[265991]: 2026-02-01 09:34:08.86044749 +0000 UTC m=+0.216582405 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:34:08 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:34:09 localhost python3.9[266094]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:09 localhost nova_compute[225632]: 2026-02-01 09:34:09.220 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47582 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0673B80000000001030307)
Feb 1 04:34:09 localhost python3.9[266204]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:10 localhost python3.9[266314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51488 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0679380000000001030307)
Feb 1 04:34:11 localhost python3.9[266424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:11 localhost nova_compute[225632]: 2026-02-01 09:34:11.682 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:11 localhost python3.9[266534]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20022 DF PROTO=TCP SPT=41462 DPT=9102 SEQ=833891605 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D067DB90000000001030307)
Feb 1 04:34:13 localhost python3.9[266644]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:13 localhost python3.9[266754]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:14 localhost nova_compute[225632]: 2026-02-01 09:34:14.260 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:14 localhost python3.9[266864]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
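The Ansible tasks in this stretch retire each legacy tripleo_nova_* unit in two passes: first ansible.builtin.systemd_service with state=stopped enabled=False, then ansible.builtin.file state=absent on the unit file under /usr/lib/systemd/system and /etc/systemd/system. A rough host-side Python equivalent of that per-unit sequence (a sketch of the pattern, not the Ansible modules themselves):

    #!/usr/bin/env python3
    # Sketch: stop + disable a unit, drop its unit files, then reload systemd.
    import subprocess
    from pathlib import Path

    def retire_unit(name: str) -> None:
        # check=False: the unit may already be stopped, disabled, or absent.
        subprocess.run(["systemctl", "stop", name], check=False)
        subprocess.run(["systemctl", "disable", name], check=False)
        for prefix in ("/usr/lib/systemd/system", "/etc/systemd/system"):
            Path(prefix, name).unlink(missing_ok=True)
        subprocess.run(["systemctl", "daemon-reload"], check=True)

    if __name__ == "__main__":
        retire_unit("tripleo_nova_compute.service")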
Feb 1 04:34:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51489 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0688F80000000001030307)
Feb 1 04:34:15 localhost python3.9[266974]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:15 localhost python3.9[267084]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:34:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:34:16 localhost nova_compute[225632]: 2026-02-01 09:34:16.686 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:34:16 localhost podman[267173]: 2026-02-01 09:34:16.716184677 +0000 UTC m=+0.077345292 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, container_name=openstack_network_exporter)
Feb 1 04:34:16 localhost podman[267173]: 2026-02-01 09:34:16.733544156 +0000 UTC m=+0.094704741 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1769056855, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible)
Feb 1 04:34:16 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:34:17 localhost python3.9[267214]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:34:17 localhost python3.9[267324]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:34:18 localhost python3.9[267434]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:34:18 localhost python3.9[267544]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:34:19 localhost nova_compute[225632]: 2026-02-01 09:34:19.289 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
Feb 1 04:34:20 localhost podman[267655]: 2026-02-01 09:34:20.182401547 +0000 UTC m=+0.087540241 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:34:20 localhost podman[267655]: 2026-02-01 09:34:20.193412542 +0000 UTC m=+0.098551186 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:34:20 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:34:20 localhost python3.9[267654]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:34:20 localhost python3.9[267783]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:34:21 localhost nova_compute[225632]: 2026-02-01 09:34:21.690 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:21 localhost python3.9[267893]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:22 localhost python3.9[268003]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 1 04:34:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51490 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06A9B90000000001030307) Feb 1 04:34:23 localhost python3.9[268113]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 1 04:34:23 localhost systemd[1]: Reloading. 
Feb 1 04:34:23 localhost podman[236886]: time="2026-02-01T09:34:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:34:23 localhost podman[236886]: @ - - [01/Feb/2026:09:34:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:34:24 localhost podman[236886]: @ - - [01/Feb/2026:09:34:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17282 "" "Go-http-client/1.1" Feb 1 04:34:24 localhost systemd-rc-local-generator[268139]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:34:24 localhost systemd-sysv-generator[268143]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:34:24 localhost nova_compute[225632]: 2026-02-01 09:34:24.294 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:25 localhost python3.9[268259]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:25 localhost python3.9[268370]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:26 localhost python3.9[268481]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:26 localhost nova_compute[225632]: 
2026-02-01 09:34:26.695 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:27 localhost python3.9[268592]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:27 localhost python3.9[268703]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:28 localhost python3.9[268814]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:29 localhost python3.9[268925]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:29 localhost nova_compute[225632]: 2026-02-01 09:34:29.297 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:30 localhost python3.9[269072]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:34:31 localhost systemd[1]: tmp-crun.pVU6i9.mount: Deactivated successfully. 
Feb 1 04:34:31 localhost podman[269084]: 2026-02-01 09:34:31.026422917 +0000 UTC m=+0.109190461 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:34:31 localhost podman[269084]: 2026-02-01 09:34:31.038365799 +0000 UTC m=+0.121133333 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:34:31 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:34:31 localhost openstack_network_exporter[239441]: ERROR 09:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:34:31 localhost openstack_network_exporter[239441]: Feb 1 04:34:31 localhost openstack_network_exporter[239441]: ERROR 09:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:34:31 localhost openstack_network_exporter[239441]: Feb 1 04:34:31 localhost nova_compute[225632]: 2026-02-01 09:34:31.697 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:34 localhost nova_compute[225632]: 2026-02-01 09:34:34.303 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:35 localhost python3.9[269309]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:36 localhost python3.9[269419]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:36 localhost nova_compute[225632]: 2026-02-01 09:34:36.702 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:36 localhost python3.9[269529]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:34:37 localhost podman[269585]: 2026-02-01 09:34:37.385408844 +0000 UTC m=+0.099864397 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Feb 1 04:34:37 localhost podman[269585]: 2026-02-01 09:34:37.418944084 +0000 UTC m=+0.133399617 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true) Feb 1 04:34:37 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:34:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10559 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06E2410000000001030307) Feb 1 04:34:37 localhost python3.9[269657]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:38 localhost python3.9[269767]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10560 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06E6380000000001030307) Feb 1 04:34:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:34:38 localhost systemd[1]: tmp-crun.PBJsbm.mount: Deactivated successfully. Feb 1 04:34:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:34:38 localhost podman[269878]: 2026-02-01 09:34:38.974954095 +0000 UTC m=+0.100486296 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:34:38 localhost podman[269878]: 2026-02-01 09:34:38.987565969 +0000 UTC m=+0.113098170 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:34:39 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:34:39 localhost python3.9[269877]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:39 localhost systemd[1]: tmp-crun.OI8Avd.mount: Deactivated successfully. Feb 1 04:34:39 localhost podman[269901]: 2026-02-01 09:34:39.080089031 +0000 UTC m=+0.091440131 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:34:39 localhost podman[269901]: 2026-02-01 09:34:39.156176844 +0000 UTC m=+0.167527954 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:34:39 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:34:39 localhost nova_compute[225632]: 2026-02-01 09:34:39.303 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51491 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06E9B80000000001030307) Feb 1 04:34:39 localhost python3.9[270034]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:40 localhost python3.9[270144]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10561 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06EE390000000001030307) Feb 1 04:34:40 localhost python3.9[270254]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47583 DF PROTO=TCP SPT=40422 DPT=9102 SEQ=710153072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06F1B90000000001030307) Feb 1 04:34:41 localhost python3.9[270364]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:34:41.696 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:34:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:34:41.697 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:34:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:34:41.698 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:34:41 localhost nova_compute[225632]: 2026-02-01 09:34:41.706 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:44 localhost nova_compute[225632]: 2026-02-01 09:34:44.306 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10562 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D06FDF80000000001030307) Feb 1 04:34:46 localhost nova_compute[225632]: 2026-02-01 09:34:46.711 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:34:47 localhost systemd[1]: tmp-crun.KXvT7v.mount: Deactivated successfully. Feb 1 04:34:47 localhost podman[270420]: 2026-02-01 09:34:47.732242159 +0000 UTC m=+0.095287248 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.7, architecture=x86_64) Feb 1 04:34:47 localhost podman[270420]: 2026-02-01 09:34:47.747360968 +0000 UTC m=+0.110406087 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1769056855, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:34:47 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:34:48 localhost python3.9[270495]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Feb 1 04:34:49 localhost nova_compute[225632]: 2026-02-01 09:34:49.309 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:50 localhost sshd[270514]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:34:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:34:50 localhost systemd-logind[759]: New session 60 of user zuul. Feb 1 04:34:50 localhost systemd[1]: Started Session 60 of User zuul. 
Feb 1 04:34:50 localhost podman[270516]: 2026-02-01 09:34:50.560898868 +0000 UTC m=+0.097586518 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 1 04:34:50 localhost podman[270516]: 2026-02-01 09:34:50.60044075 +0000 UTC m=+0.137128360 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Feb 1 04:34:50 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:34:50 localhost systemd[1]: session-60.scope: Deactivated successfully. Feb 1 04:34:50 localhost systemd-logind[759]: Session 60 logged out. Waiting for processes to exit. Feb 1 04:34:50 localhost systemd-logind[759]: Removed session 60. Feb 1 04:34:51 localhost python3.9[270644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:34:51 localhost nova_compute[225632]: 2026-02-01 09:34:51.715 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:51 localhost python3.9[270730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938490.898867-2336-230650771456739/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:52 localhost nova_compute[225632]: 2026-02-01 09:34:52.286 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:34:52 localhost nova_compute[225632]: 2026-02-01 09:34:52.307 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Triggering sync for uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 1 04:34:52 localhost nova_compute[225632]: 2026-02-01 09:34:52.308 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:34:52 localhost nova_compute[225632]: 2026-02-01 09:34:52.309 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:34:52 localhost nova_compute[225632]: 2026-02-01 09:34:52.345 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock
"08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:34:52 localhost python3.9[270838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:34:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10563 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D071DB80000000001030307) Feb 1 04:34:53 localhost python3.9[270893]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:53 localhost python3.9[271001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:34:53 localhost podman[236886]: time="2026-02-01T09:34:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:34:53 localhost podman[236886]: @ - - [01/Feb/2026:09:34:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:34:53 localhost podman[236886]: @ - - [01/Feb/2026:09:34:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17273 "" "Go-http-client/1.1" Feb 1 04:34:54 localhost python3.9[271087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938493.1563368-2336-236073705639607/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:54 localhost nova_compute[225632]: 2026-02-01 09:34:54.359 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:54 localhost python3.9[271195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:34:55 localhost python3.9[271281]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938494.3839722-2336-74433358012362/.source.conf 
follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=81fac5bfb76f59376b169cd323b581eaa2259497 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:55 localhost python3.9[271389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:34:56 localhost python3.9[271475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938495.4991872-2336-195599488468434/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:56 localhost nova_compute[225632]: 2026-02-01 09:34:56.719 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:34:57 localhost python3.9[271583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:34:57 localhost python3.9[271669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938496.6191041-2336-14598032264898/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:34:59 localhost nova_compute[225632]: 2026-02-01 09:34:59.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:34:59 localhost nova_compute[225632]: 2026-02-01 09:34:59.408 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:34:59 localhost nova_compute[225632]: 2026-02-01 09:34:59.409 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:00 localhost nova_compute[225632]: 2026-02-01 09:35:00.408 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:35:00 localhost python3.9[271779]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:35:01 localhost podman[271890]: 2026-02-01 09:35:01.299417499 +0000 UTC m=+0.084817559 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:35:01 localhost podman[271890]: 2026-02-01 09:35:01.311473565 +0000 UTC m=+0.096873655 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:35:01 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:35:01 localhost python3.9[271889]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:35:01 localhost openstack_network_exporter[239441]: ERROR 09:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:35:01 localhost openstack_network_exporter[239441]:
Feb 1 04:35:01 localhost openstack_network_exporter[239441]: ERROR 09:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:35:01 localhost openstack_network_exporter[239441]:
Feb 1 04:35:01 localhost nova_compute[225632]: 2026-02-01 09:35:01.721 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:35:02 localhost python3.9[272022]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.406 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.407 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:35:03 localhost python3.9[272134]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.509 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.510 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.510 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.511 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:35:03 localhost nova_compute[225632]: 2026-02-01 09:35:03.511 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.523 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.523 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.546 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 60630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd036d06e-24b5-43d4-967b-9b19f9ff567d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60630000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:35:03.523899', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '49497c58-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.765786013, 'message_signature': 'e77a90c11e5bd954fa10f2495ef7e02b0de40de92c510eaf767803f5a175f9d3'}]}, 'timestamp': '2026-02-01 09:35:03.547378', '_unique_id': '231405f36e7a43d7bba4a4437d605877'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.548 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.549 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.582 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.582 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f1bffd2e-350c-4a8c-9b20-d1b20da1b829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.549084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '494ed70c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '39cf004e86b52ac5ce42118598e5648e1b03ccd60957f64d9302191759c3507c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.549084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '494ee01c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '4d26698ea5b550e3582f310a86ec1c84646a99a263b886ccd724d51d20491d4f'}]}, 'timestamp': '2026-02-01 09:35:03.582626', '_unique_id': '10d8dd672b0c45808b1d03247b66461c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.583 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 572 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fb403e83-a1e8-435d-8f65-45e61273f00a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 572, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.583851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '494f1866-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '1c6c3ff0e53476168a1ff276b13870d787f80dbfbfed7b32565dfa08ca773c8d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.583851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '494f21c6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': 'd262644eb328676a6c6e97f0fa86168e3c9abfaef1fab0649f060ba95e397ab9'}]}, 'timestamp': '2026-02-01 09:35:03.584296', '_unique_id': 'e1a6a0657edd450682f6940dd66d1f6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.584 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 9312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'c221a384-67b8-421f-b3d9-2526beb9673b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9312, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.585321', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '494f9b56-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '24af840387589bb40450e55cebbf810208f9330488aefc010192c083d92c28ef'}]}, 'timestamp': '2026-02-01 09:35:03.587423', '_unique_id': 'c3d301f43fcd4f1684935d0780c5c97a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.587 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.588 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
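The two-part traceback above is the generic shape of this failure: the socket-level ConnectionRefusedError raised in amqp/transport.py is re-raised by kombu's _reraise_as_library_errors() as the library-level kombu.exceptions.OperationalError that oslo.messaging then reports. A minimal sketch that reproduces the same wrapping against an unreachable broker (the URL is an assumption for illustration):

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Assumed broker URL; any address with nothing listening behaves the same.
    conn = Connection("amqp://guest:guest@127.0.0.1:5672//", connect_timeout=2)
    try:
        # ensure_connection() drives retry_over_time() and then re-raises the
        # underlying ConnectionRefusedError as the library error seen above.
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print(f"publish would fail: {exc}")  # [Errno 111] Connection refused
    finally:
        conn.release()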
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '1b8c68c1-a643-4a9a-b052-c30aa817437c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.588474', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '494fcce8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '1c8e4d959e6597b6392d7bbf8052a1cea87a33833403f8df49005a446dd6bf05'}]}, 'timestamp': '2026-02-01 09:35:03.588762', '_unique_id': 'ba39f3df2091411cbfce9dc80f1404f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... traceback at 09:35:03.589 identical to the 09:35:03.587 traceback above; ends in kombu.exceptions.OperationalError: [Errno 111] Connection refused ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 89 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
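The Payload= blobs logged with each failure are Python dict reprs (single-quoted keys, None literals), not JSON, so json.loads() will not parse them but ast.literal_eval() will. A small sketch for pulling the sample counters out of such a blob, trimmed to a few keys for brevity:

    import ast

    def iter_counters(payload_repr: str):
        """Parse one logged Payload= blob and yield its sample counters."""
        payload = ast.literal_eval(payload_repr)
        for sample in payload["payload"]["samples"]:
            yield (sample["counter_name"], sample["counter_type"],
                   sample["counter_volume"], sample["counter_unit"])

    blob = ("{'message_id': '83315680-72f5-4f2d-ad84-8f1801011eeb', "
            "'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', "
            "'priority': 'SAMPLE', 'payload': {'samples': [{'counter_name': "
            "'network.incoming.packets', 'counter_type': 'cumulative', "
            "'counter_unit': 'packet', 'counter_volume': 89, 'project_name': None}]}}")

    for name, ctype, volume, unit in iter_counters(blob):
        print(f"{name} ({ctype}): {volume} {unit}")
    # -> network.incoming.packets (cumulative): 89 packet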
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '83315680-72f5-4f2d-ad84-8f1801011eeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 89, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.589741', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '494ffe66-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '95c8eca71c4130d38e4995d56fabaff8109b3a39c3e50342ea8f9c38f53ac3d0'}]}, 'timestamp': '2026-02-01 09:35:03.590057', '_unique_id': 'fa4e163b3a224ebcabe8ef9045fa2326'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... traceback at 09:35:03.590 identical to the 09:35:03.587 traceback above; ends in kombu.exceptions.OperationalError: [Errno 111] Connection refused ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.590 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.591 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
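Note the counter types mixed across these payloads: 'cumulative' counters (e.g. network.incoming.packets, volume 89 above) only grow across polls, 'delta' counters carry the change since the previous poll, and 'gauge' counters are point-in-time values. A toy example of deriving a per-interval delta from two cumulative readings; the second reading is invented for illustration:

    # 89 comes from the DEBUG record above; 132 is an assumed follow-up value.
    prev_poll = {"network.incoming.packets": 89}
    curr_poll = {"network.incoming.packets": 132}

    for name, value in curr_poll.items():
        delta = value - prev_poll.get(name, 0)
        print(f"{name}: +{delta} packets since the previous poll")
    # -> network.incoming.packets: +43 packets since the previous poll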
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'c62752ff-faa0-44a3-ac11-d514f37f4b2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.591044', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '4950316a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': 'ce405b6617b5834730a6c1d028e82ded2779eb1a7f68b6722726006369bc183a'}]}, 'timestamp': '2026-02-01 09:35:03.591263', '_unique_id': 'e7d604ccaaf843a2a7e6311b55c0b527'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... traceback at 09:35:03.591 identical to the 09:35:03.587 traceback above; ends in kombu.exceptions.OperationalError: [Errno 111] Connection refused ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.592 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
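For reference, these records come from oslo.messaging's notifier path: a SAMPLE-priority notification with publisher_id 'ceilometer.polling' and event_type 'telemetry.polling' is handed to the messaging driver, whose connection pool is where the failing ensure_connection() lives. A sketch (not ceilometer's actual code; the transport URL and driver name are assumptions) of emitting such a notification:

    from oslo_config import cfg
    import oslo_messaging

    conf = cfg.CONF
    # Transport URL is an assumption for this sketch.
    transport = oslo_messaging.get_notification_transport(
        conf, url="rabbit://guest:guest@127.0.0.1:5672/")
    notifier = oslo_messaging.Notifier(
        transport,
        publisher_id="ceilometer.polling",   # matches the records above
        driver="messaging",                  # the oslo_messaging.notify.messaging driver
        topics=["notifications"])            # the destination named in the errors

    # sample() emits priority 'SAMPLE'; with the broker down, this is the call
    # path that ends in kombu.exceptions.OperationalError as logged above.
    notifier.sample({}, "telemetry.polling", {"samples": []})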
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '07d3b702-eaf4-44d0-8e0e-b238744ef173', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.592433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4951e8de-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.811838984, 'message_signature': 'bb8abb2501e874623f7f911fd8eb924fdc1c23255aba6bb21ac62d55a3602c2b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.592433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4951f112-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.811838984, 'message_signature': '36351bbdcc2e561bbe0a63dbf758ca2ce364e5bb711f36e80a3955c7912f5564'}]}, 'timestamp': '2026-02-01 09:35:03.602709', '_unique_id': '673817f9fa0645379b80d45faf0eda39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... traceback at 09:35:03.603 identical to the 09:35:03.587 traceback above; ends in kombu.exceptions.OperationalError: [Errno 111] Connection refused ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1227122553 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 165637656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Payload={'message_id': 'b1b5c2ab-5e14-48a7-89ce-e4781f37f974', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1227122553, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.603766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '49522240-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': 'f80276cb9a42848a94a0991f95023d9bc07aede0ff9e8dac5fd91943b3f9b74a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165637656, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.603766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '49522aa6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '8dd17d37030d0805f28e9eb2b1efd9300a4004d1b72c83eccdcbb2d8980a106a'}]}, 'timestamp': '2026-02-01 09:35:03.604185', '_unique_id': '0a1ada1700a1470fa23b2e4c1fcd315b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.605 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.605 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.605 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f159e47e-3ab8-46c8-bb4c-a13262f6e536', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.605223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '49525b48-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.811838984, 'message_signature': 'c14db5c54002d7e963f45d6c39620d4ddf7a959abdea742579fd155c70403374'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.605223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '495262c8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.811838984, 'message_signature': '9383a7616920b7ad485cfcdccf2758199cb04a8548c8540ae13457621fd28bfa'}]}, 'timestamp': '2026-02-01 09:35:03.605619', '_unique_id': 'ceb854fe092c43e8abfa3bfd466ec1ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.606 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '71af0187-829b-4aab-863d-5f4a28c3e2ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.606604', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '4952913a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '46ed46219a08204ade7261fd405d611e00db1dda450a03eb8f3acb9bf60e11df'}]}, 'timestamp': '2026-02-01 09:35:03.606822', '_unique_id': '768afc1ed52d4648b1461bb41cdad63b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.607 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 52.3984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5ba83278-4327-446a-b849-b1ff0a1a94ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3984375, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:35:03.607794', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4952bf7a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.765786013, 'message_signature': '05b29e9d75288fd14dd9170c60ae8b8235be2445d2e373536f269108408322c6'}]}, 'timestamp': '2026-02-01 09:35:03.608014', '_unique_id': '535f11a9699043c290519700937410b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:35:03.608 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.608 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '913af5ad-63b4-4890-b468-8ce9bcd5b731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.609040', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '4952f062-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '15bb23a1a1573fd1c229d635b7056b896e809e9cc1c39a97921e67c836cf2267'}]}, 'timestamp': '2026-02-01 09:35:03.609259', '_unique_id': '64b2fcbe82ea4232876380763ed13e94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.609 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.610 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 197023361 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 24174444 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbcf013a-2ac3-47d4-ade8-dc510b417280', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 197023361, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.610233', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '49531eca-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '6f21034440f6eee9e9bfacfc6b1b5956553f67bbef0fa59f9ca6b13a8ab21d43'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24174444, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.610233', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4953267c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': 'f790961dc1e279cc343a0a9421fc583737c484c7fd7c6c6795dff46fc2619181'}]}, 'timestamp': '2026-02-01 09:35:03.610631', '_unique_id': '8fbd2ce6e39a4ca18b452457720f9b24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError / kombu.exceptions.OperationalError traceback repeated for this notification; omitted, see the full traceback above ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.611 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78398320-b691-448d-b3a8-03f46f2d80c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.611619', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '495354f8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '76b692fb4f38a305911cd0b29ab3afb5ed116e98c77e7c2658aa1339b8763883'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.611619', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '49535c96-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '886bb6d87561fdb9b69df89f631c62a00436964061c22541597ecdb809b0680f'}]}, 'timestamp': '2026-02-01 09:35:03.612063', '_unique_id': '47d4a32f4b7a4bedb3e31cd3cd986bde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError / kombu.exceptions.OperationalError traceback repeated for this notification; omitted, see the full traceback above ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.613 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f7fc435-07d4-4733-bbc9-060778aa3bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.613087', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '49538ebe-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': 'd305242343873db38a80915326310e8edf7f0b3c6a7e07a545a8bfc404c20b7f'}]}, 'timestamp': '2026-02-01 09:35:03.613320', '_unique_id': '29e63370d3424b94b10e114cdc2e705f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError / kombu.exceptions.OperationalError traceback repeated for this notification; omitted, see the full traceback above ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a11c733-76bc-47df-a86b-045cdb2cce89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.614359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4953c00a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.811838984, 'message_signature': '49040423e3a1652de03c8e51e2dc881fc947d939d2226eea25da18da6e73025b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.614359', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4953c7b2-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.811838984, 'message_signature': 'cfb7b790b760a897bb947aedf6984b0c9c6d978e046b932a9ad354721ac3f7c1'}]}, 'timestamp': '2026-02-01 09:35:03.614777', '_unique_id': '16ddbf71919a4d1b944d51e0ae4091bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError / kombu.exceptions.OperationalError traceback repeated for this notification; omitted, see the full traceback above ...]
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.615 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '7d6c0b25-858e-4c5f-beeb-78aaf8d04de2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.615772', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '4953f75a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '509c34dcffb0ee782dfbe1d780f7ef08ec71645e36e53a74ee70348d00c20ad8'}]}, 'timestamp': '2026-02-01 09:35:03.615994', '_unique_id': 'e9c3377b345445cfa009d219e74e297f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '59bd24e8-a0a7-409f-84bc-ecf68703395b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.617092', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '49542af4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': 'c3dbc638f554352fcf6d60a293a3927ee59d05fea194f67c16ecf30f2972b3f6'}]}, 'timestamp': '2026-02-01 09:35:03.617312', '_unique_id': 'f3406f79eb714ccc93cbe662839b3ea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.617 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f9dc1034-e280-48bc-a144-cec0c4afe4ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:35:03.618433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '49545f38-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '924e1cb95b5b73fbe9be3e217852061861ab71b851f67fae2cd8b11cdc6dd6ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:35:03.618433', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '495466e0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.768495516, 'message_signature': '89892e85d8943bea989f63b3e32f66ed9833bffa31f7659a6edb3ce258b82d9a'}]}, 'timestamp': '2026-02-01 09:35:03.618832', '_unique_id': '384d6bc522744c7ab234770932879062'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '71aaea4c-68f0-443c-afa3-b2e8986f7caf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:35:03.619894', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '495498e0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10597.804730567, 'message_signature': '8c6c07ac8dd41fa0fcd315e6040c0e6a3b4042c2d3e56b24843fb6f5d19299b3'}]}, 'timestamp': '2026-02-01 09:35:03.620127', '_unique_id': 'e4294f378b87412fbbcd7f4348c6e3d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:35:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:35:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:35:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:35:03.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.007 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.087 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.088 225636 DEBUG nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.315 225636 WARNING nova.virt.libvirt.driver [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.317 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11903MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.317 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.318 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.387 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.388 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.388 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.426 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.441 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:04 localhost python3.9[272264]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.884 225636 DEBUG oslo_concurrency.processutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.892 225636 DEBUG nova.compute.provider_tree [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.907 225636 DEBUG nova.scheduler.client.report [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 09:35:04.910 225636 DEBUG nova.compute.resource_tracker [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:35:04 localhost nova_compute[225632]: 2026-02-01 
09:35:04.911 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:35:05 localhost python3.9[272396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:35:05 localhost python3.9[272451]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:35:05 localhost nova_compute[225632]: 2026-02-01 09:35:05.912 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:35:05 localhost nova_compute[225632]: 2026-02-01 09:35:05.913 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:35:05 localhost nova_compute[225632]: 2026-02-01 09:35:05.913 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:35:05 localhost nova_compute[225632]: 2026-02-01 09:35:05.913 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:35:06 localhost python3.9[272559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 1 04:35:06 localhost nova_compute[225632]: 2026-02-01 09:35:06.584 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:35:06 localhost nova_compute[225632]: 2026-02-01 09:35:06.584 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:35:06 localhost nova_compute[225632]: 2026-02-01 09:35:06.585 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully 
refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:35:06 localhost nova_compute[225632]: 2026-02-01 09:35:06.585 225636 DEBUG nova.objects.instance [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:35:06 localhost nova_compute[225632]: 2026-02-01 09:35:06.757 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:06 localhost nova_compute[225632]: 2026-02-01 09:35:06.995 225636 DEBUG nova.network.neutron [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:35:07 localhost python3.9[272614]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 1 04:35:07 localhost nova_compute[225632]: 2026-02-01 09:35:07.014 225636 DEBUG oslo_concurrency.lockutils [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:35:07 localhost nova_compute[225632]: 2026-02-01 09:35:07.015 225636 DEBUG nova.compute.manager [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:35:07 localhost nova_compute[225632]: 2026-02-01 09:35:07.016 225636 DEBUG oslo_service.periodic_task [None req-2cc1f7b4-7a8b-4379-9c40-3e7b47fe06bf - - - - - -] 
Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:35:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29092 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0757720000000001030307) Feb 1 04:35:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:35:07 localhost podman[272670]: 2026-02-01 09:35:07.722443444 +0000 UTC m=+0.081638163 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:35:07 localhost podman[272670]: 2026-02-01 09:35:07.729471357 +0000 UTC m=+0.088666046 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Feb 1 04:35:07 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:35:08 localhost python3.9[272740]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Feb 1 04:35:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29093 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D075B790000000001030307) Feb 1 04:35:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:35:09 localhost podman[272851]: 2026-02-01 09:35:09.216889463 +0000 UTC m=+0.091814911 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:35:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
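The ceilometer_agent_compute traceback that opens this excerpt bottoms out in socket.connect() raising "[Errno 111] Connection refused", which kombu's _reraise_as_library_errors then rewraps as kombu.exceptions.OperationalError: the notifier simply cannot reach its RabbitMQ endpoint, and oslo.messaging hits the same failure again on each polling cycle. A minimal sketch of the underlying failure, assuming nothing is listening on the port (5672 is RabbitMQ's default; the actual broker address used by this deployment is not shown in the excerpt):

    import errno
    import socket

    s = socket.socket()
    try:
        # Mirrors the bottom frame of the traceback above
        # (amqp/transport.py: self.sock.connect(sa)).
        s.connect(("127.0.0.1", 5672))
    except ConnectionRefusedError as exc:
        # errno 111 (ECONNREFUSED) is the "[Errno 111]" in the traceback;
        # kombu catches it and raises OperationalError from it.
        assert exc.errno == errno.ECONNREFUSED
    finally:
        s.close()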
Feb 1 04:35:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10564 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D075DB80000000001030307) Feb 1 04:35:09 localhost podman[272851]: 2026-02-01 09:35:09.262378477 +0000 UTC m=+0.137303935 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:35:09 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
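The kernel "DROPPING:" records above (and the further ones below) are netfilter LOG-target output; "DROPPING: " is the rule's configured log prefix, not a kernel message of its own. Successive records keep the same SRC/DST, SPT=46240, DPT=9102 and SEQ while only the IP ID advances, i.e. they are retransmissions of a single blocked SYN to port 9102. A small sketch for pulling the key=value fields out of such a record for triage (the field names are the standard LOG-target ones; the helper name is ours):

    import re

    record = ("DROPPING: IN=br-ex OUT= SRC=192.168.122.10 DST=192.168.122.106 "
              "LEN=60 TTL=62 ID=29093 DF PROTO=TCP SPT=46240 DPT=9102 "
              "SEQ=3484246408 ACK=0 WINDOW=32640 SYN")

    def nf_fields(line):
        # LOG output is whitespace-separated KEY=VALUE tokens; bare flags
        # such as DF and SYN carry no '=' and are skipped here.
        return dict(m.groups() for m in re.finditer(r"(\w+)=(\S*)", line))

    f = nf_fields(record)
    print(f["SRC"], "->", f["DST"], "dport", f["DPT"], "seq", f["SEQ"])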
Feb 1 04:35:09 localhost podman[272874]: 2026-02-01 09:35:09.32103273 +0000 UTC m=+0.083576962 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2) Feb 1 04:35:09 localhost python3.9[272850]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 1 04:35:09 localhost podman[272874]: 2026-02-01 09:35:09.417503382 +0000 UTC m=+0.180047584 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:35:09 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
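Each healthcheck burst above follows the same four-record cycle: systemd starts a transient unit running "podman healthcheck run <container-id>", podman executes the container's configured test (the 'healthcheck' entry in config_data, e.g. /openstack/healthcheck) and logs a health_status event, the exec session ends as an exec_died event, and the unit deactivates. Field order inside the event's parenthesised label dump varies between records, so a robust reader matches each key independently. A sketch, assuming the events are consumed as plain text lines (the sample string is abbreviated; the helper name is ours):

    import re

    evt = ("container health_status b130bd144cc2... (image=quay.io/..., "
           "name=ovn_controller, health_status=healthy, managed_by=edpm_ansible)")

    def label(event_line, key):
        # Labels are comma-separated key=value pairs in arbitrary order; the
        # leading delimiter keeps 'name' from matching inside 'container_name'.
        m = re.search(rf"(?:^|[\s(]){re.escape(key)}=([^,)]+)", event_line)
        return m.group(1) if m else None

    print(label(evt, "name"), label(evt, "health_status"))
    # -> ovn_controller healthy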
Feb 1 04:35:09 localhost nova_compute[225632]: 2026-02-01 09:35:09.443 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:10 localhost python3[273007]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Feb 1 04:35:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29094 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0763790000000001030307) Feb 1 04:35:10 localhost python3[273007]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",#012 "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:31:38.534497001Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1214548351,#012 "VirtualSize": 1214548351,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",#012 
"sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 1 04:35:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51492 DF PROTO=TCP SPT=57666 DPT=9102 SEQ=3004973848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0767B80000000001030307) Feb 1 04:35:11 localhost nova_compute[225632]: 2026-02-01 09:35:11.791 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:11 localhost python3.9[273180]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:35:13 localhost python3.9[273292]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Feb 1 04:35:13 localhost python3.9[273402]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 1 04:35:14 localhost nova_compute[225632]: 2026-02-01 09:35:14.491 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29095 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0773380000000001030307) Feb 1 04:35:15 localhost python3[273512]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Feb 1 04:35:15 localhost python3[273512]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",#012 "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-01-30T06:31:38.534497001Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1214548351,#012 "VirtualSize": 1214548351,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",#012 "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",#012 "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",#012 "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",#012 "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260127",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-01-28T05:56:51.126388624Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:51.126459235Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260127\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-28T05:56:53.726938221Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-01-30T06:10:18.890429494Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890534417Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890553228Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890570688Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890616649Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:18.890659121Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:19.232761948Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 
"empty_layer": true#012 },#012 {#012 "created": "2026-01-30T06:10:52.670543613Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 1 04:35:16 localhost python3.9[273682]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 1 04:35:16 localhost nova_compute[225632]: 2026-02-01 09:35:16.794 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:17 localhost python3.9[273794]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:35:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:35:18 localhost podman[273857]: 2026-02-01 09:35:18.727693723 +0000 UTC m=+0.076293900 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:35:18 localhost podman[273857]: 2026-02-01 09:35:18.770508676 +0000 UTC m=+0.119108883 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 1 04:35:18 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
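The inventory that nova reported to placement earlier in this excerpt ({'VCPU': {'total': 8, 'reserved': 0, ..., 'allocation_ratio': 16.0}, ...}) determines schedulable capacity per resource class as (total - reserved) * allocation_ratio. A worked check against the figures copied from that record (values from the log; only the arithmetic is ours):

    inventory = {
        "VCPU":      {"total": 8,     "reserved": 0,   "allocation_ratio": 16.0},
        "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 41,    "reserved": 1,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        cap = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {cap:g} schedulable")
    # VCPU: 128 schedulable  (8 physical vCPUs, overcommitted 16x)
    # MEMORY_MB: 15226 schedulable
    # DISK_GB: 40 schedulable

This is consistent with the "Final resource view" record above: 8 total_vcpus with 1 used, against a much larger schedulable pool once the 16.0 CPU allocation ratio is applied.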
Feb 1 04:35:19 localhost python3.9[273925]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769938517.517902-3026-52507674427566/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:35:19 localhost nova_compute[225632]: 2026-02-01 09:35:19.528 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:35:19 localhost python3.9[273980]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 1 04:35:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:35:20 localhost podman[274000]: 2026-02-01 09:35:20.731959192 +0000 UTC m=+0.093627658 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:35:20 localhost podman[274000]: 2026-02-01 09:35:20.745470613 +0000 UTC m=+0.107139079 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 1 04:35:20 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:35:21 localhost nova_compute[225632]: 2026-02-01 09:35:21.795 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:35:22 localhost python3.9[274108]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:35:23 localhost python3.9[274216]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:35:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29096 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0793B90000000001030307)
Feb 1 04:35:23 localhost python3.9[274324]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 1 04:35:23 localhost podman[236886]: time="2026-02-01T09:35:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:35:23 localhost podman[236886]: @ - - [01/Feb/2026:09:35:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1"
Feb 1 04:35:23 localhost podman[236886]: @ - - [01/Feb/2026:09:35:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17270 "" "Go-http-client/1.1"
Feb 1 04:35:24 localhost nova_compute[225632]: 2026-02-01 09:35:24.572 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:35:24 localhost python3.9[274434]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 1 04:35:25 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 103.0 (343 of 333 items), suggesting rotation.
Feb 1 04:35:25 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 1 04:35:25 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 1 04:35:25 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 1 04:35:25 localhost python3.9[274567]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 1 04:35:25 localhost systemd[1]: Stopping nova_compute container...
Feb 1 04:35:26 localhost nova_compute[225632]: 2026-02-01 09:35:26.043 225636 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m
Feb 1 04:35:26 localhost nova_compute[225632]: 2026-02-01 09:35:26.832 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:35:29 localhost nova_compute[225632]: 2026-02-01 09:35:29.608 225636 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:35:31 localhost nova_compute[225632]: 2026-02-01 09:35:31.414 225636 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m
Feb 1 04:35:31 localhost nova_compute[225632]: 2026-02-01 09:35:31.416 225636 DEBUG oslo_concurrency.lockutils [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:35:31 localhost nova_compute[225632]: 2026-02-01 09:35:31.416 225636 DEBUG oslo_concurrency.lockutils [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:35:31 localhost nova_compute[225632]: 2026-02-01 09:35:31.417 225636 DEBUG oslo_concurrency.lockutils [None req-6a350d75-fe2f-45d1-8f23-e2db767dacce - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:35:31 localhost openstack_network_exporter[239441]: ERROR 09:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:35:31 localhost openstack_network_exporter[239441]:
Feb 1 04:35:31 localhost openstack_network_exporter[239441]: ERROR 09:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:35:31 localhost openstack_network_exporter[239441]:
Feb 1 04:35:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
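The Acquiring/Acquired/Releasing "singleton_lock" trio above is oslo.concurrency's standard DEBUG lock logging (lockutils.py lines 312, 315 and 333 in the entries). A minimal sketch that reproduces the same pattern, assuming only that oslo.concurrency is installed; the lock name is copied from the log:

    import logging

    from oslo_concurrency import lockutils

    # DEBUG level makes lockutils emit the same Acquiring/Acquired/Releasing
    # messages seen in the nova_compute entries above.
    logging.basicConfig(level=logging.DEBUG)

    # lockutils.lock() is a context manager: entering logs the acquire,
    # leaving logs the release.
    with lockutils.lock("singleton_lock"):
        pass  # critical section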
Feb 1 04:35:31 localhost podman[274585]: 2026-02-01 09:35:31.72070502 +0000 UTC m=+0.078163196 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:35:31 localhost podman[274585]: 2026-02-01 09:35:31.75554398 +0000 UTC m=+0.113002196 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:35:31 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:35:31 localhost journal[202460]: End of file while reading data: Input/output error
Feb 1 04:35:31 localhost systemd[1]: libpod-eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d.scope: Deactivated successfully.
Feb 1 04:35:31 localhost systemd[1]: libpod-eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d.scope: Consumed 19.957s CPU time.
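The config_data label in the podman_exporter events above records the dict that edpm_ansible used to launch the container. A rough illustration of how those keys map onto a podman command line; config_data_to_podman_args is a hypothetical helper written for this sketch, not part of edpm_ansible, and the values are copied from the entry above:

    # Hypothetical helper, for illustration only: maps config_data keys
    # onto podman run flags.
    def config_data_to_podman_args(name, cfg):
        args = ["podman", "run", "--detach", "--name", name]
        if cfg.get("privileged"):
            args.append("--privileged")
        if "net" in cfg:
            args += ["--network", cfg["net"]]
        if "user" in cfg:
            args += ["--user", cfg["user"]]
        if "restart" in cfg:
            args += ["--restart", cfg["restart"]]
        for key, value in cfg.get("environment", {}).items():
            args += ["--env", f"{key}={value}"]
        for port in cfg.get("ports", []):
            args += ["--publish", port]
        for volume in cfg.get("volumes", []):
            args += ["--volume", volume]
        args.append(cfg["image"])
        return args

    # Values taken from the podman_exporter config_data above.
    cfg = {
        "environment": {"CONTAINER_HOST": "unix:///run/podman/podman.sock",
                        "OS_ENDPOINT_TYPE": "internal"},
        "image": "quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd",
        "net": "host",
        "ports": ["9882:9882"],
        "privileged": True,
        "restart": "always",
        "user": "root",
        "volumes": ["/run/podman/podman.sock:/run/podman/podman.sock:rw,z",
                    "/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z"],
    }
    print(" ".join(config_data_to_podman_args("podman_exporter", cfg)))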
Feb 1 04:35:31 localhost podman[274571]: 2026-02-01 09:35:31.858484539 +0000 UTC m=+5.866313022 container died eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Feb 1 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d-userdata-shm.mount: Deactivated successfully.
Feb 1 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2-merged.mount: Deactivated successfully.
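The container died event above, together with the cleanup, init and start entries that follow, is the lifecycle podman reports while systemd restarts edpm_nova_compute.service. A small sketch for replaying that lifecycle from the podman event log; it assumes the podman CLI is available (as it is throughout this log) and uses documented podman-events flags:

    import subprocess

    # Print recent lifecycle events (died, cleanup, init, start) for the
    # nova_compute container; --stream=false exits after flushing instead
    # of following new events.
    subprocess.run(
        ["podman", "events",
         "--filter", "container=nova_compute",
         "--since", "10m",
         "--stream=false"],
        check=True,
    )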
Feb 1 04:35:32 localhost podman[274571]: 2026-02-01 09:35:32.017290776 +0000 UTC m=+6.025119259 container cleanup eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Feb 1 04:35:32 localhost podman[274571]: nova_compute Feb 1 04:35:32 localhost podman[274636]: error opening file `/run/crun/eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d/status`: No such file or directory Feb 1 04:35:32 localhost podman[274625]: 2026-02-01 09:35:32.105706184 +0000 UTC m=+0.059060446 container cleanup eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:35:32 localhost podman[274625]: nova_compute Feb 1 04:35:32 localhost 
systemd[1]: edpm_nova_compute.service: Deactivated successfully. Feb 1 04:35:32 localhost systemd[1]: Stopped nova_compute container. Feb 1 04:35:32 localhost systemd[1]: Starting nova_compute container... Feb 1 04:35:32 localhost systemd[1]: Started libcrun container. Feb 1 04:35:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50335b3aef2dc7c956f076c63f876d868e187cdff53ecb08c14772ed64bf3dd2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:32 localhost podman[274638]: 2026-02-01 09:35:32.265813431 +0000 UTC m=+0.102681133 container init eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3) Feb 1 04:35:32 localhost podman[274638]: 2026-02-01 09:35:32.274844446 +0000 UTC m=+0.111712138 container start eb1dd5767f794ac195b0aa33c503ccf2dcea5a41ebbc2adad9684a802cae057d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, 
container_name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm) Feb 1 04:35:32 localhost podman[274638]: nova_compute Feb 1 04:35:32 localhost nova_compute[274651]: + sudo -E kolla_set_configs Feb 1 04:35:32 localhost systemd[1]: Started nova_compute container. Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Validating config file Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying service configuration files Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to 
/etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /etc/ceph Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Creating directory /etc/ceph Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/ceph Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Writing out command to execute Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:35:32 localhost nova_compute[274651]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 1 04:35:32 localhost nova_compute[274651]: ++ cat /run_command Feb 1 04:35:32 localhost nova_compute[274651]: + CMD=nova-compute Feb 1 04:35:32 localhost nova_compute[274651]: + ARGS= Feb 1 04:35:32 localhost nova_compute[274651]: + sudo kolla_copy_cacerts Feb 1 04:35:32 localhost nova_compute[274651]: + [[ ! -n '' ]] Feb 1 04:35:32 localhost nova_compute[274651]: + . 
kolla_extend_start Feb 1 04:35:32 localhost nova_compute[274651]: Running command: 'nova-compute' Feb 1 04:35:32 localhost nova_compute[274651]: + echo 'Running command: '\''nova-compute'\''' Feb 1 04:35:32 localhost nova_compute[274651]: + umask 0022 Feb 1 04:35:32 localhost nova_compute[274651]: + exec nova-compute Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.112 274655 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.113 274655 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.113 274655 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.113 274655 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.239 274655 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.261 274655 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.261 274655 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.648 274655 INFO nova.virt.driver [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.768 274655 INFO nova.compute.provider_config [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.776 274655 DEBUG oslo_concurrency.lockutils [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.776 274655 DEBUG oslo_concurrency.lockutils [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.776 274655 DEBUG oslo_concurrency.lockutils [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.777 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.777 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.777 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.777 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.777 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.777 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost 
nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.778 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] console_host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.779 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cpu_allocation_ratio = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.780 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] enabled_ssl_apis = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.781 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] host = np0005604212.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.782 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.783 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.784 
274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.785 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] max_concurrent_live_migrations = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.786 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.787 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] network_allocate_retries = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.788 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - 
- - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.789 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.790 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] reserved_huge_pages = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.791 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.792 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.793 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - 
- - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.794 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.795 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] web = /usr/share/spice-html5 log_opt_values 
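The DEBUG records above come from oslo.config's ConfigOpts.log_opt_values() (the cfg.py:2602 frames), which oslo.service invokes once at service start to dump every registered option together with its effective value. A minimal sketch of that mechanism follows; the two option names mirror the dump above, but the registration code itself is illustrative, not nova's actual setup:

    import logging

    from oslo_config import cfg

    CONF = cfg.CONF
    CONF.register_opts([
        cfg.BoolOpt('periodic_enable', default=True),   # seen above: periodic_enable = True
        cfg.IntOpt('report_interval', default=10),      # seen above: report_interval = 10
    ])

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF([])                                 # parse an (empty) command line
    CONF.log_opt_values(LOG, logging.DEBUG)  # emits "option = value" lines like those above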
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.796 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.797 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.enable_instance_password = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.798 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.799 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG 
oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.800 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.enable_socket_keepalive = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.801 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.802 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.803 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.804 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None 
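From this point on the dump switches from the flat [DEFAULT] options to dotted names such as oslo_concurrency.lock_path, cache.memcache_servers, and cinder.auth_type (the cfg.py:2609 frames). Each dotted prefix is an oslo.config option group: it maps to the INI section of the same name in the service's configuration file and is read in code as an attribute of CONF. A sketch, with the group and option names taken from the dump and the defaults illustrative:

    from oslo_config import cfg

    CONF = cfg.CONF
    cache_group = cfg.OptGroup(name='cache', title='Caching options')
    CONF.register_group(cache_group)
    CONF.register_opts([
        cfg.ListOpt('memcache_servers', default=['localhost:11211']),
        cfg.BoolOpt('tls_enabled', default=False),
    ], group=cache_group)

    CONF([])
    # Resolves from the [cache] section of the config file, falling back to
    # the registered default -- which is what the dump above is showing.
    print(CONF.cache.memcache_servers)   # ['localhost:11211']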
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.805 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.806 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.807 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.808 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.808 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.808 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.808 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.808 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.808 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.809 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.810 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - 
-] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.811 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.812 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f 
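Values printed as **** in this dump (transport_url and cache.backend_argument above, and the database.connection entries below) are not literal asterisks in the configuration: options registered with secret=True are redacted by log_opt_values() so that credentials never reach the journal. A sketch of the masking, using a placeholder connection string rather than any real one:

    import logging

    from oslo_config import cfg

    CONF = cfg.CONF
    db_group = cfg.OptGroup(name='database')
    CONF.register_group(db_group)
    CONF.register_opts([
        # secret=True is what produces "database.connection = ****" below;
        # the URL here is a placeholder, not the deployment's real value.
        cfg.StrOpt('connection', secret=True,
                   default='mysql+pymysql://user:password@db.example/nova'),
    ], group=db_group)

    logging.basicConfig(level=logging.DEBUG)
    CONF([])
    CONF.log_opt_values(logging.getLogger(__name__), logging.DEBUG)
    # -> "database.connection = ****"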
- - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.813 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.814 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.815 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost 
nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.816 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.817 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.slave_connection = **** 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.818 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.819 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.819 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.819 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.819 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.819 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.819 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.820 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.821 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.822 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.822 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.822 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.822 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.822 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.822 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.823 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 
274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.824 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.825 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.826 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.826 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.826 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.826 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.826 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.826 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 
274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.827 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.828 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG 
oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.829 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.830 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 
2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.831 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.832 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.retry_delay = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.833 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.834 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.835 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None 
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.836 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.837 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service 
[None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.838 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 
09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.839 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.840 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.841 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.841 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.841 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.gid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.841 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.841 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.841 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.842 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.842 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.842 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.842 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.842 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.842 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG 
oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.843 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.844 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.845 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] 
libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.845 274655 WARNING oslo_config.cfg [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 1 04:35:34 localhost nova_compute[274651]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 1 04:35:34 localhost nova_compute[274651]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 1 04:35:34 localhost nova_compute[274651]: and ``live_migration_inbound_addr`` respectively.
Feb 1 04:35:34 localhost nova_compute[274651]: ). Its value may be silently ignored in the future.#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.845 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.845 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.845 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.845 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None
req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.846 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.847 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rbd_secret_uuid = 33fac0b9-80c7-560f-918a-c92d3021ca1e log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.848 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.849 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG 
oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.850 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.851 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.852 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.853 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.854 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.855 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.856 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.857 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.858 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.859 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.860 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.861 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.862 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.863 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.864 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.864 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.864 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.864 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.864 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.864 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.865 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.866 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.867 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.868 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.868 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.868 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.868 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.868 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.868 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.869 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.869 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.869 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.869 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.869 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.870 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.871 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.872 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.873 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.873 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.873 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.873 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.873 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.873 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.874 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.875 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.876 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.877 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.878 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.879 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.880 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.880 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.880 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.880 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.880 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.880 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.881 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.882 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.883 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.883 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.883 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.883 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.883 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.883 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.884 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 1 04:35:34 localhost
nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.885 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.886 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.886 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.886 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.886 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.886 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f 
- - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.887 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.888 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 
1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.889 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] 
oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.890 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.891 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] 
oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.892 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.893 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.894 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.895 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - 
-] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.896 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service 
[None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.897 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.898 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 
localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.899 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.900 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost 
nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.901 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - 
- - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.902 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.903 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.904 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.904 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.904 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 1 04:35:34 localhost 
nova_compute[274651]: 2026-02-01 09:35:34.904 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.904 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.904 274655 DEBUG oslo_service.service [None req-59b407b1-fd1b-412b-b80d-f0eebc69765f - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.905 274655 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.917 274655 INFO nova.virt.node [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Determined node identity a04bda90-8ccd-4104-8518-038544ff1327 from /var/lib/nova/compute_id#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.918 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.918 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.918 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.919 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.929 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.931 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.932 274655 INFO nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Connection event '1' reason 'None'#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.940 274655 INFO nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Libvirt host capabilities [multi-line <capabilities> XML; markup lost in this capture, recoverable values: host UUID 9037fad6-143b-4373-b625-f89bce657827; CPU arch x86_64, model EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; memory 16116612 KiB (4029153 pages of 4 KiB, 0 of 2048 KiB, 0 of 1048576 KiB); security models selinux (labels system_u:system_r:svirt_t:s0, system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107); hvm guest support for wordsize 32 and 64 via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc), pc-q35-rhel7.6.0, pc-q35-rhel8.0.0 through pc-q35-rhel8.6.0, pc-q35-rhel9.0.0, pc-q35-rhel9.2.0, pc-q35-rhel9.4.0, pc-q35-rhel9.6.0 and pc-q35-rhel9.8.0 (alias q35)]#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.942 274655 DEBUG nova.virt.libvirt.volume.mount [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.946 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Feb 1 04:35:34 localhost nova_compute[274651]: 2026-02-01 09:35:34.952 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [multi-line <domainCapabilities> XML; markup lost in this capture, recoverable values: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-q35-rhel9.8.0, arch i686; loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom, pflash; readonly yes/no; secure no); host CPU model EPYC-Rome, vendor AMD; named CPU models 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 ... remainder of the dump truncated in this capture]
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Cascadelake-Server-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Cascadelake-Server-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Cascadelake-Server-v4 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Cascadelake-Server-v5 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: ClearwaterForest Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost 
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: ClearwaterForest-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost 
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Conroe Feb 1 04:35:34 localhost nova_compute[274651]: Conroe-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Cooperlake Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Cooperlake-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Cooperlake-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Denverton Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Denverton-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Denverton-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 
04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Denverton-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Dhyana Feb 1 04:35:34 localhost nova_compute[274651]: Dhyana-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Dhyana-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Genoa Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Genoa-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: 
EPYC-Genoa-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-IBPB Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Milan Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Milan-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Milan-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Milan-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: 
Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Rome Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Rome-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Rome-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Rome-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Rome-v4 Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Rome-v5 Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Turin Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-Turin-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 
localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-v1 Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-v2 Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-v4 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: EPYC-v5 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: GraniteRapids Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost 
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: GraniteRapids-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: GraniteRapids-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 
localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: GraniteRapids-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost 
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-IBRS Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-noTSX Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-noTSX-IBRS Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 
localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Haswell-v4 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-noTSX Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v1 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost 
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v2 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v3 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v4 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v5 Feb 1 04:35:34 localhost 
nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v6 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Icelake-Server-v7 Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:34 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 
Opteron_G5-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Penryn Feb 1 04:35:35 localhost nova_compute[274651]: Penryn-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 
04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge Feb 1 04:35:35 localhost nova_compute[274651]: 
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Westmere Feb 1 04:35:35 localhost nova_compute[274651]: Westmere-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Westmere-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Westmere-v2 Feb 1 04:35:35 localhost nova_compute[274651]: athlon Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: athlon-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: core2duo Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: core2duo-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: coreduo Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: coreduo-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: kvm32 Feb 1 04:35:35 localhost nova_compute[274651]: kvm32-v1 Feb 1 04:35:35 localhost nova_compute[274651]: kvm64 Feb 1 04:35:35 localhost nova_compute[274651]: kvm64-v1 Feb 1 04:35:35 localhost nova_compute[274651]: n270 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: n270-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: pentium Feb 1 04:35:35 localhost nova_compute[274651]: pentium-v1 Feb 1 04:35:35 localhost nova_compute[274651]: pentium2 Feb 1 04:35:35 localhost nova_compute[274651]: pentium2-v1 Feb 1 04:35:35 localhost nova_compute[274651]: pentium3 Feb 1 04:35:35 localhost nova_compute[274651]: pentium3-v1 Feb 1 04:35:35 localhost nova_compute[274651]: phenom Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: phenom-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: qemu32 Feb 1 04:35:35 localhost nova_compute[274651]: qemu32-v1 Feb 1 04:35:35 localhost nova_compute[274651]: qemu64 Feb 1 04:35:35 localhost nova_compute[274651]: qemu64-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: file Feb 1 04:35:35 localhost nova_compute[274651]: anonymous Feb 1 04:35:35 localhost nova_compute[274651]: memfd Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: disk Feb 1 04:35:35 localhost nova_compute[274651]: cdrom Feb 1 04:35:35 localhost nova_compute[274651]: floppy Feb 1 04:35:35 localhost nova_compute[274651]: lun Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: fdc Feb 1 04:35:35 localhost nova_compute[274651]: scsi Feb 1 04:35:35 localhost nova_compute[274651]: virtio Feb 1 04:35:35 localhost nova_compute[274651]: usb Feb 1 04:35:35 localhost nova_compute[274651]: sata Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: virtio Feb 1 04:35:35 localhost nova_compute[274651]: virtio-transitional Feb 1 04:35:35 localhost nova_compute[274651]: virtio-non-transitional Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: vnc Feb 1 04:35:35 localhost nova_compute[274651]: egl-headless Feb 1 04:35:35 localhost nova_compute[274651]: dbus Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: subsystem Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: default Feb 1 04:35:35 localhost nova_compute[274651]: mandatory Feb 1 04:35:35 localhost nova_compute[274651]: requisite Feb 1 04:35:35 localhost nova_compute[274651]: optional Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: usb Feb 1 04:35:35 localhost nova_compute[274651]: pci Feb 1 04:35:35 localhost nova_compute[274651]: scsi Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: virtio Feb 1 04:35:35 localhost nova_compute[274651]: virtio-transitional Feb 1 04:35:35 localhost nova_compute[274651]: virtio-non-transitional Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: random Feb 1 04:35:35 localhost nova_compute[274651]: egd Feb 1 04:35:35 localhost nova_compute[274651]: builtin Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: path Feb 1 04:35:35 localhost nova_compute[274651]: handle Feb 1 04:35:35 localhost nova_compute[274651]: virtiofs Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: tpm-tis Feb 1 04:35:35 localhost nova_compute[274651]: tpm-crb Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: emulator Feb 1 04:35:35 localhost nova_compute[274651]: external Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 2.0 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: usb Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: pty Feb 1 04:35:35 localhost nova_compute[274651]: unix Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: qemu Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: builtin Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: default Feb 1 04:35:35 localhost nova_compute[274651]: passt Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: isa Feb 1 04:35:35 localhost nova_compute[274651]: hyperv Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: null Feb 1 04:35:35 localhost nova_compute[274651]: vc Feb 1 04:35:35 localhost nova_compute[274651]: pty Feb 1 04:35:35 localhost nova_compute[274651]: dev Feb 1 04:35:35 localhost nova_compute[274651]: file Feb 1 04:35:35 localhost nova_compute[274651]: pipe Feb 1 04:35:35 localhost nova_compute[274651]: stdio Feb 1 04:35:35 localhost nova_compute[274651]: udp Feb 1 04:35:35 localhost nova_compute[274651]: tcp Feb 1 04:35:35 localhost nova_compute[274651]: unix Feb 1 04:35:35 localhost nova_compute[274651]: qemu-vdagent Feb 1 04:35:35 localhost nova_compute[274651]: dbus Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: relaxed Feb 1 04:35:35 localhost nova_compute[274651]: vapic Feb 1 04:35:35 localhost nova_compute[274651]: spinlocks Feb 1 04:35:35 localhost nova_compute[274651]: vpindex Feb 1 04:35:35 localhost nova_compute[274651]: runtime Feb 1 04:35:35 localhost nova_compute[274651]: synic Feb 1 04:35:35 localhost nova_compute[274651]: stimer Feb 1 04:35:35 localhost nova_compute[274651]: reset Feb 1 04:35:35 localhost nova_compute[274651]: vendor_id Feb 1 04:35:35 localhost nova_compute[274651]: frequencies Feb 1 04:35:35 localhost nova_compute[274651]: reenlightenment Feb 1 04:35:35 localhost nova_compute[274651]: tlbflush Feb 1 04:35:35 localhost nova_compute[274651]: ipi Feb 1 04:35:35 localhost nova_compute[274651]: avic Feb 1 04:35:35 localhost nova_compute[274651]: emsr_bitmap Feb 1 04:35:35 localhost nova_compute[274651]: xmm_input Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 4095 Feb 1 04:35:35 localhost 
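The dumps above and below are the raw libvirt domain-capabilities documents that nova fetches once per (emulator, arch, machine type, virt type) tuple via _get_domain_capabilities() in nova/virt/libvirt/host.py. A minimal sketch of reproducing the same query outside nova, assuming the standard libvirt-python bindings on the host and reusing the connection URI and emulator path that appear in this log:

    #!/usr/bin/env python3
    # Sketch: fetch and parse the same domain-capabilities document that
    # nova logs here. Assumes libvirt-python is installed and a local
    # libvirtd is running; URI and paths mirror the values in this log.
    import xml.etree.ElementTree as ET
    import libvirt

    conn = libvirt.open("qemu:///system")
    try:
        # Same tuple the next log message reports: /usr/libexec/qemu-kvm,
        # arch=i686, machine_type=pc, virt type kvm.
        caps_xml = conn.getDomainCapabilities(
            "/usr/libexec/qemu-kvm", "i686", "pc", "kvm", 0
        )
    finally:
        conn.close()

    root = ET.fromstring(caps_xml)
    # The bare CPU model names in the log are <model> elements under
    # <cpu><mode name='custom'>; the intact XML also carries a 'usable'
    # attribute that the stripped capture loses.
    print("host-model:", root.findtext("./cpu/mode[@name='host-model']/model"))
    for model in root.findall("./cpu/mode[@name='custom']/model"):
        print(f"{model.text:28s} usable={model.get('usable')}")

The same document, with its markup intact, can be pulled at a shell with: virsh domcapabilities --emulatorbin /usr/libexec/qemu-kvm --arch i686 --machine pc --virttype kvm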
Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:34.958 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 1 04:35:35 localhost nova_compute[274651]: [domain capabilities XML; markup not captured, recoverable values follow]
Feb 1 04:35:35 localhost nova_compute[274651]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686
Feb 1 04:35:35 localhost nova_compute[274651]: os loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom pflash; readonly: yes no; secure: no
Feb 1 04:35:35 localhost nova_compute[274651]: host-passthrough migratable: on off; maximum migratable: on off
Feb 1 04:35:35 localhost nova_compute[274651]: host-model CPU: EPYC-Rome (vendor AMD)
Feb 1 04:35:35 localhost nova_compute[274651]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 ClearwaterForest ClearwaterForest-v1 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-Genoa-v2 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Milan-v3 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-Rome-v5 EPYC-Turin EPYC-Turin-v1 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-v5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-noTSX Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-noTSX Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v6 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v7 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Penryn Feb 1 04:35:35 localhost nova_compute[274651]: Penryn-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 
04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 
04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 
1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
[extraction-garbled span compressed: tail of a multi-line libvirt domain capabilities XML dump from nova_compute[274651] at Feb 1 04:35:35; the XML markup was stripped during extraction, leaving only element text interleaved with repeated empty log-line prefixes. Recoverable CPU model names, in original order: Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1]
[recoverable capability enum value groups from the same dump, in original order (element and attribute names lost): file, anonymous, memfd | disk, cdrom, floppy, lun | ide, fdc, scsi, virtio, usb, sata | virtio, virtio-transitional, virtio-non-transitional | vnc, egl-headless, dbus | subsystem | default, mandatory, requisite, optional | usb, pci, scsi | virtio, virtio-transitional, virtio-non-transitional | random, egd, builtin | path, handle, virtiofs | tpm-tis, tpm-crb | emulator, external | 2.0 | usb | pty, unix | qemu | builtin | default, passt | isa, hyperv | null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus | relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input | 4095, on, off, off, Linux KVM Hv]
Feb 1 04:35:35 localhost nova_compute[274651]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.017 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.023 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
[extraction-garbled span compressed: the machine_type=q35 domain capabilities XML dump, markup likewise stripped. Recoverable values: emulator path /usr/libexec/qemu-kvm; domain kvm; machine pc-q35-rhel9.8.0; arch x86_64; firmware efi; loader paths /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; enum groups rom, pflash | yes, no | yes, no | on, off | on, off; host-model CPU EPYC-Rome, vendor AMD]
[recoverable q35 CPU model names, in original order: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1, Icelake-Server-v2, Icelake-Server-v3, Icelake-Server-v4, Icelake-Server-v5, Icelake-Server-v6, Icelake-Server-v7, … (list continues past this span)]
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4-v1 Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Penryn Feb 1 04:35:35 localhost nova_compute[274651]: Penryn-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: SapphireRapids-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v2 Feb 1 04:35:35 localhost nova_compute[274651]: 
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SierraForest-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Client-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Skylake-Server-v5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Snowridge-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Westmere Feb 1 04:35:35 localhost nova_compute[274651]: Westmere-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Westmere-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Westmere-v2 Feb 1 04:35:35 localhost nova_compute[274651]: athlon Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: athlon-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: core2duo Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: core2duo-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: coreduo Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: coreduo-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: kvm32 Feb 1 04:35:35 localhost nova_compute[274651]: kvm32-v1 Feb 1 04:35:35 localhost nova_compute[274651]: kvm64 Feb 1 04:35:35 localhost nova_compute[274651]: kvm64-v1 Feb 1 04:35:35 localhost nova_compute[274651]: n270 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: n270-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: pentium Feb 1 04:35:35 localhost nova_compute[274651]: pentium-v1 Feb 1 04:35:35 localhost nova_compute[274651]: pentium2 Feb 1 04:35:35 localhost nova_compute[274651]: pentium2-v1 Feb 1 04:35:35 localhost nova_compute[274651]: pentium3 Feb 1 04:35:35 localhost nova_compute[274651]: pentium3-v1 Feb 1 04:35:35 localhost nova_compute[274651]: phenom Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: phenom-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: qemu32 Feb 1 04:35:35 localhost nova_compute[274651]: qemu32-v1 Feb 1 04:35:35 localhost nova_compute[274651]: qemu64 Feb 1 04:35:35 localhost nova_compute[274651]: qemu64-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: file Feb 1 04:35:35 localhost nova_compute[274651]: anonymous Feb 1 04:35:35 localhost nova_compute[274651]: memfd Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: disk Feb 1 04:35:35 localhost nova_compute[274651]: cdrom Feb 1 04:35:35 localhost nova_compute[274651]: floppy Feb 1 04:35:35 localhost nova_compute[274651]: lun Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: fdc Feb 1 04:35:35 localhost nova_compute[274651]: scsi Feb 1 04:35:35 localhost 
nova_compute[274651]: virtio Feb 1 04:35:35 localhost nova_compute[274651]: usb Feb 1 04:35:35 localhost nova_compute[274651]: sata Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: virtio Feb 1 04:35:35 localhost nova_compute[274651]: virtio-transitional Feb 1 04:35:35 localhost nova_compute[274651]: virtio-non-transitional Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: vnc Feb 1 04:35:35 localhost nova_compute[274651]: egl-headless Feb 1 04:35:35 localhost nova_compute[274651]: dbus Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: subsystem Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: default Feb 1 04:35:35 localhost nova_compute[274651]: mandatory Feb 1 04:35:35 localhost nova_compute[274651]: requisite Feb 1 04:35:35 localhost nova_compute[274651]: optional Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: usb Feb 1 04:35:35 localhost nova_compute[274651]: pci Feb 1 04:35:35 localhost nova_compute[274651]: scsi Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: virtio Feb 1 04:35:35 localhost nova_compute[274651]: virtio-transitional Feb 1 04:35:35 localhost nova_compute[274651]: virtio-non-transitional Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: random Feb 1 04:35:35 localhost nova_compute[274651]: egd Feb 1 04:35:35 localhost nova_compute[274651]: builtin Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: path Feb 1 04:35:35 localhost nova_compute[274651]: handle Feb 1 04:35:35 localhost nova_compute[274651]: virtiofs Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: tpm-tis Feb 1 04:35:35 localhost nova_compute[274651]: tpm-crb Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: emulator Feb 1 04:35:35 localhost nova_compute[274651]: external Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 2.0 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 
04:35:35 localhost nova_compute[274651]: usb Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: pty Feb 1 04:35:35 localhost nova_compute[274651]: unix Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: qemu Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: builtin Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: default Feb 1 04:35:35 localhost nova_compute[274651]: passt Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: isa Feb 1 04:35:35 localhost nova_compute[274651]: hyperv Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: null Feb 1 04:35:35 localhost nova_compute[274651]: vc Feb 1 04:35:35 localhost nova_compute[274651]: pty Feb 1 04:35:35 localhost nova_compute[274651]: dev Feb 1 04:35:35 localhost nova_compute[274651]: file Feb 1 04:35:35 localhost nova_compute[274651]: pipe Feb 1 04:35:35 localhost nova_compute[274651]: stdio Feb 1 04:35:35 localhost nova_compute[274651]: udp Feb 1 04:35:35 localhost nova_compute[274651]: tcp Feb 1 04:35:35 localhost nova_compute[274651]: unix Feb 1 04:35:35 localhost nova_compute[274651]: qemu-vdagent Feb 1 04:35:35 localhost nova_compute[274651]: dbus Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: relaxed Feb 1 04:35:35 localhost nova_compute[274651]: vapic Feb 1 04:35:35 localhost nova_compute[274651]: spinlocks Feb 1 04:35:35 localhost nova_compute[274651]: vpindex Feb 1 04:35:35 localhost nova_compute[274651]: runtime Feb 1 04:35:35 localhost nova_compute[274651]: synic Feb 1 04:35:35 localhost nova_compute[274651]: stimer Feb 1 04:35:35 localhost nova_compute[274651]: reset Feb 1 04:35:35 localhost nova_compute[274651]: vendor_id Feb 1 04:35:35 localhost nova_compute[274651]: frequencies Feb 1 04:35:35 localhost 
nova_compute[274651]: reenlightenment Feb 1 04:35:35 localhost nova_compute[274651]: tlbflush Feb 1 04:35:35 localhost nova_compute[274651]: ipi Feb 1 04:35:35 localhost nova_compute[274651]: avic Feb 1 04:35:35 localhost nova_compute[274651]: emsr_bitmap Feb 1 04:35:35 localhost nova_compute[274651]: xmm_input Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 4095 Feb 1 04:35:35 localhost nova_compute[274651]: on Feb 1 04:35:35 localhost nova_compute[274651]: off Feb 1 04:35:35 localhost nova_compute[274651]: off Feb 1 04:35:35 localhost nova_compute[274651]: Linux KVM Hv Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.092 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: /usr/libexec/qemu-kvm Feb 1 04:35:35 localhost nova_compute[274651]: kvm Feb 1 04:35:35 localhost nova_compute[274651]: pc-i440fx-rhel7.6.0 Feb 1 04:35:35 localhost nova_compute[274651]: x86_64 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: rom Feb 1 04:35:35 localhost nova_compute[274651]: pflash Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: yes Feb 1 04:35:35 localhost nova_compute[274651]: no Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: no Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: on Feb 1 04:35:35 localhost nova_compute[274651]: off Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: on Feb 1 04:35:35 localhost nova_compute[274651]: off Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome Feb 1 04:35:35 localhost nova_compute[274651]: AMD Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 
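For reference, the XML summarized above is the libvirt domain-capabilities document that nova's _get_domain_capabilities (host.py:1037 in this log) fetches and logs. A minimal sketch of retrieving and parsing the same document with the python-libvirt binding follows; the qemu:///system URI is an assumption, not taken from this log, while the emulator path, arch, and machine type are the ones logged below:

    import libvirt
    import xml.etree.ElementTree as ET

    conn = libvirt.open('qemu:///system')  # assumed URI; nova holds its own connection
    # Ask libvirt for the domain capabilities of a specific emulator/arch/machine/virttype;
    # this returns the <domainCapabilities> XML whose bare values appear in the log above.
    caps_xml = conn.getDomainCapabilities(
        '/usr/libexec/qemu-kvm', 'x86_64', 'pc', 'kvm', 0)
    root = ET.fromstring(caps_xml)
    # The recovered CPU model names live under <cpu><mode name='custom'><model>.
    models = [m.text for m in root.findall("./cpu/mode[@name='custom']/model")]
    print(sorted(models))
    conn.close()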
Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.092 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 1 04:35:35 localhost nova_compute[274651]: [domain capabilities XML; element markup was lost in this capture. Recoverable entries:] path /usr/libexec/qemu-kvm; domain kvm; machine pc-i440fx-rhel7.6.0; arch x86_64; loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types: rom, pflash; yes, no; no); on, off; on, off; host CPU model EPYC-Rome, vendor AMD
Feb 1 04:35:35 localhost nova_compute[274651]: [Recoverable CPU model entries; the dump continues beyond this excerpt:] 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, ...
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Cooperlake-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Denverton Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Denverton-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Denverton-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Denverton-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Dhyana Feb 1 04:35:35 localhost nova_compute[274651]: Dhyana-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Dhyana-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Genoa Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Genoa-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Genoa-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-IBPB Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Milan Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Milan-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Milan-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Milan-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome-v4 Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Rome-v5 Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Turin Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-Turin-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: EPYC-v1 Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-v2 Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: EPYC-v5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: GraniteRapids-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-noTSX Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-noTSX-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Haswell-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-noTSX Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v3 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v6 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: 
Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Icelake-Server-v7 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: IvyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost 
nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: KnightsMill-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Nehalem-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G1-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G2-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G3-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G4-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Opteron_G5-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Penryn Feb 1 04:35:35 localhost nova_compute[274651]: Penryn-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-IBRS Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v1 Feb 1 04:35:35 localhost nova_compute[274651]: SandyBridge-v2 Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 
localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v1 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: SapphireRapids-v2 Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 04:35:35 localhost nova_compute[274651]: Feb 1 
04:35:35 localhost nova_compute[274651]: [libvirt domainCapabilities XML; element markup lost in this capture, only enum values survive. Recoverable values — CPU models: SapphireRapids-v3, SapphireRapids-v4, SierraForest, SierraForest-v1, SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1; memory backing: file, anonymous, memfd; disk device types: disk, cdrom, floppy, lun; disk buses: ide, fdc, scsi, virtio, usb, sata; disk models: virtio, virtio-transitional, virtio-non-transitional; graphics: vnc, egl-headless, dbus; hostdev mode: subsystem; hostdev startupPolicy: default, mandatory, requisite, optional; hostdev subsystem types: usb, pci, scsi; rng models: virtio, virtio-transitional, virtio-non-transitional; rng backends: random, egd, builtin; filesystem drivers: path, handle, virtiofs; TPM models: tpm-tis, tpm-crb; TPM backends: emulator, external; TPM backend version: 2.0; redirdev bus: usb; channel types: pty, unix; crypto: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv; serial/console types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; Hyper-V features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; remaining values: 4095, on, off, off, Linux KVM Hv] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.158 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.159 274655 INFO nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Secure Boot support detected#033[00m Feb 1 04:35:35
localhost nova_compute[274651]: 2026-02-01 09:35:35.161 274655 INFO nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.177 274655 DEBUG nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.213 274655 INFO nova.virt.node [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Determined node identity a04bda90-8ccd-4104-8518-038544ff1327 from /var/lib/nova/compute_id#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.236 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Verified node a04bda90-8ccd-4104-8518-038544ff1327 matches my host np0005604212.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.299 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.305 274655 DEBUG nova.virt.libvirt.vif [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T08:24:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005604212.localdomain',hostname='test',id=2,image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-01T08:24:22Z,launched_on='np0005604212.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005604212.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='79df39cba1c14309b68e8b61518619fd',ramdisk_id='',reservation_id='r-pgkx81ko',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-02-01T08:24:22Z,user_data=None,user_id='7567a560936c417c92d242d856b00bb3',uuid=08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, 
"meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.305 274655 DEBUG nova.network.os_vif_util [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Converting VIF {"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.306 274655 DEBUG nova.network.os_vif_util [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.308 274655 DEBUG os_vif [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.354 274655 DEBUG ovsdbapp.backend.ovs_idl [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.354 274655 
DEBUG ovsdbapp.backend.ovs_idl [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.354 274655 DEBUG ovsdbapp.backend.ovs_idl [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.355 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.355 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.356 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.356 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.357 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.361 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.377 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.377 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.377 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:35:35 localhost nova_compute[274651]: 2026-02-01 09:35:35.378 274655 INFO oslo.privsep.daemon [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp2c5hl46k/privsep.sock']#033[00m Feb 1 04:35:36 localhost python3.9[274887]: ansible-containers.podman.podman_container 
Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.028 274655 INFO oslo.privsep.daemon [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:35.920 274888 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:35.925 274888 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:35.928 274888 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:35.929 274888 INFO oslo.privsep.daemon [-] privsep daemon running as pid 
274888#033[00m Feb 1 04:35:36 localhost systemd[1]: Started libpod-conmon-119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5.scope. Feb 1 04:35:36 localhost systemd[1]: Started libcrun container. Feb 1 04:35:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a85326dc68a867f6078cb5530dc5fa0fc91ff2655c36190361d9b22973148e1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a85326dc68a867f6078cb5530dc5fa0fc91ff2655c36190361d9b22973148e1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a85326dc68a867f6078cb5530dc5fa0fc91ff2655c36190361d9b22973148e1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 1 04:35:36 localhost podman[274915]: 2026-02-01 09:35:36.252810832 +0000 UTC m=+0.119510733 container init 119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}) Feb 1 04:35:36 localhost systemd[1]: tmp-crun.K5IwQU.mount: Deactivated successfully. 
Feb 1 04:35:36 localhost podman[274915]: 2026-02-01 09:35:36.266704605 +0000 UTC m=+0.133404536 container start 119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:35:36 localhost python3.9[274887]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Applying nova statedir ownership Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 already 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/console.log Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Feb 1 
04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/e54cbbfc71830153054a3aecdf4a80059e3e0e5d Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-e54cbbfc71830153054a3aecdf4a80059e3e0e5d Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9 Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/d301d14069645d8c23fee2987984776b3e88a570e1aa96d6cf3e31fa880385fd Feb 1 04:35:36 localhost nova_compute_init[274936]: INFO:nova_statedir:Nova statedir ownership complete Feb 1 04:35:36 localhost 
systemd[1]: libpod-119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5.scope: Deactivated successfully. Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.332 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.333 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09cac1be-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.333 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09cac1be-46, col_values=(('external_ids', {'iface-id': '09cac1be-46e2-4a31-8306-e6f4f0401b19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:11:63', 'vm-uuid': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.333 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.334 274655 INFO os_vif [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46')#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.334 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.341 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Current state is 1, state in DB is 1. 
_init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.342 274655 INFO nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Feb 1 04:35:36 localhost podman[274937]: 2026-02-01 09:35:36.349865223 +0000 UTC m=+0.067694029 container died 119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init) Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.425 274655 DEBUG oslo_concurrency.lockutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.425 274655 DEBUG oslo_concurrency.lockutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.425 274655 DEBUG oslo_concurrency.lockutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.426 274655 DEBUG nova.compute.resource_tracker [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.427 274655 DEBUG oslo_concurrency.processutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 
04:35:36 localhost podman[274948]: 2026-02-01 09:35:36.439256441 +0000 UTC m=+0.121805565 container cleanup 119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:35:36 localhost systemd[1]: libpod-conmon-119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5.scope: Deactivated successfully. Feb 1 04:35:36 localhost nova_compute[274651]: 2026-02-01 09:35:36.885 274655 DEBUG oslo_concurrency.processutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.121 274655 DEBUG nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.122 274655 DEBUG nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:35:37 localhost systemd[1]: var-lib-containers-storage-overlay-2a85326dc68a867f6078cb5530dc5fa0fc91ff2655c36190361d9b22973148e1-merged.mount: Deactivated successfully. Feb 1 04:35:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-119fb3d740876b4bc8730560f8d5e57ff04b2bf1ad6f0ec6b1ab6339a69b57d5-userdata-shm.mount: Deactivated successfully. Feb 1 04:35:37 localhost systemd[1]: session-59.scope: Deactivated successfully. Feb 1 04:35:37 localhost systemd[1]: session-59.scope: Consumed 1min 21.956s CPU time. Feb 1 04:35:37 localhost systemd-logind[759]: Session 59 logged out. Waiting for processes to exit. Feb 1 04:35:37 localhost systemd-logind[759]: Removed session 59. Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.376 274655 WARNING nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.378 274655 DEBUG nova.compute.resource_tracker [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11925MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.378 274655 DEBUG oslo_concurrency.lockutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.379 274655 DEBUG oslo_concurrency.lockutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.555 274655 DEBUG nova.compute.resource_tracker [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.556 274655 DEBUG nova.compute.resource_tracker [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.556 274655 DEBUG nova.compute.resource_tracker [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.580 274655 DEBUG nova.scheduler.client.report [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 1 04:35:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45220 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D07CCA20000000001030307) Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.660 274655 DEBUG nova.scheduler.client.report [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.660 274655 DEBUG nova.compute.provider_tree [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.676 274655 DEBUG nova.scheduler.client.report [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.708 274655 DEBUG nova.scheduler.client.report [None 
req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 1 04:35:37 localhost nova_compute[274651]: 2026-02-01 09:35:37.770 274655 DEBUG oslo_concurrency.processutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.194 274655 DEBUG oslo_concurrency.processutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.201 274655 DEBUG nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Feb 1 04:35:38 localhost nova_compute[274651]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.201 274655 INFO nova.virt.libvirt.host [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] kernel doesn't support AMD SEV#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.203 274655 DEBUG nova.compute.provider_tree [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.204 274655 DEBUG nova.virt.libvirt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' 
_get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.228 274655 DEBUG nova.scheduler.client.report [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.269 274655 DEBUG nova.compute.resource_tracker [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.269 274655 DEBUG oslo_concurrency.lockutils [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.270 274655 DEBUG nova.service [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.297 274655 DEBUG nova.service [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Feb 1 04:35:38 localhost nova_compute[274651]: 2026-02-01 09:35:38.298 274655 DEBUG nova.servicegroup.drivers.db [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] DB_Driver: join new ServiceGroup member np0005604212.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Feb 1 04:35:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:35:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45221 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D07D0B80000000001030307) Feb 1 04:35:38 localhost podman[275036]: 2026-02-01 09:35:38.7116757 +0000 UTC m=+0.072221497 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:35:38 localhost podman[275036]: 2026-02-01 09:35:38.721394715 +0000 UTC m=+0.081940572 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 1 04:35:38 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.300 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.337 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Triggering sync for uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.337 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.338 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.338 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.380 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:35:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29097 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D07D3B80000000001030307) Feb 1 04:35:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
Feb 1 04:35:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:35:39 localhost nova_compute[274651]: 2026-02-01 09:35:39.690 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:39 localhost podman[275054]: 2026-02-01 09:35:39.725167879 +0000 UTC m=+0.081781857 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:35:39 localhost podman[275054]: 2026-02-01 09:35:39.736293558 +0000 UTC m=+0.092907526 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:35:39 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:35:39 localhost podman[275055]: 2026-02-01 09:35:39.813254897 +0000 UTC m=+0.164131980 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:35:39 localhost podman[275055]: 2026-02-01 09:35:39.856409879 +0000 UTC m=+0.207286992 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 1 04:35:39 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:35:40 localhost nova_compute[274651]: 2026-02-01 09:35:40.360 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45222 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D07D8B80000000001030307) Feb 1 04:35:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10565 DF PROTO=TCP SPT=33990 DPT=9102 SEQ=3149604942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D07DBB80000000001030307) Feb 1 04:35:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:35:41.697 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:35:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:35:41.698 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:35:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:35:41.698 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:35:44 localhost nova_compute[274651]: 2026-02-01 09:35:44.729 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45223 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D07E8780000000001030307) Feb 1 04:35:45 localhost nova_compute[274651]: 2026-02-01 09:35:45.361 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:35:49 localhost podman[275102]: 2026-02-01 09:35:49.699600733 +0000 UTC m=+0.063830671 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, release=1769056855, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc.) 
Feb 1 04:35:49 localhost podman[275102]: 2026-02-01 09:35:49.739361011 +0000 UTC m=+0.103590969 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 04:35:49 localhost nova_compute[274651]: 2026-02-01 09:35:49.768 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:49 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:35:50 localhost nova_compute[274651]: 2026-02-01 09:35:50.364 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:35:51 localhost podman[275122]: 2026-02-01 09:35:51.720116241 +0000 UTC m=+0.083458441 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Feb 1 04:35:51 localhost podman[275122]: 2026-02-01 09:35:51.733466548 +0000 UTC m=+0.096808778 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:35:51 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:35:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45224 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0809BF0000000001030307) Feb 1 04:35:53 localhost podman[236886]: time="2026-02-01T09:35:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:35:53 localhost podman[236886]: @ - - [01/Feb/2026:09:35:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:35:54 localhost podman[236886]: @ - - [01/Feb/2026:09:35:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17285 "" "Go-http-client/1.1" Feb 1 04:35:54 localhost nova_compute[274651]: 2026-02-01 09:35:54.646 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:35:54.648 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:35:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:35:54.650 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:35:54 localhost nova_compute[274651]: 2026-02-01 09:35:54.772 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:55 localhost nova_compute[274651]: 2026-02-01 09:35:55.393 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:35:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:35:55.652 158365 
DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:35:59 localhost nova_compute[274651]: 2026-02-01 09:35:59.774 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:00 localhost nova_compute[274651]: 2026-02-01 09:36:00.395 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:01 localhost openstack_network_exporter[239441]: ERROR 09:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:36:01 localhost openstack_network_exporter[239441]: Feb 1 04:36:01 localhost openstack_network_exporter[239441]: ERROR 09:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:36:01 localhost openstack_network_exporter[239441]: Feb 1 04:36:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:36:02 localhost podman[275141]: 2026-02-01 09:36:02.719101418 +0000 UTC m=+0.079918033 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:36:02 localhost podman[275141]: 2026-02-01 09:36:02.729357791 +0000 UTC m=+0.090174446 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:36:02 localhost systemd[1]: 
06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:36:03 localhost sshd[275163]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:36:04 localhost nova_compute[274651]: 2026-02-01 09:36:04.811 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:05 localhost nova_compute[274651]: 2026-02-01 09:36:05.398 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60692 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0841D10000000001030307) Feb 1 04:36:08 localhost sshd[275165]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:36:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60693 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0845F80000000001030307) Feb 1 04:36:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:36:09 localhost podman[275167]: 2026-02-01 09:36:09.18439244 +0000 UTC m=+0.071166826 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:36:09 localhost podman[275167]: 2026-02-01 09:36:09.221297827 +0000 UTC m=+0.108072203 
container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:36:09 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:36:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45225 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0849B80000000001030307) Feb 1 04:36:09 localhost nova_compute[274651]: 2026-02-01 09:36:09.854 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:10 localhost nova_compute[274651]: 2026-02-01 09:36:10.399 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:36:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:36:10 localhost podman[275185]: 2026-02-01 09:36:10.72147991 +0000 UTC m=+0.081632385 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:36:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60694 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D084DF80000000001030307) Feb 1 04:36:10 localhost podman[275185]: 2026-02-01 09:36:10.754448847 +0000 UTC m=+0.114601322 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:36:10 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:36:10 localhost podman[275186]: 2026-02-01 09:36:10.82460194 +0000 UTC m=+0.179955489 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:36:10 localhost podman[275186]: 2026-02-01 09:36:10.884883923 +0000 UTC m=+0.240237482 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:36:10 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
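
Analysis note: each health probe shows up as the same record cycle — systemd starts a transient "/usr/bin/podman healthcheck run <id>" unit, podman logs a "container health_status" record (health_status=healthy here) and a "container exec_died" record, and the <id>.service deactivates. The sketch below tallies probe outcomes per container from such records; reading a journal export from stdin is a hypothetical usage, not something shown in this log.

    import re
    import sys
    from collections import Counter

    # Tally podman health-probe outcomes per container from "container
    # health_status" records like the ones above. Hypothetical usage:
    #   journalctl -o cat | python3 tally_health.py
    counts = Counter()
    for line in sys.stdin:
        if 'container health_status' not in line:
            continue
        status = re.search(r'health_status=(\w+)', line)
        name = re.search(r'container_name=([\w.-]+)', line)
        if status and name:
            counts[(name.group(1), status.group(1))] += 1

    for (name, status), n in sorted(counts.items()):
        print(f'{name}: {status} x{n}')
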
Feb 1 04:36:10 localhost nova_compute[274651]: 2026-02-01 09:36:10.932 274655 DEBUG nova.compute.manager [None req-beb318c7-49e0-4c41-a306-323b6fd10f86 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:10 localhost nova_compute[274651]: 2026-02-01 09:36:10.937 274655 INFO nova.compute.manager [None req-beb318c7-49e0-4c41-a306-323b6fd10f86 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Retrieving diagnostics#033[00m Feb 1 04:36:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29098 DF PROTO=TCP SPT=46240 DPT=9102 SEQ=3484246408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0851B80000000001030307) Feb 1 04:36:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60695 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D085DB80000000001030307) Feb 1 04:36:14 localhost nova_compute[274651]: 2026-02-01 09:36:14.896 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:15 localhost nova_compute[274651]: 2026-02-01 09:36:15.402 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:16 localhost nova_compute[274651]: 2026-02-01 09:36:16.924 274655 DEBUG oslo_concurrency.lockutils [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:16 localhost nova_compute[274651]: 2026-02-01 09:36:16.925 274655 DEBUG oslo_concurrency.lockutils [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:16 localhost nova_compute[274651]: 2026-02-01 09:36:16.926 274655 DEBUG nova.compute.manager [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:16 localhost nova_compute[274651]: 2026-02-01 09:36:16.931 274655 DEBUG nova.compute.manager [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Feb 1 04:36:16 localhost nova_compute[274651]: 2026-02-01 09:36:16.936 274655 DEBUG nova.objects.instance [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'flavor' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:16 localhost nova_compute[274651]: 2026-02-01 09:36:16.990 274655 DEBUG nova.virt.libvirt.driver [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Feb 1 04:36:19 localhost kernel: device tap09cac1be-46 left promiscuous mode Feb 1 04:36:19 localhost NetworkManager[5964]: [1769938579.4850] device (tap09cac1be-46): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.539 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.542 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00051|binding|INFO|Releasing lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 from this chassis (sb_readonly=0) Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00052|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 down in Southbound Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00053|binding|INFO|Removing iface tap09cac1be-46 ovn-installed in OVS Feb 1 04:36:19 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Feb 1 04:36:19 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 57.566s CPU time. 
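
Analysis note: the Acquiring/acquired/released DEBUG triple around do_stop_instance is emitted by oslo.concurrency's lock wrapper (the "inner" function at lockutils.py:404/409/423 referenced in the records). Below is a schematic reproduction of that pattern, assuming the instance UUID from this log; the function body is a placeholder, not nova's implementation.

    from oslo_concurrency import lockutils

    # Schematic of the locking pattern behind the DEBUG lines above.
    INSTANCE_UUID = '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02'

    @lockutils.synchronized(INSTANCE_UUID)
    def do_stop_instance():
        # power off the guest, then persist the new vm_state/power_state
        pass

    # With oslo logging configured at DEBUG, calling this emits the same
    # "Acquiring lock ..." / "... acquired ..." / "... released ..." triple.
    do_stop_instance()
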
Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.553 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:11:63 192.168.0.12'], port_security=['fa:16:3e:86:11:63 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005604212.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '79df39cba1c14309b68e8b61518619fd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0b065334-69c4-4862-ab2c-0676d50a1918 0dc57611-620a-4a91-b761-dd2b6dc1d570', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5329260b-b0db-417b-bda6-9045427ce15d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09cac1be-46e2-4a31-8306-e6f4f0401b19) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.553 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost systemd-machined[83507]: Machine qemu-1-instance-00000002 terminated. 
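
Analysis note: the "Matched UPDATE: PortBindingUpdatedEvent(...)" records come from ovsdbapp's row-event dispatch (the event.py:43 match logging): the agent registers an event class for 'update' on the Port_Binding table, and ovsdbapp logs every row that matches. A schematic of such a class follows; base-class location and hook names vary a bit across ovsdbapp releases, and run() is a hypothetical stand-in for the metadata agent's real handler.

    from ovsdbapp.backend.ovs_idl import event as row_event

    class PortBindingUpdatedEvent(row_event.RowEvent):
        def __init__(self):
            # events=('update',), table='Port_Binding', conditions=None — as logged
            super().__init__((self.ROW_UPDATE,), 'Port_Binding', None)

        def run(self, event, row, old):
            # Fires when a Port_Binding row changes, e.g. chassis/up flapping.
            print(f'port {row.logical_port} updated')
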
Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.555 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 09cac1be-46e2-4a31-8306-e6f4f0401b19 in datapath 8bdf8183-8467-40ac-933d-a37b0bd3539a unbound from our chassis#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.557 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6aaab6a9-3538-4fc9-b08e-a42b74cabd90 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.557 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bdf8183-8467-40ac-933d-a37b0bd3539a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.559 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[795f011b-16b1-4a85-b7cd-4abacade3c2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.560 158365 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a namespace which is not needed anymore#033[00m Feb 1 04:36:19 localhost kernel: device tap09cac1be-46 entered promiscuous mode Feb 1 04:36:19 localhost NetworkManager[5964]: [1769938579.7104] manager: (tap09cac1be-46): new Tun device (/org/freedesktop/NetworkManager/Devices/16) Feb 1 04:36:19 localhost systemd-udevd[275235]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00054|binding|INFO|Claiming lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 for this chassis. 
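
Analysis note: the metadata namespace follows the ovnmeta-<network-uuid> convention visible above ("Cleaning up ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a namespace which is not needed anymore"). A one-off existence check, assuming pyroute2 (the same netlink library neutron's privsep helpers use); illustrative only, and it must run on the compute host as root.

    from pyroute2 import netns

    NETWORK_ID = '8bdf8183-8467-40ac-933d-a37b0bd3539a'
    ns_name = f'ovnmeta-{NETWORK_ID}'
    print(ns_name, 'present' if ns_name in netns.listnetns() else 'removed')
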
Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00055|binding|INFO|09cac1be-46e2-4a31-8306-e6f4f0401b19: Claiming fa:16:3e:86:11:63 192.168.0.12 Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.712 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost kernel: device tap09cac1be-46 left promiscuous mode Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.726 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:11:63 192.168.0.12'], port_security=['fa:16:3e:86:11:63 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005604212.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '79df39cba1c14309b68e8b61518619fd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0b065334-69c4-4862-ab2c-0676d50a1918 0dc57611-620a-4a91-b761-dd2b6dc1d570', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5329260b-b0db-417b-bda6-9045427ce15d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09cac1be-46e2-4a31-8306-e6f4f0401b19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00056|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 ovn-installed in OVS Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00057|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 up in Southbound Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00058|binding|INFO|Releasing lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 from this chassis (sb_readonly=1) Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.732 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00059|if_status|INFO|Not setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 down as sb is readonly Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00060|binding|INFO|Removing iface tap09cac1be-46 ovn-installed in OVS Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.740 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00061|binding|INFO|Releasing lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 from this chassis (sb_readonly=0) Feb 1 04:36:19 localhost ovn_controller[152492]: 2026-02-01T09:36:19Z|00062|binding|INFO|Setting 
lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 down in Southbound Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.744 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.757 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:11:63 192.168.0.12'], port_security=['fa:16:3e:86:11:63 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005604212.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '79df39cba1c14309b68e8b61518619fd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0b065334-69c4-4862-ab2c-0676d50a1918 0dc57611-620a-4a91-b761-dd2b6dc1d570', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5329260b-b0db-417b-bda6-9045427ce15d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09cac1be-46e2-4a31-8306-e6f4f0401b19) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:36:19 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [NOTICE] (259900) : haproxy version is 2.8.14-c23fe91 Feb 1 04:36:19 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [NOTICE] (259900) : path to executable is /usr/sbin/haproxy Feb 1 04:36:19 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [WARNING] (259900) : Exiting Master process... Feb 1 04:36:19 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [ALERT] (259900) : Current worker (259902) exited with code 143 (Terminated) Feb 1 04:36:19 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[259896]: [WARNING] (259900) : All workers exited. Exiting... (0) Feb 1 04:36:19 localhost systemd[1]: libpod-d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16.scope: Deactivated successfully. Feb 1 04:36:19 localhost podman[275255]: 2026-02-01 09:36:19.772728277 +0000 UTC m=+0.091561027 container died d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:36:19 localhost systemd[1]: tmp-crun.jFq3Dt.mount: Deactivated successfully. 
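
Analysis note: within this single second the lport is released and set down (seq 00051-00053), re-claimed as the replacement tap appears (00054-00057), caught in a read-only southbound window (00058-00059, hence "Not setting lport ... down as sb is readonly"), and finally released again (00060-00062). The sketch below reconstructs such a timeline from the |binding|INFO| records; the two sample lines are copied from this log.

    import re

    BINDING = re.compile(r'\|(?P<seq>\d+)\|binding\|INFO\|(?P<msg>.*)')
    LPORT = '09cac1be-46e2-4a31-8306-e6f4f0401b19'

    sample = [
        '2026-02-01T09:36:19Z|00054|binding|INFO|Claiming lport '
        + LPORT + ' for this chassis.',
        '2026-02-01T09:36:19Z|00057|binding|INFO|Setting lport '
        + LPORT + ' up in Southbound',
    ]
    for line in sample:
        m = BINDING.search(line)
        if m and LPORT in m.group('msg'):
            print(m.group('seq'), m.group('msg'))
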
Feb 1 04:36:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:36:19 localhost podman[275255]: 2026-02-01 09:36:19.815276678 +0000 UTC m=+0.134109398 container cleanup d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:36:19 localhost podman[275269]: 2026-02-01 09:36:19.851940358 +0000 UTC m=+0.074468476 container cleanup d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:36:19 localhost systemd[1]: libpod-conmon-d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16.scope: Deactivated successfully. Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.898 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost podman[275281]: 2026-02-01 09:36:19.902425191 +0000 UTC m=+0.094042095 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, architecture=x86_64) Feb 1 04:36:19 localhost podman[275281]: 2026-02-01 09:36:19.915828309 +0000 UTC m=+0.107445163 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.924 274655 DEBUG nova.compute.manager [req-83b737e8-8fb0-417f-bbbd-5edf2e698d07 req-60080c80-1163-4a51-9934-599a4e0d4ad6 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-unplugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.925 274655 DEBUG oslo_concurrency.lockutils [req-83b737e8-8fb0-417f-bbbd-5edf2e698d07 req-60080c80-1163-4a51-9934-599a4e0d4ad6 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.926 274655 DEBUG oslo_concurrency.lockutils [req-83b737e8-8fb0-417f-bbbd-5edf2e698d07 req-60080c80-1163-4a51-9934-599a4e0d4ad6 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.926 274655 DEBUG oslo_concurrency.lockutils [req-83b737e8-8fb0-417f-bbbd-5edf2e698d07 
req-60080c80-1163-4a51-9934-599a4e0d4ad6 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.926 274655 DEBUG nova.compute.manager [req-83b737e8-8fb0-417f-bbbd-5edf2e698d07 req-60080c80-1163-4a51-9934-599a4e0d4ad6 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-unplugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.926 274655 WARNING nova.compute.manager [req-83b737e8-8fb0-417f-bbbd-5edf2e698d07 req-60080c80-1163-4a51-9934-599a4e0d4ad6 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-unplugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state active and task_state powering-off.#033[00m Feb 1 04:36:19 localhost podman[275292]: 2026-02-01 09:36:19.952591784 +0000 UTC m=+0.117068218 container remove d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.956 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[b7547ee1-7d30-4c8f-8f39-686927355546]: (4, ('Sun Feb 1 09:36:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a (d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16)\nd9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16\nSun Feb 1 09:36:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a (d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16)\nd9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.957 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[73c11b6a-3bca-4f49-bfa3-2bb82cc01892]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.959 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bdf8183-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.960 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost kernel: device tap8bdf8183-80 left promiscuous mode Feb 1 04:36:19 localhost nova_compute[274651]: 2026-02-01 09:36:19.968 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.971 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[1256eb1d-e6e3-4326-b0e4-5b4aa3aaae6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:19 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.986 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe330e5-7bde-47ea-a23d-590357c84d44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.988 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[60606397-405d-459c-b28d-6692b2515209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:19.999 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[01af029f-04b4-41cd-94fb-5cd6cc25bcb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 
0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 635597, 'reachable_time': 40653, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275328, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.004 158757 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.005 158757 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7596c6-6a2b-4fc9-b66c-60b47e2da617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.005 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 09cac1be-46e2-4a31-8306-e6f4f0401b19 in datapath 8bdf8183-8467-40ac-933d-a37b0bd3539a unbound from our chassis#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.006 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6aaab6a9-3538-4fc9-b08e-a42b74cabd90 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.007 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bdf8183-8467-40ac-933d-a37b0bd3539a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.007 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[544a0aea-e9f5-454d-8eee-8961b761eade]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.008 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 09cac1be-46e2-4a31-8306-e6f4f0401b19 in datapath 8bdf8183-8467-40ac-933d-a37b0bd3539a unbound from our chassis#033[00m Feb 1 04:36:20 localhost nova_compute[274651]: 2026-02-01 09:36:20.008 274655 INFO nova.virt.libvirt.driver [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Instance shutdown successfully after 3 seconds.#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.008 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6aaab6a9-3538-4fc9-b08e-a42b74cabd90 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.009 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bdf8183-8467-40ac-933d-a37b0bd3539a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:36:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:20.009 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[0faf59e3-a2c7-4a1e-a04f-655def478306]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:20 localhost nova_compute[274651]: 2026-02-01 09:36:20.012 274655 INFO nova.virt.libvirt.driver [-] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Instance destroyed successfully.#033[00m Feb 1 04:36:20 localhost nova_compute[274651]: 2026-02-01 09:36:20.012 274655 DEBUG nova.objects.instance [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 
'numa_topology' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:20 localhost nova_compute[274651]: 2026-02-01 09:36:20.037 274655 DEBUG nova.compute.manager [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:20 localhost nova_compute[274651]: 2026-02-01 09:36:20.128 274655 DEBUG oslo_concurrency.lockutils [None req-8e42a990-e04f-4fe5-863b-7e084a86e641 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" "released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:20 localhost nova_compute[274651]: 2026-02-01 09:36:20.403 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:20 localhost systemd[1]: var-lib-containers-storage-overlay-33b86199362622eac4e2ad21e2d60cd14ef6b064ed2d582ad18255c0f15817dc-merged.mount: Deactivated successfully. Feb 1 04:36:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9e3a1d98668c88eab11a1d762e6f0298fbe59e4376c4c1d8880b98eb7acac16-userdata-shm.mount: Deactivated successfully. Feb 1 04:36:20 localhost systemd[1]: run-netns-ovnmeta\x2d8bdf8183\x2d8467\x2d40ac\x2d933d\x2da37b0bd3539a.mount: Deactivated successfully. Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.974 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.975 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.975 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.976 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock 
"08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.976 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.976 274655 WARNING nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state stopped and task_state None.#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.976 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.977 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.977 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.977 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.978 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events 
found dispatching network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.978 274655 WARNING nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state stopped and task_state None.#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.978 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.979 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.979 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.979 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.980 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.980 274655 WARNING nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event 
network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state stopped and task_state None.#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.980 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-unplugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.980 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.981 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.981 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.981 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-unplugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.982 274655 WARNING nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-unplugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state stopped and task_state None.#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.982 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.982 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.982 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.983 274655 DEBUG oslo_concurrency.lockutils [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.983 274655 DEBUG nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:21 localhost nova_compute[274651]: 2026-02-01 09:36:21.983 274655 WARNING nova.compute.manager [req-26be2308-04d5-4e47-9ded-2a633592d6b7 req-0d23d37f-553a-4bfa-9838-953efadd30a0 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state stopped and task_state None.#033[00m Feb 1 04:36:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
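
Analysis note: req-26be2308 delivers a burst of network-vif-plugged/-unplugged events for the port after the instance has already reached vm_state stopped, and each one is logged as a "Received unexpected event" warning — harmless here, since the events simply arrived after the stop completed. The sketch below counts such warnings per event tag to gauge the churn; reading a journal export from stdin is a hypothetical usage.

    import re
    import sys
    from collections import Counter

    WARN = re.compile(
        r'Received unexpected event (\S+) for instance with vm_state (\S+)')

    counts = Counter()
    for line in sys.stdin:
        m = WARN.search(line)
        if m:
            counts[m.groups()] += 1

    for (event, state), n in counts.most_common():
        print(f'{event} (vm_state {state}): {n}')
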
Feb 1 04:36:22 localhost podman[275330]: 2026-02-01 09:36:22.731270778 +0000 UTC m=+0.091994800 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.736 274655 DEBUG nova.compute.manager [None req-77a430be-ef55-4401-b3b6-1adc58180bc2 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server [None req-77a430be-ef55-4401-b3b6-1adc58180bc2 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server raise self.value Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR 
oslo_messaging.rpc.server raise self.value Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Feb 1 04:36:22 localhost nova_compute[274651]: 2026-02-01 09:36:22.766 274655 ERROR oslo_messaging.rpc.server #033[00m Feb 1 04:36:22 localhost podman[275330]: 2026-02-01 09:36:22.772491008 +0000 UTC m=+0.133215080 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:36:22 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
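
[Annotation] The traceback above bottoms out in a simple state guard: get_instance_diagnostics (nova/compute/manager.py:6739 in this build) refuses to run unless the instance is powered on, and the RPC server re-raises InstanceInvalidState after recording the instance fault; the identical failure recurs at 09:36:42 below. A hedged reconstruction of that guard — nova's power-state code 4 is "shutdown", matching the "current DB power_state: 4" line later in this log; the class and function here are simplified stand-ins, not nova's exact signatures:

    RUNNING, SHUTDOWN = 1, 4  # nova power-state codes; the log shows state 4

    class InstanceInvalidState(Exception):
        pass

    def get_instance_diagnostics(instance):
        if instance["power_state"] != RUNNING:
            raise InstanceInvalidState(
                "Instance %s in power state shutdown. Cannot get_diagnostics "
                "while the instance is in this state." % instance["uuid"])
        return {}  # the real method would query the hypervisor here

    try:
        get_instance_diagnostics(
            {"uuid": "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02",
             "power_state": SHUTDOWN})
    except InstanceInvalidState as exc:
        print(exc)  # same message as the ERROR lines above
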
Feb 1 04:36:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60696 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D087DB80000000001030307) Feb 1 04:36:23 localhost podman[236886]: time="2026-02-01T09:36:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:36:23 localhost podman[236886]: @ - - [01/Feb/2026:09:36:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148627 "" "Go-http-client/1.1" Feb 1 04:36:24 localhost podman[236886]: @ - - [01/Feb/2026:09:36:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16803 "" "Go-http-client/1.1" Feb 1 04:36:24 localhost nova_compute[274651]: 2026-02-01 09:36:24.946 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:25 localhost nova_compute[274651]: 2026-02-01 09:36:25.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:29 localhost nova_compute[274651]: 2026-02-01 09:36:29.976 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:30 localhost nova_compute[274651]: 2026-02-01 09:36:30.406 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:31 localhost openstack_network_exporter[239441]: ERROR 09:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:36:31 localhost openstack_network_exporter[239441]: Feb 1 04:36:31 localhost openstack_network_exporter[239441]: ERROR 09:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:36:31 localhost openstack_network_exporter[239441]: Feb 1 04:36:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:36:33 localhost systemd[1]: tmp-crun.KtvZmT.mount: Deactivated successfully. 
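
[Annotation] The kernel "DROPPING:" records are firewall LOG-rule output on br-ex. Note the repetition: SEQ=39011501 from SPT=52078 appears at 04:36:22 and again at 04:36:39 below — the same TCP SYN from 192.168.122.10 to 192.168.122.106 port 9102, retransmitted and dropped each time, i.e. nothing is being allowed through to 9102. A small triage helper for pulling the key fields out of these lines — illustrative only, not part of any shipped tool:

    import re

    DROP_RE = re.compile(
        r"DROPPING:.*?SRC=(?P<src>\S+) DST=(?P<dst>\S+)"
        r".*?SPT=(?P<spt>\d+) DPT=(?P<dpt>\d+) SEQ=(?P<seq>\d+)")

    def parse_drop(line):
        m = DROP_RE.search(line)
        return m.groupdict() if m else None

    print(parse_drop(
        "DROPPING: IN=br-ex OUT= SRC=192.168.122.10 DST=192.168.122.106 "
        "LEN=60 TTL=62 PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0"))
    # {'src': '192.168.122.10', 'dst': '192.168.122.106',
    #  'spt': '52078', 'dpt': '9102', 'seq': '39011501'}
    # Identical src/spt/seq recurring across timestamps = SYN retransmission.
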
Feb 1 04:36:33 localhost podman[275349]: 2026-02-01 09:36:33.735451062 +0000 UTC m=+0.095085286 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:36:33 localhost podman[275349]: 2026-02-01 09:36:33.770530064 +0000 UTC m=+0.130164358 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:36:33 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.341 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.342 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.342 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.343 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.736 274655 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.738 274655 INFO nova.compute.manager [-] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] VM Stopped (Lifecycle Event)#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.760 274655 DEBUG nova.compute.manager [None req-22629ba7-440f-4559-af0f-752e4468cf6a - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:34 localhost nova_compute[274651]: 2026-02-01 09:36:34.765 274655 DEBUG nova.compute.manager [None req-22629ba7-440f-4559-af0f-752e4468cf6a - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 1 04:36:35 localhost nova_compute[274651]: 2026-02-01 09:36:35.016 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:35 localhost nova_compute[274651]: 2026-02-01 09:36:35.408 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:35 localhost nova_compute[274651]: 2026-02-01 09:36:35.760 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:36:35 localhost nova_compute[274651]: 2026-02-01 09:36:35.761 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:36:35 localhost nova_compute[274651]: 
2026-02-01 09:36:35.761 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:36:35 localhost nova_compute[274651]: 2026-02-01 09:36:35.762 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.170 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.197 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.197 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.198 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.199 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.199 274655 DEBUG oslo_service.periodic_task [None 
req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.200 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.201 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.201 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.202 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.202 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.219 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.220 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.220 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.220 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.220 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id 
openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.674 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.743 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.744 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.939 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.941 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12345MB free_disk=41.8370475769043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", 
"product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.941 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:36 localhost nova_compute[274651]: 2026-02-01 09:36:36.941 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.020 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.020 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.021 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.075 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.568 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.575 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.611 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': 
{'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:36:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1680 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08B7020000000001030307) Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.637 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:36:37 localhost nova_compute[274651]: 2026-02-01 09:36:37.638 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1681 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08BAF80000000001030307) Feb 1 04:36:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60697 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08BDB80000000001030307) Feb 1 04:36:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
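
[Annotation] The inventory the resource tracker just reported to placement is enough to sanity-check capacity by hand: placement's usable amount per resource class is (total - reserved) * allocation_ratio. Worked with the exact numbers above (plain arithmetic, no assumptions beyond that formula):

    inventory = {
        'VCPU':      {'total': 8,     'reserved': 0,   'allocation_ratio': 16.0},
        'MEMORY_MB': {'total': 15738, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 41,    'reserved': 1,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    # VCPU 128.0, MEMORY_MB 15226.0, DISK_GB 40.0 -- consistent with
    # "Total usable vcpus: 8, total allocated vcpus: 1" and used_ram=1024MB
    # (512MB reserved + the instance's MEMORY_MB: 512) reported above.
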
Feb 1 04:36:39 localhost podman[275502]: 2026-02-01 09:36:39.709102713 +0000 UTC m=+0.063391979 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0) Feb 1 04:36:39 localhost podman[275502]: 2026-02-01 09:36:39.714452936 +0000 UTC m=+0.068742272 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent) Feb 1 04:36:39 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:36:40 localhost nova_compute[274651]: 2026-02-01 09:36:40.058 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:40 localhost nova_compute[274651]: 2026-02-01 09:36:40.411 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1682 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08C2F80000000001030307) Feb 1 04:36:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:36:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:36:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:41.698 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:41.698 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:41.699 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:41 localhost podman[275521]: 2026-02-01 09:36:41.739209859 +0000 UTC m=+0.091051463 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:36:41 localhost podman[275521]: 2026-02-01 09:36:41.74842538 +0000 UTC m=+0.100266974 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:36:41 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:36:41 localhost podman[275522]: 2026-02-01 09:36:41.717762833 +0000 UTC m=+0.070346690 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127) Feb 1 04:36:41 localhost podman[275522]: 2026-02-01 09:36:41.79948002 +0000 UTC m=+0.152063857 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:36:41 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
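
[Annotation] Every container health check in this log is the same four-step systemd/podman sequence: "Started /usr/bin/podman healthcheck run <container-id>" (a transient unit), a podman health_status=healthy event, an exec_died event when the probe exits, then "<unit>: Deactivated successfully." The probe can be run by hand; exit status 0 corresponds to health_status=healthy. Sketch below assumes podman is installed and the container (here the ceilometer_agent_compute id from above) still exists:

    import subprocess

    cid = "2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691"
    rc = subprocess.run(["podman", "healthcheck", "run", cid]).returncode
    print("healthy" if rc == 0 else "unhealthy (exit %d)" % rc)
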
Feb 1 04:36:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45226 DF PROTO=TCP SPT=34540 DPT=9102 SEQ=439331414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08C7B80000000001030307) Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.034 274655 DEBUG nova.compute.manager [None req-26b0ecb7-28fd-466a-9e6d-b05db39f06d9 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server [None req-26b0ecb7-28fd-466a-9e6d-b05db39f06d9 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server raise self.value Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File 
"/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server self.force_reraise() Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server raise self.value Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 in power state shutdown. Cannot get_diagnostics while the instance is in this state. 
Feb 1 04:36:42 localhost nova_compute[274651]: 2026-02-01 09:36:42.070 274655 ERROR oslo_messaging.rpc.server #033[00m Feb 1 04:36:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1683 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08D2B90000000001030307) Feb 1 04:36:45 localhost nova_compute[274651]: 2026-02-01 09:36:45.100 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:45 localhost nova_compute[274651]: 2026-02-01 09:36:45.412 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:47 localhost nova_compute[274651]: 2026-02-01 09:36:47.814 274655 DEBUG nova.objects.instance [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'flavor' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:47 localhost nova_compute[274651]: 2026-02-01 09:36:47.844 274655 DEBUG oslo_concurrency.lockutils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:36:47 localhost nova_compute[274651]: 2026-02-01 09:36:47.845 274655 DEBUG oslo_concurrency.lockutils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:36:47 localhost nova_compute[274651]: 2026-02-01 09:36:47.845 274655 DEBUG nova.network.neutron [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Feb 1 04:36:47 localhost nova_compute[274651]: 2026-02-01 09:36:47.846 274655 DEBUG nova.objects.instance [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:48 localhost nova_compute[274651]: 2026-02-01 09:36:48.991 274655 DEBUG nova.network.neutron [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.034 274655 DEBUG oslo_concurrency.lockutils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.065 274655 INFO nova.virt.libvirt.driver [-] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Instance destroyed successfully.#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.066 274655 DEBUG nova.objects.instance [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'numa_topology' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.085 274655 DEBUG nova.objects.instance [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'resources' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.105 274655 DEBUG nova.virt.libvirt.vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T08:24:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005604212.localdomain',hostname='test',id=2,image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-01T08:24:22Z,launched_on='np0005604212.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005604212.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='79df39cba1c14309b68e8b61518619fd',ramdisk_id='',reservation_id='r-pgkx81ko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2026-02-01T09:36:20Z,user_data=None,user_id='7567a560936c417c92d242d856b00bb3',uuid=08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02,vcpu_model=,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.105 274655 DEBUG nova.network.os_vif_util [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Converting VIF {"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": 
"8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.107 274655 DEBUG nova.network.os_vif_util [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.107 274655 DEBUG os_vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.111 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.112 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09cac1be-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.157 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.159 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.163 274655 INFO os_vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Successfully 
unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46')#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.167 274655 DEBUG nova.virt.libvirt.host [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.167 274655 INFO nova.virt.libvirt.host [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] UEFI support detected#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.176 274655 DEBUG nova.virt.libvirt.driver [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Start _get_guest_xml network_info=[{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=9ad21908-e58f-4439-b6a2-d7c4bf075554,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'image_id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}], 'ephemerals': [{'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'encryption_options': None, 'device_name': '/dev/vdb', 'size': 1, 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk'}], 'block_device_mapping': [], 
'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.182 274655 WARNING nova.virt.libvirt.driver [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.185 274655 DEBUG nova.virt.libvirt.host [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Searching host: 'np0005604212.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.186 274655 DEBUG nova.virt.libvirt.host [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.187 274655 DEBUG nova.virt.libvirt.host [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Searching host: 'np0005604212.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.188 274655 DEBUG nova.virt.libvirt.host [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] CPU controller found on host. 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.189 274655 DEBUG nova.virt.libvirt.driver [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.190 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-01T08:23:15Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='371ff7cc-43c7-4354-b1ce-55c23740c8c8',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=9ad21908-e58f-4439-b6a2-d7c4bf075554,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.190 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.191 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.191 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.192 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.192 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.193 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default 
default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.193 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.193 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.194 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.194 274655 DEBUG nova.virt.hardware [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.195 274655 DEBUG nova.objects.instance [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'vcpu_model' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.216 274655 DEBUG nova.privsep.utils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.216 274655 DEBUG oslo_concurrency.processutils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.692 274655 DEBUG oslo_concurrency.processutils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:36:49 localhost nova_compute[274651]: 2026-02-01 09:36:49.695 274655 DEBUG oslo_concurrency.processutils [None 
req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:36:49 localhost ovn_controller[152492]: 2026-02-01T09:36:49Z|00063|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.102 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.125 274655 DEBUG oslo_concurrency.processutils [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.126 274655 DEBUG nova.virt.libvirt.vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T08:24:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005604212.localdomain',hostname='test',id=2,image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-01T08:24:22Z,launched_on='np0005604212.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005604212.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='79df39cba1c14309b68e8b61518619fd',ramdisk_id='',reservation_id='r-pgkx81ko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2026-02-01T09:36:20Z,user_data=None,user_id='7567a560936c417c92d242d856b00bb3',uuid=08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": 
"private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.127 274655 DEBUG nova.network.os_vif_util [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Converting VIF {"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.128 274655 DEBUG nova.network.os_vif_util [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.130 274655 DEBUG nova.objects.instance [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Lazy-loading 'pci_devices' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:36:50 localhost 
nova_compute[274651]: 2026-02-01 09:36:50.148 274655 DEBUG nova.virt.libvirt.driver [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] End _get_guest_xml xml= [multi-line libvirt domain XML elided: the element markup was lost in capture, leaving only syslog prefixes and bare text fragments; the surviving fragments identify uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02, name instance-00000002, display name 'test', timestamp 2026-02-01 09:36:49, memory 524288 KiB (512 MB), 1 vCPU, owner project/user admin/admin, sysinfo RDO OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9, os type hvm, and rng backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.149 274655 DEBUG nova.virt.libvirt.driver [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.150 274655 DEBUG nova.virt.libvirt.driver [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.151 274655 DEBUG nova.virt.libvirt.vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T08:24:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005604212.localdomain',hostname='test',id=2,image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-01T08:24:22Z,launched_on='np0005604212.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005604212.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='79df39cba1c14309b68e8b61518619fd',ramdisk_id='',reservation_id='r-pgkx81ko',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='9ad21908-e58f-4439-b6a2-d7c4bf075554',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2026-02-01T09:36:20Z,user_data=None,user_id='7567a560936c417c92d242d856b00bb3',uuid=08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.151 274655 DEBUG nova.network.os_vif_util [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Converting VIF {"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.152 274655 DEBUG nova.network.os_vif_util [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.152 274655 DEBUG os_vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.153 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.153 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, 
may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.153 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.158 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.158 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09cac1be-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.158 274655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap09cac1be-46, col_values=(('external_ids', {'iface-id': '09cac1be-46e2-4a31-8306-e6f4f0401b19', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:11:63', 'vm-uuid': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.160 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.162 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.167 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.169 274655 INFO os_vif [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:11:63,bridge_name='br-int',has_traffic_filtering=True,id=09cac1be-46e2-4a31-8306-e6f4f0401b19,network=Network(8bdf8183-8467-40ac-933d-a37b0bd3539a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap09cac1be-46')#033[00m Feb 1 04:36:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:36:50 localhost systemd[1]: Started libvirt secret daemon. Feb 1 04:36:50 localhost kernel: device tap09cac1be-46 entered promiscuous mode Feb 1 04:36:50 localhost ovn_controller[152492]: 2026-02-01T09:36:50Z|00064|binding|INFO|Claiming lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 for this chassis. 
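
The AddBridgeCommand/AddPortCommand/DbSetCommand entries above are the standard ovsdbapp transaction API at work. A minimal sketch of the same plug sequence, assuming a local OVS database at unix:/run/openvswitch/db.sock; port, MAC and UUID values are copied from the log purely for illustration, and this is not the os-vif implementation itself:

```python
# Sketch of the ovsdbapp transaction pattern logged above ("Running txn
# n=1"); assumes ovsdbapp is installed and the OVSDB socket is reachable.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

idl = connection.OvsdbIdl.from_server(
    'unix:/run/openvswitch/db.sock', 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

# One transaction, two commands: add the port to br-int idempotently
# (may_exist=True), then set the external_ids that ovn-controller
# matches to claim the logical port (iface-id is the Neutron port UUID,
# attached-mac the port's MAC).
with api.transaction(check_error=True) as txn:
    txn.add(api.add_port('br-int', 'tap09cac1be-46', may_exist=True))
    txn.add(api.db_set(
        'Interface', 'tap09cac1be-46',
        ('external_ids', {
            'iface-id': '09cac1be-46e2-4a31-8306-e6f4f0401b19',
            'iface-status': 'active',
            'attached-mac': 'fa:16:3e:86:11:63',
            'vm-uuid': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02',
        })))
```

Once those external_ids land in OVSDB, ovn-controller claims the lport for the chassis, which is exactly what the binding|INFO lines above and below report.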
Feb 1 04:36:50 localhost ovn_controller[152492]: 2026-02-01T09:36:50Z|00065|binding|INFO|09cac1be-46e2-4a31-8306-e6f4f0401b19: Claiming fa:16:3e:86:11:63 192.168.0.12 Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.297 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost NetworkManager[5964]: [1769938610.3013] manager: (tap09cac1be-46): new Tun device (/org/freedesktop/NetworkManager/Devices/17) Feb 1 04:36:50 localhost ovn_controller[152492]: 2026-02-01T09:36:50Z|00066|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 ovn-installed in OVS Feb 1 04:36:50 localhost ovn_controller[152492]: 2026-02-01T09:36:50Z|00067|binding|INFO|Setting lport 09cac1be-46e2-4a31-8306-e6f4f0401b19 up in Southbound Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.304 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost systemd-udevd[275657]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:36:50 localhost systemd[1]: tmp-crun.a9flWr.mount: Deactivated successfully. Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.312 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:11:63 192.168.0.12'], port_security=['fa:16:3e:86:11:63 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '79df39cba1c14309b68e8b61518619fd', 'neutron:revision_number': '10', 'neutron:security_group_ids': '0b065334-69c4-4862-ab2c-0676d50a1918 0dc57611-620a-4a91-b761-dd2b6dc1d570', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5329260b-b0db-417b-bda6-9045427ce15d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=09cac1be-46e2-4a31-8306-e6f4f0401b19) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.314 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 09cac1be-46e2-4a31-8306-e6f4f0401b19 in datapath 8bdf8183-8467-40ac-933d-a37b0bd3539a bound to our chassis#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.316 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6aaab6a9-3538-4fc9-b08e-a42b74cabd90 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.316 158365 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for 
network 8bdf8183-8467-40ac-933d-a37b0bd3539a#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.329 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[b7336c7e-ad8d-4f73-8125-b54442ea2ab4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.330 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8bdf8183-81 in ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.333 158526 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8bdf8183-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.333 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[15dbb2f5-443d-4ab6-86d9-beb5e18de71e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost podman[275614]: 2026-02-01 09:36:50.334404976 +0000 UTC m=+0.102308137 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, release=1769056855, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.7, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.337 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[53d857db-f709-46a4-8513-46228a4f8323]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost NetworkManager[5964]: [1769938610.3409] device (tap09cac1be-46): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 1 04:36:50 localhost NetworkManager[5964]: [1769938610.3414] device (tap09cac1be-46): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.348 158757 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4f24d3-f34c-46b8-9fa0-125306d51fa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.352 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.363 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[8117877b-0a3c-4d24-901f-10c04c70d535]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost systemd-machined[83507]: New machine qemu-2-instance-00000002. 
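
The "Creating VETH tap8bdf8183-81 in ovnmeta-..." step above is carried out through neutron's privsep-wrapped pyroute2 helpers. A rough equivalent with plain pyroute2, as a sketch assuming root privileges; interface and namespace names are taken from the log, and the real agent additionally configures MACs, addresses, and sysctls:

```python
# Sketch of the veth-pair-into-namespace plumbing behind the metadata
# agent's provision_datapath step; simplified, no error handling.
from pyroute2 import IPRoute, netns

NS = 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a'
OUTER, INNER = 'tap8bdf8183-80', 'tap8bdf8183-81'

netns.create(NS)  # per-datapath metadata namespace
ipr = IPRoute()
# Create the pair: OUTER stays in the root namespace (and is plugged
# into br-int), INNER is moved into the metadata namespace where the
# metadata endpoint is served.
ipr.link('add', ifname=OUTER, kind='veth', peer=INNER)
ipr.link('set', index=ipr.link_lookup(ifname=INNER)[0], net_ns_fd=NS)
ipr.link('set', index=ipr.link_lookup(ifname=OUTER)[0], state='up')
ipr.close()
```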
Feb 1 04:36:50 localhost podman[275614]: 2026-02-01 09:36:50.374075458 +0000 UTC m=+0.141978619 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 04:36:50 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. Feb 1 04:36:50 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
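
The CPU-topology lines logged earlier by nova.virt.hardware (limits sockets=65536, cores=65536, threads=65536; "Build topologies for 1 vcpu(s) 1:1:1"; one resulting topology) come down to simple factorization: enumerate (sockets, cores, threads) triples whose product equals the vCPU count, subject to the limits. A simplified stdlib-only sketch; the real _get_possible_cpu_topologies additionally sorts by preferred topologies:

```python
# Reconstructs the arithmetic behind "Got 1 possible topologies" above:
# for 1 vCPU the only factorization is 1 socket x 1 core x 1 thread.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product is vcpus."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    yield (s, c, t)

print(list(possible_topologies(1)))  # [(1, 1, 1)], matching the log
print(list(possible_topologies(4)))  # (1, 1, 4), (1, 2, 2), (2, 2, 1), ...
```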
Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.391 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[6f887aa1-09d4-4b11-9dc1-2ba36529dfcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost NetworkManager[5964]: [1769938610.3979] manager: (tap8bdf8183-80): new Veth device (/org/freedesktop/NetworkManager/Devices/18) Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.397 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0153c4-dc59-471d-8afe-ef442d1e6b46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost systemd-udevd[275661]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.423 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[cd130207-d014-4a7d-8da8-2d1d038a5614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.426 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[a554f5e5-0d54-4d75-bda9-722bd79f131c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8bdf8183-81: link becomes ready Feb 1 04:36:50 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap8bdf8183-80: link becomes ready Feb 1 04:36:50 localhost NetworkManager[5964]: [1769938610.4462] device (tap8bdf8183-80): carrier: link connected Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.453 158695 DEBUG oslo.privsep.daemon [-] privsep: reply[3058a36a-72f9-433f-8a6f-6b0b70951d84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.472 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[efdd737a-2acb-4f8b-bb29-5d14f24cb36f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8bdf8183-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:29:7a:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 
'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1070459, 'reachable_time': 43309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 275698, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.488 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[97edd99f-da35-4cc7-a586-90327c97a4a8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:7aac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1070459, 'tstamp': 1070459}], ['IFA_FLAGS', 192]], 
'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 275700, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.504 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f9f6ce-9845-4f0e-8136-44ad9883c850]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8bdf8183-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:29:7a:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1070459, 'reachable_time': 43309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 
'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 275708, 'error': None, 'target': 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.507 274655 DEBUG nova.compute.manager [req-c351016d-64eb-4425-8241-5c4b7b5c0224 req-0f52398a-06d2-4f9a-b369-d76fd3bb53ea 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.508 274655 DEBUG oslo_concurrency.lockutils [req-c351016d-64eb-4425-8241-5c4b7b5c0224 req-0f52398a-06d2-4f9a-b369-d76fd3bb53ea 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.508 274655 DEBUG oslo_concurrency.lockutils [req-c351016d-64eb-4425-8241-5c4b7b5c0224 req-0f52398a-06d2-4f9a-b369-d76fd3bb53ea 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.509 274655 DEBUG oslo_concurrency.lockutils [req-c351016d-64eb-4425-8241-5c4b7b5c0224 req-0f52398a-06d2-4f9a-b369-d76fd3bb53ea 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:50 
localhost nova_compute[274651]: 2026-02-01 09:36:50.509 274655 DEBUG nova.compute.manager [req-c351016d-64eb-4425-8241-5c4b7b5c0224 req-0f52398a-06d2-4f9a-b369-d76fd3bb53ea 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.510 274655 WARNING nova.compute.manager [req-c351016d-64eb-4425-8241-5c4b7b5c0224 req-0f52398a-06d2-4f9a-b369-d76fd3bb53ea 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state stopped and task_state powering-on.#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.541 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ebff3e30-4cd0-46d7-a91a-62fb6fa887ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.616 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[cd394ef5-fdaf-406e-8890-f5536af02295]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.618 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bdf8183-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.619 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.619 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bdf8183-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:50 localhost kernel: device tap8bdf8183-80 entered promiscuous mode Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.623 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.628 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8bdf8183-80, col_values=(('external_ids', {'iface-id': 'a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:36:50 localhost ovn_controller[152492]: 2026-02-01T09:36:50Z|00068|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.629 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 
2026-02-01 09:36:50.639 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.640 158365 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.641 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[e3933cee-2318-48f0-a397-a29813e63efc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.642 158365 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: global Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: log /dev/log local0 debug Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: log-tag haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: user root Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: group root Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: maxconn 1024 Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: pidfile /var/lib/neutron/external/pids/8bdf8183-8467-40ac-933d-a37b0bd3539a.pid.haproxy Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: daemon Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: defaults Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: log global Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: mode http Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: option httplog Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: option dontlognull Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: option http-server-close Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: option forwardfor Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: retries 3 Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: timeout http-request 30s Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: timeout connect 30s Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: timeout client 32s Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: timeout server 32s Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: timeout http-keep-alive 30s Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: listen listener Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: bind 169.254.169.254:80 Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: server metadata /var/lib/neutron/metadata_proxy Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: http-request add-header X-OVN-Network-ID 8bdf8183-8467-40ac-933d-a37b0bd3539a Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Feb 1 04:36:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:36:50.643 158365 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'env', 
'PROCESS_TAG=haproxy-8bdf8183-8467-40ac-933d-a37b0bd3539a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8bdf8183-8467-40ac-933d-a37b0bd3539a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.761 274655 DEBUG nova.virt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Emitting event <LifecycleEvent: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.762 274655 INFO nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] VM Resumed (Lifecycle Event)#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.764 274655 DEBUG nova.compute.manager [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.768 274655 INFO nova.virt.libvirt.driver [-] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Instance rebooted successfully.#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.769 274655 DEBUG nova.compute.manager [None req-06f082be-321b-4990-bde4-49bcd7c58f1b 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.814 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.819 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.871 274655 DEBUG nova.virt.driver [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] Emitting event <LifecycleEvent: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.872 274655 INFO nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] VM Started (Lifecycle Event)#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.899 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:36:50 localhost nova_compute[274651]: 2026-02-01 09:36:50.903 274655 DEBUG nova.compute.manager [None req-8fffa1cc-198a-4a2c-967a-646f6da8bf1c -
- - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Feb 1 04:36:51 localhost podman[275776]: Feb 1 04:36:51 localhost podman[275776]: 2026-02-01 09:36:51.095398535 +0000 UTC m=+0.099761478 container create 2a35a32fc7d41f11e754cb2851b560448fe1f823bc6d43bc84e2d1878ae81f20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:36:51 localhost systemd[1]: Started libpod-conmon-2a35a32fc7d41f11e754cb2851b560448fe1f823bc6d43bc84e2d1878ae81f20.scope. Feb 1 04:36:51 localhost podman[275776]: 2026-02-01 09:36:51.046146411 +0000 UTC m=+0.050509344 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 1 04:36:51 localhost systemd[1]: Started libcrun container. Feb 1 04:36:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecabe2508f60874c7cc5c69c5ab91476bc85806073ecc720d2bd700bbec3f88e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:36:51 localhost podman[275776]: 2026-02-01 09:36:51.168412327 +0000 UTC m=+0.172775270 container init 2a35a32fc7d41f11e754cb2851b560448fe1f823bc6d43bc84e2d1878ae81f20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 04:36:51 localhost podman[275776]: 2026-02-01 09:36:51.176022629 +0000 UTC m=+0.180385562 container start 2a35a32fc7d41f11e754cb2851b560448fe1f823bc6d43bc84e2d1878ae81f20 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:36:51 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[275790]: [NOTICE] (275794) : New worker (275796) forked Feb 1 04:36:51 localhost neutron-haproxy-ovnmeta-8bdf8183-8467-40ac-933d-a37b0bd3539a[275790]: [NOTICE] (275794) : Loading success. 
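Taken together, the records above show the OVN metadata agent rebuilding the metadata datapath for network 8bdf8183-8467-40ac-933d-a37b0bd3539a: the tap port is removed from br-ex if present, added to br-int, bound to its logical port through external_ids:iface-id, and haproxy is then launched inside the ovnmeta- namespace with the configuration dumped in full above (the earlier "Unable to access ...pid.haproxy; [Errno 2]" DEBUG line is only the agent probing for a pre-existing proxy PID file, not a failure). A sketch of the externally visible steps under stated assumptions: ovs-vsctl and a plain ip netns exec stand in for the agent's ovsdbapp transactions and its sudo/neutron-rootwrap wrapper, run() is a hypothetical helper, and the bridge, port, iface-id, namespace, and config path all come from the log.

    # Hedged sketch, not neutron's own code: reproduce the effect of the
    # logged ovsdbapp transactions and the rootwrap-wrapped haproxy launch
    # with shell-level tools. Requires root.
    import subprocess

    NET_ID = "8bdf8183-8467-40ac-933d-a37b0bd3539a"
    PORT = "tap8bdf8183-80"
    IFACE_ID = "a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5"

    def run(*cmd: str) -> None:
        subprocess.run(cmd, check=True)

    # DelPortCommand(if_exists=True), AddPortCommand(may_exist=True) and
    # DbSetCommand(external_ids) from the transaction records above:
    run("ovs-vsctl", "--if-exists", "del-port", "br-ex", PORT)
    run("ovs-vsctl", "--may-exist", "add-port", "br-int", PORT)
    run("ovs-vsctl", "set", "Interface", PORT,
        f"external_ids:iface-id={IFACE_ID}")

    # The create_process() call at utils.py:84, minus sudo/neutron-rootwrap:
    run("ip", "netns", "exec", f"ovnmeta-{NET_ID}",
        "env", f"PROCESS_TAG=haproxy-{NET_ID}",
        "haproxy", "-f",
        f"/var/lib/neutron/ovn-metadata-proxy/{NET_ID}.conf")

Because the rendered configuration contains the 'daemon' directive, haproxy forks into the background and writes the PID file the agent probed for; the podman create/init/start records and the "New worker ... forked" / "Loading success." notices above are that daemonized proxy coming up inside its container.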
Feb 1 04:36:52 localhost nova_compute[274651]: 2026-02-01 09:36:52.563 274655 DEBUG nova.compute.manager [req-71476c0a-645c-4437-9a48-de3b528f6da1 req-a5649616-a5d8-4c60-8ce9-f301a3bb9cfe 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Feb 1 04:36:52 localhost nova_compute[274651]: 2026-02-01 09:36:52.565 274655 DEBUG oslo_concurrency.lockutils [req-71476c0a-645c-4437-9a48-de3b528f6da1 req-a5649616-a5d8-4c60-8ce9-f301a3bb9cfe 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:36:52 localhost nova_compute[274651]: 2026-02-01 09:36:52.565 274655 DEBUG oslo_concurrency.lockutils [req-71476c0a-645c-4437-9a48-de3b528f6da1 req-a5649616-a5d8-4c60-8ce9-f301a3bb9cfe 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:36:52 localhost nova_compute[274651]: 2026-02-01 09:36:52.566 274655 DEBUG oslo_concurrency.lockutils [req-71476c0a-645c-4437-9a48-de3b528f6da1 req-a5649616-a5d8-4c60-8ce9-f301a3bb9cfe 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:36:52 localhost nova_compute[274651]: 2026-02-01 09:36:52.567 274655 DEBUG nova.compute.manager [req-71476c0a-645c-4437-9a48-de3b528f6da1 req-a5649616-a5d8-4c60-8ce9-f301a3bb9cfe 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] No waiting events found dispatching network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Feb 1 04:36:52 localhost nova_compute[274651]: 2026-02-01 09:36:52.567 274655 WARNING nova.compute.manager [req-71476c0a-645c-4437-9a48-de3b528f6da1 req-a5649616-a5d8-4c60-8ce9-f301a3bb9cfe 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Received unexpected event network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19 for instance with vm_state active and task_state None.#033[00m Feb 1 04:36:52 localhost snmpd[66800]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB. 
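Both network-vif-plugged deliveries (09:36:50 and 09:36:52) follow the same pattern: Neutron posts the external event, nova-compute takes the per-instance "<uuid>-events" lock, pops any registered waiter, and, finding none, logs the "Received unexpected event" warning; it is harmless here because the power-on path had already finished waiting ("Instance event wait completed in 0 seconds"). A sketch of that lock-and-pop step under stated assumptions: oslo_concurrency.lockutils is the real module named in the log paths, while waiters and pop_event() are hypothetical stand-ins for nova.compute.manager.InstanceEvents.

    # Hedged sketch of the pop_instance_event locking visible in the log; a
    # plain dict plays the role of the per-instance waiter table.
    from oslo_concurrency import lockutils

    INSTANCE = "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02"
    waiters = {}  # event name -> waiter; normally filled while nova waits

    @lockutils.synchronized(f"{INSTANCE}-events")
    def pop_event(name: str):
        # A None result is what produces "No waiting events found
        # dispatching ..." followed by the "Received unexpected event"
        # WARNING in the records above.
        return waiters.pop(name, None)

    print(pop_event("network-vif-plugged-09cac1be-46e2-4a31-8306-e6f4f0401b19"))

The acquire/release timings in the log (waited 0.001s, held 0.000s) are emitted by lockutils itself around exactly this kind of synchronized call.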
Feb 1 04:36:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1684 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D08F3B80000000001030307) Feb 1 04:36:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:36:53 localhost podman[275805]: 2026-02-01 09:36:53.72807403 +0000 UTC m=+0.079956533 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:36:53 localhost podman[275805]: 2026-02-01 09:36:53.73492447 +0000 UTC m=+0.086806993 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 
'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 04:36:53 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:36:53 localhost podman[236886]: time="2026-02-01T09:36:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:36:53 localhost podman[236886]: @ - - [01/Feb/2026:09:36:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:36:54 localhost podman[236886]: @ - - [01/Feb/2026:09:36:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17281 "" "Go-http-client/1.1" Feb 1 04:36:55 localhost nova_compute[274651]: 2026-02-01 09:36:55.164 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:00 localhost nova_compute[274651]: 2026-02-01 09:37:00.167 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:01 localhost openstack_network_exporter[239441]: ERROR 09:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:37:01 localhost openstack_network_exporter[239441]: Feb 1 04:37:01 localhost openstack_network_exporter[239441]: ERROR 09:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:37:01 localhost openstack_network_exporter[239441]: Feb 1 04:37:02 localhost ovn_controller[152492]: 2026-02-01T09:37:02Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:11:63 192.168.0.12 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.525 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:37:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.525 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.529 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67de3842-814e-448b-8b75-521b1d9a00bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.525901', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90cd76c4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': 'a5690a1a1b245024b42bed661b5a552f6371638b8b094abdb08d6a83660703ae'}]}, 'timestamp': '2026-02-01 09:37:03.530663', '_unique_id': '6c1bd5a502ea4886a5f6a0dce2431850'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 
04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, 
message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR 
oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.532 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.533 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8508ffbb-c05c-4df9-9fc7-767fc40734b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.533807', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90ce08be-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '682e7c275e63af348d2278d53de6630dad66aa51f1c2a26f30fd9bef3935f5f6'}]}, 'timestamp': '2026-02-01 09:37:03.534351', '_unique_id': '9f432fd2458b4bbf8efe71a5d860206a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.535 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.536 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
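The chained traceback above is the signature of a broker that is simply not listening: the inner ConnectionRefusedError comes from the raw socket connect in amqp/transport.py, and kombu re-raises it as OperationalError before oslo.messaging gives up on the notification. A minimal probe of the same path (a sketch only; the broker host, port, and guest credentials below are assumptions, since the agent's transport_url is not shown in this log):

import socket

from kombu import Connection

BROKER_HOST = "localhost"  # hypothetical; take the real value from transport_url
BROKER_PORT = 5672         # default AMQP port

def probe():
    # Layer 1: raw TCP. [Errno 111] here means nothing is listening
    # (broker down, wrong port, or firewalled) - the same errno as in the log.
    try:
        socket.create_connection((BROKER_HOST, BROKER_PORT), timeout=5).close()
    except OSError as exc:
        print(f"TCP connect failed: {exc}")
        return
    # Layer 2: AMQP handshake, mirroring kombu's establish_connection().
    with Connection(f"amqp://guest:guest@{BROKER_HOST}:{BROKER_PORT}//",
                    connect_timeout=5) as conn:
        conn.connect()
        print("AMQP handshake OK")

if __name__ == "__main__":
    probe()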
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c124f19-e3e5-4039-b7f9-5560e65aa7a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.536669', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90ce765a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '1267f4bebbcacd5a822ee1b5c0803a263832c6fbb84e943be30ed59342d00cdd'}]}, 'timestamp': '2026-02-01 09:37:03.537186', '_unique_id': '6933dedca51648498062be6be1e0d1a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.569 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.570 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
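Each of these ERROR entries is one failed Notifier call: the polling manager batches samples, and oslo.messaging publishes them as a 'telemetry.polling' notification with priority SAMPLE on the 'notifications' topic, which is why every polled meter produces its own refused connection. A hedged sketch of that publish path (the transport URL and driver name are assumptions; event_type, publisher_id, and topic mirror the payloads in this log):

from oslo_config import cfg
import oslo_messaging

# Hypothetical URL: the agent's real transport_url is not shown in this log.
transport = oslo_messaging.get_notification_transport(
    cfg.CONF, url="rabbit://guest:guest@localhost:5672/")
notifier = oslo_messaging.Notifier(
    transport,
    publisher_id="ceilometer.polling",
    driver="messagingv2",       # assumption; the driver name is not in the log
    topics=["notifications"],
    retry=0,                    # fail fast rather than block the polling loop
)
# SAMPLE priority, matching 'priority': 'SAMPLE' in the payloads above.
notifier.sample({}, "telemetry.polling", {"samples": []})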
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.571 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8992c8e2-5b54-40bc-b97b-22dbf6326722', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 21, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.539404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90d3858c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '75329dc5a85d01cfdda2590468ad02d6dc313d9287594d5f57f89d2f52238c7c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.539404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90d3986a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '760e560ca3568b09fdc908db6548014803c99f444dd8543109d40501e055a4ac'}]}, 'timestamp': '2026-02-01 09:37:03.570765', '_unique_id': 'e2b186ad1b784012b4f7715506869bae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.573 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.573 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.573 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 942 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63439765-2698-47eb-b20b-ef27159051e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 942, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.573443', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90d412a4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '5dec0ad6fa56a721cfdb64c00e4c29bd123d1333fd8e3368511cb7f897d22af5'}]}, 'timestamp': '2026-02-01 09:37:03.573915', '_unique_id': '9c0bd9c2f56448cabcbb978a60cbbc50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.575 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.576 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78db0e1b-a964-47e9-818d-3325d2abc3a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.576151', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90d47c58-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': 'da7e4867343d87304b47d26af6af07837ccbbc54c22670e67fc4a836c7a0cdf4'}]}, 'timestamp': '2026-02-01 09:37:03.576623', '_unique_id': '5e6dd51f08b346a6b8668301fd13d9d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.578 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
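Every dropped notification above still carries its full sample batch in the Payload=... dump, so the lost datapoints can be recovered from the log itself. A triage sketch (not part of ceilometer; it assumes each payload sits on a single log line, as here, and that the dump is a Python dict repr, which ast.literal_eval can parse):

import ast
import re

PAYLOAD_RE = re.compile(r"Payload=(\{.*\}): kombu\.exceptions")

def dropped_samples(log_line):
    # Returns (counter_name, counter_volume, resource_id) for each sample
    # in a "Could not send notification" line, or [] for other lines.
    m = PAYLOAD_RE.search(log_line)
    if not m:
        return []
    payload = ast.literal_eval(m.group(1))
    return [(s["counter_name"], s["counter_volume"], s["resource_id"])
            for s in payload["payload"]["samples"]]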
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e52232fd-c143-4ca2-8714-bdf0db81a6e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.578774', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90d67c6a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.798223093, 'message_signature': 'cf2bbc8cfebaf768d62a1be34d14e91513466947a1165d1ef8a67e21a1b0de5a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.578774', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90d68e6c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.798223093, 'message_signature': 'd97b8dd1887ca2c87855d86d9ad04f2d7ea0ac00beb73c0b1156f0efbea29876'}]}, 'timestamp': '2026-02-01 09:37:03.590198', '_unique_id': 'c58519e3b81143ed82ee06961fd13525'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py",
line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.591 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.593 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.593 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.593 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b9d9480c-4edb-4e8c-bf74-38f7555e464a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.593487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90d72192-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.798223093, 'message_signature': 'e7f4b27f5728b2da7cb408910b5c5325785d8f4b6cbbd09b11d0478812a4fd8e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.593487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90d73394-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.798223093, 'message_signature': 'f66edd7c78a38129c98df1b6eda97be28c3b82789da790e8ccbbe5517a2f1257'}]}, 'timestamp': '2026-02-01 09:37:03.594389', '_unique_id': '8a004c63a86942b8b05874c16ab884ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.595 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.596 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 9460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a8ec8433-4058-46c6-9f41-24f309930d40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9460000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:37:03.596608', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '90da63b6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.834112388, 'message_signature': '0f8bd29798ab6fe7a3b7a3b61181b986666e8dd751d7050f45046081ae4b4cf0'}]}, 'timestamp': '2026-02-01 09:37:03.615303', '_unique_id': 'f3ef43e6af4e40a0b747c63b5ef3c32f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.617 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 942 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3e999669-31f3-4d76-9c25-48cf759ff5f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 942, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.617748', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90dad51c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '2e9850dbbf3ae5fa414b35c52b6b78d4a98eab7a57ccd876153f675e30df2ba4'}]}, 'timestamp': '2026-02-01 09:37:03.618245', '_unique_id': '16c7e8aaa9884923a29c3af5962979cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b365c165-389d-468a-bddd-a877da24554b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.620517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90db4128-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.798223093, 'message_signature': '9405bbf18eb77d649c8b603c2e36623a63bc8c728f30f28bda3dff65936c91a4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.620517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90db53de-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.798223093, 'message_signature': '6dc336d9d275c2c40cac314d7a61ca4326c5da745f416cd9f8effbc5b9d6aa79'}]}, 'timestamp': '2026-02-01 09:37:03.621431', '_unique_id': '5d1b9f3f2f4d41a280c08e55c4d8c6bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.622 12 ERROR oslo_messaging.notify.messaging
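
The chained traceback above bottoms out in a plain TCP connect: amqp's transport.py receives ECONNREFUSED (errno 111) from the kernel, and kombu re-raises it as kombu.exceptions.OperationalError for oslo.messaging to handle. A minimal sketch that reproduces the same error class against a port with no listener (the URL below is a placeholder, not this host's configured transport_url):

    # Illustrative only: point kombu at a port where nothing is listening.
    from kombu import Connection
    from kombu.exceptions import OperationalError

    conn = Connection("amqp://guest:guest@127.0.0.1:5672//", connect_timeout=2)
    try:
        # ensure_connection() is the same entry point seen in the traceback
        # (kombu/connection.py); max_retries=1 makes it fail fast.
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print(f"broker unreachable: {exc}")  # [Errno 111] Connection refused
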
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.623 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dafb161-becc-407e-b091-2fe739c7dc59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.623602', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90dbb9dc-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '3a9b76ae8e32982220ae0c53a25bf969913ebe0a167339fccdabe065b6c91566'}]}, 'timestamp': '2026-02-01 09:37:03.624124', '_unique_id': 'f06e997cb25f4bdf8268d3faa7cd9687'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.625 12 ERROR oslo_messaging.notify.messaging
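
Errno 111 comes back synchronously from connect(2) when the target answers with a TCP RST, i.e. the machine is reachable but no process is bound to the port; that is why every retry in this log fails within milliseconds instead of hanging until a timeout. A quick probe along those lines (host and port are assumptions; substitute the broker address from the agent's configuration):

    # Illustrative reachability probe for the AMQP port.
    import socket

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    err = s.connect_ex(("127.0.0.1", 5672))  # 0 on success, else an errno
    print("connection refused" if err == 111 else f"connect_ex -> {err}")
    s.close()
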
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 139264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2641b8ef-f30e-4021-930f-698fa664d594', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 139264, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.626251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90dc2228-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '128b6a3449c715f351c5c045f2d3384315aec01c285e278c4f3ccc4168a5f333'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.626251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90dc32b8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '75390415d9c767ac5bfa26fb38939bbfc285c78607410a20010a283e31d8c0f1'}]}, 'timestamp': '2026-02-01 09:37:03.627164', '_unique_id': '6f807de502184182b035904350057d27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.628 12 ERROR oslo_messaging.notify.messaging
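
Each "Could not send notification" record is a single publish attempt by the oslo.messaging notification driver; the driver logs the failure and returns, which is why the agent keeps polling instead of crashing. A sketch of the same publish path through the public API (the URL, driver, and topic values are assumptions for illustration; ceilometer assembles its notifier from its own configuration):

    # Illustrative sketch of the notify path seen in the tracebacks.
    from oslo_config import cfg
    import oslo_messaging

    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@127.0.0.1:5672/")
    notifier = oslo_messaging.Notifier(
        transport, publisher_id="ceilometer.polling",
        driver="messagingv2", topics=["notifications"])
    # Ceilometer emits at SAMPLE priority; with the broker down, the driver
    # logs the "Could not send notification" error recorded above.
    notifier.sample({}, "telemetry.polling", {"samples": []})
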
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1076 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9463c71-37d0-4384-8803-e7cb82bc5bc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1076, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.629497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90dca004-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '7bbe8c417b95c590bbd107ea2a6f994e60e83a4127483883484b3292b304164c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.629497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90dcb1c0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': 'def06cbf3349db9ee7d87650d2d2346d333da5a998ebd92e4350292567ceed5d'}]}, 'timestamp': '2026-02-01 09:37:03.630386', '_unique_id': '7ca7f65bbd92420eb4976be5aa4a2809'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.631 12 ERROR oslo_messaging.notify.messaging
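
With the broker down the same shape repeats once per pollster: an INFO "Polling pollster ..." record, one DEBUG sample record per resource, then the failed publish carrying the full payload. A small triage script for tallying that pattern in journal text like this one (illustrative; reads from stdin, e.g. journalctl -u ceilometer_agent_compute | python3 tally.py):

    # Illustrative counter for polls vs. failed notification sends.
    import sys
    import collections

    counts = collections.Counter()
    for line in sys.stdin:
        if "Polling pollster" in line:
            counts["pollster_runs"] += 1
        elif "Could not send notification" in line and "Payload=" in line:
            counts["notify_failures"] += 1  # count each failure once
    print(dict(counts))
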
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 29305856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e927aa23-b859-4807-a332-fb5ec0f052b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29305856, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.632603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90dd193a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '965d3f91d7abdd6bcc4ecff4387a647e8133ca9b5d28bdc9b34c334005d8ddb2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.632603', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90dd2af6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '848578cc26ba4d5967cebf5d5aaa979c45bd1d78c69513dd7b02c91d378445d9'}]}, 'timestamp': '2026-02-01 09:37:03.633486', '_unique_id': 'f2abd3d076bf4d1aa8c5d00c77dfaef2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.634 12 ERROR oslo_messaging.notify.messaging
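
The DEBUG records show the samples themselves are still being collected with real values, so nothing is lost on the polling side except delivery; once the broker is listening again the sends should succeed. One way to verify every broker host named in a transport URL, sketched under the assumption that the URL is read from the agent's configuration rather than hard-coded:

    # Illustrative: parse a transport URL and probe each broker host.
    import socket
    from oslo_config import cfg
    import oslo_messaging

    url = oslo_messaging.TransportURL.parse(
        cfg.CONF, "rabbit://guest:guest@127.0.0.1:5672/")  # placeholder URL
    for host in url.hosts:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(2)
        ok = s.connect_ex((host.hostname, host.port or 5672)) == 0
        print(f"{host.hostname}:{host.port} {'open' if ok else 'not listening'}")
        s.close()
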
Payload={'message_id': '5f77e837-7c04-4556-a946-3f29f362a627', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.635646', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90dd9068-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '014f8188236e7ca6736c80fa5a053934e432804ede4d4a3bc0f20889014d60a2'}]}, 'timestamp': '2026-02-01 09:37:03.636185', '_unique_id': 'b2a03dd8054d4d6d917012605c1198c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.637 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 48.83203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0635e23a-c4b3-4e1b-9081-c0c49bee5633', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 48.83203125, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:37:03.637841', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '90dde28e-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.834112388, 'message_signature': '6c141813e012ed68904292cbce357d32aac4fd39c41544cbd279fc04bc0dfff6'}]}, 'timestamp': '2026-02-01 09:37:03.638138', '_unique_id': '6e3d4d61aa9f42dfa155b2dad363f828'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:37:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 586727275 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c2c97512-8ca5-4942-98c2-914ad7e96ff3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 586727275, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.639532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90de24a6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': 'acffcfc4188a40d9b8a03c49f7aa45bc3b278d5ee953bd98f6e71a6e6bed93d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.639532', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90de2ef6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': '56c7786a822c2273301a212abb43c0119f1279791281fae6ada81188b16eca43'}]}, 'timestamp': '2026-02-01 09:37:03.640105', '_unique_id': '7998750e380d4b8c89261559f0a9d563'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.640 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.641 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.641 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '7e875803-5f93-4133-8db4-574782643ea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.641620', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90de764a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '4732f3a510bd421dfd2dc866c2365a7c2a16070f1353646e3b8391649c417b02'}]}, 'timestamp': '2026-02-01 09:37:03.641911', '_unique_id': 'c3051b582e2746ebb7967e4e92174e19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.642 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.643 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.643 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1225837263 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.643 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f0ccdc2a-d7df-4d9b-a390-05c6e343463e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1225837263, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:37:03.643245', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '90deb5a6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': 'cc1326d9f857f985b53f2d744d84e37197ed532352f3df375bd8be4341049934'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:37:03.643245', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '90debfce-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.75885037, 'message_signature': 'afbff5bc7523e51c5cb271113ea5bba17bbc4c76587483dc362c0bbba07530cc'}]}, 'timestamp': '2026-02-01 09:37:03.643772', '_unique_id': 'fce4b52c5cac4bd4bac9cdbb29b3e5f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8a1d9c5e-de10-4dce-98fa-42f9c6cd0e8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:37:03.645161', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '90df0092-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10717.745472381, 'message_signature': '15b4a41950743f66ef13d4f08f145312466e4a770f98f22ee9ff1d91da49251d'}]}, 'timestamp': '2026-02-01 09:37:03.645448', '_unique_id': '01dae04b3f1e441d8c5d99c3d470b3ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:37:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:37:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:37:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:37:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:37:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:37:04 localhost systemd[1]: tmp-crun.L5JKaO.mount: Deactivated successfully. Feb 1 04:37:04 localhost podman[275824]: 2026-02-01 09:37:04.734330018 +0000 UTC m=+0.092457806 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:37:04 localhost podman[275824]: 2026-02-01 09:37:04.772495524 +0000 UTC m=+0.130623312 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 
04:37:04 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:37:05 localhost nova_compute[274651]: 2026-02-01 09:37:05.173 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:05 localhost nova_compute[274651]: 2026-02-01 09:37:05.175 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:05 localhost nova_compute[274651]: 2026-02-01 09:37:05.176 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:05 localhost nova_compute[274651]: 2026-02-01 09:37:05.176 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:05 localhost nova_compute[274651]: 2026-02-01 09:37:05.215 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:05 localhost nova_compute[274651]: 2026-02-01 09:37:05.216 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:07.319 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:07.321 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:07 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18957 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D092C310000000001030307) Feb 1 04:37:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18958 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0930380000000001030307) Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.776 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.777 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET 
/2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.4553344#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53576 [01/Feb/2026:09:37:07.318] listener listener/metadata 0/0/0/1458/1458 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.792 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.793 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53588 [01/Feb/2026:09:37:08.791] listener listener/metadata 0/0/0/26/26 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.818 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0249164#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.834 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.835 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.850 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53604 [01/Feb/2026:09:37:08.833] listener listener/metadata 0/0/0/17/17 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.851 158462 INFO 
eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0158856#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.857 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.858 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.872 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53608 [01/Feb/2026:09:37:08.857] listener listener/metadata 0/0/0/15/15 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.873 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0146718#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.880 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.880 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.903 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.904 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0236855#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53612 
[01/Feb/2026:09:37:08.879] listener listener/metadata 0/0/0/25/25 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.911 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.912 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.929 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53622 [01/Feb/2026:09:37:08.910] listener listener/metadata 0/0/0/18/18 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.929 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 148 time: 0.0171022#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.936 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.937 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.959 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53624 [01/Feb/2026:09:37:08.936] listener listener/metadata 0/0/0/24/24 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.960 158462 INFO 
eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0226822#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.967 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.968 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.988 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.988 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0205131#033[00m Feb 1 04:37:08 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53630 [01/Feb/2026:09:37:08.966] listener listener/metadata 0/0/0/22/22 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.995 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:08.996 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:08 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost nova_compute[274651]: 2026-02-01 09:37:09.001 274655 DEBUG nova.compute.manager [None req-a6d89ec4-d735-4b6c-af1a-b40bcf106ee9 7567a560936c417c92d242d856b00bb3 79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 1 04:37:09 localhost nova_compute[274651]: 2026-02-01 09:37:09.006 274655 INFO nova.compute.manager [None req-a6d89ec4-d735-4b6c-af1a-b40bcf106ee9 7567a560936c417c92d242d856b00bb3 
79df39cba1c14309b68e8b61518619fd - - default default] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Retrieving diagnostics#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.015 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53640 [01/Feb/2026:09:37:08.995] listener listener/metadata 0/0/0/20/20 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.015 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0188339#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.023 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.024 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53656 [01/Feb/2026:09:37:09.022] listener listener/metadata 0/0/0/18/18 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.041 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0164549#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.057 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.057 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.075 158462 DEBUG neutron.agent.ovn.metadata.server 
[-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53658 [01/Feb/2026:09:37:09.056] listener listener/metadata 0/0/0/19/19 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.076 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.0183599#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.081 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.081 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.095 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.096 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0143023#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53666 [01/Feb/2026:09:37:09.080] listener listener/metadata 0/0/0/15/15 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.101 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.102 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 
2026-02-01 09:37:09.118 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.119 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0168233#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53672 [01/Feb/2026:09:37:09.101] listener listener/metadata 0/0/0/17/17 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.125 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.126 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.139 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53682 [01/Feb/2026:09:37:09.125] listener listener/metadata 0/0/0/15/15 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.140 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0144646#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.147 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.148 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.185 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.186 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0381417#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53692 [01/Feb/2026:09:37:09.147] listener listener/metadata 0/0/0/39/39 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.194 158462 DEBUG eventlet.wsgi.server [-] (158462) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.194 158462 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Accept: */*#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Connection: close#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Content-Type: text/plain#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: Host: 169.254.169.254#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: User-Agent: curl/7.84.0#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Forwarded-For: 192.168.0.12#015 Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: X-Ovn-Network-Id: 8bdf8183-8467-40ac-933d-a37b0bd3539a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.245 158462 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Feb 1 04:37:09 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:09.245 158462 INFO eventlet.wsgi.server [-] 192.168.0.12, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0507913#033[00m Feb 1 04:37:09 localhost haproxy-metadata-proxy-8bdf8183-8467-40ac-933d-a37b0bd3539a[275796]: 192.168.0.12:53694 [01/Feb/2026:09:37:09.193] listener listener/metadata 0/0/0/51/51 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Feb 1 04:37:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1685 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0933B80000000001030307) Feb 1 04:37:10 localhost nova_compute[274651]: 2026-02-01 09:37:10.218 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:10 localhost nova_compute[274651]: 2026-02-01 09:37:10.220 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:10 localhost nova_compute[274651]: 2026-02-01 09:37:10.220 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:10 localhost nova_compute[274651]: 2026-02-01 09:37:10.220 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:10 localhost nova_compute[274651]: 2026-02-01 09:37:10.245 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:10 localhost nova_compute[274651]: 2026-02-01 09:37:10.246 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:37:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18959 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09383D0000000001030307) Feb 1 04:37:10 localhost podman[275847]: 2026-02-01 09:37:10.713324992 +0000 UTC m=+0.074375833 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 1 04:37:10 localhost podman[275847]: 2026-02-01 09:37:10.74530422 +0000 UTC m=+0.106355131 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:37:10 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:37:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60698 DF PROTO=TCP SPT=52078 DPT=9102 SEQ=39011501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D093BB90000000001030307) Feb 1 04:37:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:37:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:37:12 localhost podman[275867]: 2026-02-01 09:37:12.7229083 +0000 UTC m=+0.080296955 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:37:12 localhost podman[275867]: 2026-02-01 09:37:12.761405276 +0000 UTC m=+0.118793931 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Feb 1 04:37:12 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:37:12 localhost podman[275866]: 2026-02-01 09:37:12.768814432 +0000 UTC m=+0.129088235 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:37:12 localhost podman[275866]: 2026-02-01 09:37:12.851459798 +0000 UTC m=+0.211733611 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:37:12 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
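
The three-step pattern above repeats for every EDPM-managed container: systemd starts a transient /usr/bin/podman healthcheck run unit for the container ID, podman records health_status=healthy followed by exec_died for the check process, and the unit deactivates. A minimal way to reproduce the same check by hand, relying on podman's documented behavior of exiting 0 when the container's configured healthcheck (here /openstack/healthcheck) passes:

import subprocess

def container_healthy(name: str) -> bool:
    # Runs the container's configured healthcheck once, which is exactly
    # what the transient systemd units in this log wrap; podman exits 0
    # when the check passes and nonzero otherwise.
    result = subprocess.run(["podman", "healthcheck", "run", name],
                            capture_output=True, text=True)
    return result.returncode == 0

# Container names taken from the records above.
for name in ("ovn_metadata_agent", "ovn_controller", "node_exporter"):
    print(name, "healthy" if container_healthy(name) else "unhealthy")
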
Feb 1 04:37:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18960 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0947F80000000001030307) Feb 1 04:37:15 localhost nova_compute[274651]: 2026-02-01 09:37:15.247 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:15 localhost nova_compute[274651]: 2026-02-01 09:37:15.249 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:15 localhost nova_compute[274651]: 2026-02-01 09:37:15.249 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:15 localhost nova_compute[274651]: 2026-02-01 09:37:15.249 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:15 localhost nova_compute[274651]: 2026-02-01 09:37:15.286 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:15 localhost nova_compute[274651]: 2026-02-01 09:37:15.287 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:20 localhost nova_compute[274651]: 2026-02-01 09:37:20.289 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:20 localhost nova_compute[274651]: 2026-02-01 09:37:20.334 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:20 localhost nova_compute[274651]: 2026-02-01 09:37:20.334 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5047 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:20 localhost nova_compute[274651]: 2026-02-01 09:37:20.334 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:20 localhost nova_compute[274651]: 2026-02-01 09:37:20.336 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:20 localhost nova_compute[274651]: 2026-02-01 09:37:20.337 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:20 localhost ovn_controller[152492]: 2026-02-01T09:37:20Z|00069|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory Feb 1 04:37:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:37:20 localhost podman[275914]: 2026-02-01 09:37:20.699671561 +0000 UTC m=+0.063746249 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64) Feb 1 04:37:20 localhost podman[275914]: 2026-02-01 09:37:20.738530018 +0000 UTC m=+0.102604686 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7) Feb 1 04:37:20 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
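
The kernel DROPPING records recurring through this window are netfilter log output: the MACSRC=/MACDST=/MACPROTO= rendering is the layer-2 form emitted by an nftables log rule, here evidently one with prefix "DROPPING: " matching inbound TCP SYNs on br-ex to port 9102 before they are dropped. Note that ID climbs while SPT and SEQ stay fixed (SEQ=2811211714 from SPT=41540 at 04:37:10, :14 and :22), so 192.168.122.10 is retransmitting the same blocked SYN. A small parser for these records, with the OPT field omitted for brevity:

# Abridged record copied from the log above.
RECORD = ("IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db "
          "MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 "
          "TOS=0x00 PREC=0x00 TTL=62 ID=18959 DF PROTO=TCP SPT=41540 "
          "DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0")

def parse_netfilter_log(line: str) -> dict:
    # KEY=VALUE tokens become entries; bare flags such as DF and SYN
    # become True.
    fields = {}
    for token in line.split():
        key, sep, value = token.partition("=")
        fields[key] = value if sep else True
    return fields

rec = parse_netfilter_log(RECORD)
print(f"{rec['SRC']}:{rec['SPT']} -> {rec['DST']}:{rec['DPT']} "
      f"proto={rec['PROTO']} syn={rec.get('SYN', False)}")
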
Feb 1 04:37:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18961 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0967B90000000001030307) Feb 1 04:37:23 localhost podman[236886]: time="2026-02-01T09:37:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:37:23 localhost podman[236886]: @ - - [01/Feb/2026:09:37:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:37:24 localhost podman[236886]: @ - - [01/Feb/2026:09:37:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17288 "" "Go-http-client/1.1" Feb 1 04:37:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:37:24 localhost podman[275935]: 2026-02-01 09:37:24.724071627 +0000 UTC m=+0.086852274 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Feb 1 04:37:24 localhost podman[275935]: 2026-02-01 09:37:24.737375903 +0000 UTC m=+0.100156590 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ceilometer_agent_compute) Feb 1 04:37:24 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:37:25 localhost nova_compute[274651]: 2026-02-01 09:37:25.337 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:25 localhost nova_compute[274651]: 2026-02-01 09:37:25.339 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:25 localhost nova_compute[274651]: 2026-02-01 09:37:25.340 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:25 localhost nova_compute[274651]: 2026-02-01 09:37:25.340 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:25 localhost nova_compute[274651]: 2026-02-01 09:37:25.367 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:25 localhost nova_compute[274651]: 2026-02-01 09:37:25.367 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:30 localhost nova_compute[274651]: 2026-02-01 09:37:30.368 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:30 localhost nova_compute[274651]: 2026-02-01 09:37:30.370 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:30 localhost nova_compute[274651]: 2026-02-01 09:37:30.370 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:30 localhost nova_compute[274651]: 2026-02-01 09:37:30.370 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:30 localhost nova_compute[274651]: 2026-02-01 09:37:30.371 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:30 localhost nova_compute[274651]: 2026-02-01 09:37:30.375 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:31 localhost openstack_network_exporter[239441]: ERROR 09:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:37:31 localhost openstack_network_exporter[239441]: Feb 1 04:37:31 localhost openstack_network_exporter[239441]: ERROR 09:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:37:31 localhost openstack_network_exporter[239441]: Feb 1 04:37:35 localhost nova_compute[274651]: 2026-02-01 09:37:35.377 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:35 localhost nova_compute[274651]: 2026-02-01 09:37:35.379 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:35 localhost nova_compute[274651]: 2026-02-01 09:37:35.379 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:35 localhost nova_compute[274651]: 2026-02-01 09:37:35.379 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:35 localhost nova_compute[274651]: 2026-02-01 09:37:35.406 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:35 localhost nova_compute[274651]: 2026-02-01 09:37:35.407 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
Feb 1 04:37:35 localhost podman[275954]: 2026-02-01 09:37:35.720565291 +0000 UTC m=+0.079389397 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:37:35 localhost podman[275954]: 2026-02-01 09:37:35.733403843 +0000 UTC m=+0.092227939 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:37:35 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
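
The openstack_network_exporter errors at 09:37:31 above (call(dpif-netdev/pmd-perf-show): please specify an existing datapath, and likewise pmd-rxq-show) are expected on this host: dpif-netdev/* appctl commands only exist for Open vSwitch's userspace (netdev/DPDK) datapath, and a kernel-datapath deployment has no PMD threads to report on. A hedged pre-check, assuming the usual ovs-appctl dpif/show output that lists each datapath as type@name:

import subprocess

def has_userspace_datapath() -> bool:
    # dpif/show lists datapaths such as system@ovs-system (kernel) or
    # netdev@ovs-netdev (userspace); only the latter supports the
    # dpif-netdev/pmd-* commands the exporter is calling.
    out = subprocess.run(["ovs-appctl", "dpif/show"],
                         capture_output=True, text=True).stdout
    return "netdev@" in out

if not has_userspace_datapath():
    print("kernel datapath only; pmd-perf/pmd-rxq collectors will fail")
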
Feb 1 04:37:36 localhost snmpd[66800]: empty variable list in _query Feb 1 04:37:36 localhost snmpd[66800]: empty variable list in _query Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.561 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.563 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.589 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.589 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.590 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:37:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2160 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09A1620000000001030307) Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.737 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.738 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.738 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:37:37 localhost nova_compute[274651]: 2026-02-01 09:37:37.739 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.265 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] 
[instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.281 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.282 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.282 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.283 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.283 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.284 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.284 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.285 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.285 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.286 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.301 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.302 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.302 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.303 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.303 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:37:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2161 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09A5780000000001030307) Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.771 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.858 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:37:38 localhost nova_compute[274651]: 2026-02-01 09:37:38.858 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.097 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.099 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12064MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.099 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.099 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.185 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.186 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.186 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:37:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18962 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09A7B80000000001030307) Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.249 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.684 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.692 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.711 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 
1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.738 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:37:39 localhost nova_compute[274651]: 2026-02-01 09:37:39.738 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:37:40 localhost nova_compute[274651]: 2026-02-01 09:37:40.408 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:40 localhost nova_compute[274651]: 2026-02-01 09:37:40.409 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:40 localhost nova_compute[274651]: 2026-02-01 09:37:40.409 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:40 localhost nova_compute[274651]: 2026-02-01 09:37:40.410 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:40 localhost nova_compute[274651]: 2026-02-01 09:37:40.452 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:40 localhost nova_compute[274651]: 2026-02-01 09:37:40.453 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2162 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09AD780000000001030307) Feb 1 04:37:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
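
The resource-tracker records above fully determine this node's schedulable capacity: Placement computes each resource class as (total - reserved) * allocation_ratio, so the inventory logged at 09:37:39 works out as follows.

def capacity(total, reserved, allocation_ratio):
    # Effective capacity per resource class, as Placement computes it.
    return int((total - reserved) * allocation_ratio)

# Values copied from the set_inventory_for_provider record above.
inventory = {
    "VCPU":      dict(total=8,     reserved=0,   allocation_ratio=16.0),
    "MEMORY_MB": dict(total=15738, reserved=512, allocation_ratio=1.0),
    "DISK_GB":   dict(total=41,    reserved=1,   allocation_ratio=1.0),
}
for rc, inv in inventory.items():
    print(rc, capacity(**inv))
# -> VCPU 128, MEMORY_MB 15226, DISK_GB 40

This is consistent with the tracker's own summary: 8 usable vCPUs with 1 allocated, and used_ram=1024MB, i.e. the 512 MB reserved for the host plus the single 512 MB instance (08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02) holding {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1} in placement.
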
Feb 1 04:37:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:41.699 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:37:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:41.699 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:37:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:37:41.700 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:37:41 localhost podman[276107]: 2026-02-01 09:37:41.720750661 +0000 UTC m=+0.078785288 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:37:41 localhost podman[276107]: 2026-02-01 09:37:41.755441931 +0000 UTC m=+0.113476518 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 1 04:37:41 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:37:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1686 DF PROTO=TCP SPT=44666 DPT=9102 SEQ=3083050363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09B1B80000000001030307) Feb 1 04:37:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:37:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:37:43 localhost podman[276125]: 2026-02-01 09:37:43.731881027 +0000 UTC m=+0.086114283 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:37:43 localhost podman[276125]: 2026-02-01 09:37:43.770493907 +0000 UTC m=+0.124727163 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:37:43 localhost systemd[1]: tmp-crun.oRNhDW.mount: Deactivated successfully. 
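
node_exporter is launched with --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service (visible in config_data above), so only matching units contribute systemd metrics. node_exporter anchors its include/exclude patterns, which Python's fullmatch approximates; the unit names below are illustrative, not taken from this host:

import re

# The unit filter passed to node_exporter in the records above.
UNIT_INCLUDE = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")

for unit in ("ovsdb-server.service", "virtqemud.service",
             "edpm_nova_compute.service", "rsyslog.service",
             "sshd.service"):
    print(unit, "collected" if UNIT_INCLUDE.fullmatch(unit) else "filtered out")
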
Feb 1 04:37:43 localhost podman[276126]: 2026-02-01 09:37:43.788163326 +0000 UTC m=+0.138022347 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:37:43 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:37:43 localhost podman[276126]: 2026-02-01 09:37:43.888258875 +0000 UTC m=+0.238117896 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:37:43 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
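[Editor's note] The pattern above repeats for every EDPM-managed container: a transient systemd unit starts `/usr/bin/podman healthcheck run <id>`, podman logs a `container health_status` event (carrying the full label set and config_data) followed by `container exec_died`, and the unit deactivates. As a minimal sketch for pulling the verdicts out of a capture like this one, keyed to the exact line shape shown above (the `name=` and `health_status=` fields inside the label list), the following is illustrative only, not part of the logged tooling:

```python
import re
import sys

# Keyed to podman "container health_status" events shaped like the ones above:
#   ... container health_status <64-hex id> (image=..., name=<ctr>, health_status=<verdict>, ...)
HEALTH_RE = re.compile(
    r"container health_status (?P<cid>[0-9a-f]{64}) "
    r"\(.*?\bname=(?P<name>[^,]+),.*?\bhealth_status=(?P<status>[^,)]+)"
)

def health_events(lines):
    """Yield (container_name, health_status) for each healthcheck event."""
    for line in lines:
        # finditer, because several events can share one physical line here.
        for m in HEALTH_RE.finditer(line):
            yield m.group("name"), m.group("status")

if __name__ == "__main__":
    for name, status in health_events(sys.stdin):
        print(f"{name}: {status}")
```

Fed this excerpt on stdin, it would print `node_exporter: healthy`, `ovn_controller: healthy`, and so on for each healthcheck cycle.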
Feb 1 04:37:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2163 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09BD380000000001030307) Feb 1 04:37:45 localhost nova_compute[274651]: 2026-02-01 09:37:45.488 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:45 localhost nova_compute[274651]: 2026-02-01 09:37:45.490 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:45 localhost nova_compute[274651]: 2026-02-01 09:37:45.490 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:45 localhost nova_compute[274651]: 2026-02-01 09:37:45.490 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:45 localhost nova_compute[274651]: 2026-02-01 09:37:45.491 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:45 localhost nova_compute[274651]: 2026-02-01 09:37:45.493 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:50 localhost nova_compute[274651]: 2026-02-01 09:37:50.495 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:50 localhost nova_compute[274651]: 2026-02-01 09:37:50.497 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:50 localhost nova_compute[274651]: 2026-02-01 09:37:50.497 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:50 localhost nova_compute[274651]: 2026-02-01 09:37:50.498 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:50 localhost nova_compute[274651]: 2026-02-01 09:37:50.526 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:50 localhost nova_compute[274651]: 2026-02-01 09:37:50.527 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
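[Editor's note] The ovsdbapp DEBUG chatter from nova_compute is the OVSDB IDL's inactivity-probe loop: after roughly five seconds of silence on tcp:127.0.0.1:6640 (the local ovsdb-server management socket), the client sends a probe, transitions to IDLE, and returns to ACTIVE when the reply arrives, which is consistent with the library's default 5000 ms probe interval. A quick sanity check of the cadence, using "sending inactivity probe" timestamps copied from the entries above:

```python
from datetime import datetime

# Probe timestamps taken verbatim from the nova_compute entries above.
stamps = ["2026-02-01 09:37:45.490",
          "2026-02-01 09:37:50.497",
          "2026-02-01 09:37:55.531"]

times = [datetime.strptime(s, "%Y-%m-%d %H:%M:%S.%f") for s in stamps]
for a, b in zip(times, times[1:]):
    print(f"{(b - a).total_seconds():.3f} s between probes")  # ~5.0 s each
```

The deltas come out at about 5.01 s and 5.03 s, matching the "idle 5003 ms" figures the IDL itself reports, so this is steady-state keepalive traffic rather than a connectivity problem.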
Feb 1 04:37:51 localhost podman[276171]: 2026-02-01 09:37:51.723941934 +0000 UTC m=+0.085272857 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, architecture=x86_64, vcs-type=git, name=ubi9/ubi-minimal, managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 1 04:37:51 localhost podman[276171]: 2026-02-01 09:37:51.736731674 +0000 UTC m=+0.098062567 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Feb 1 04:37:51 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:37:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2164 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D09DDB90000000001030307) Feb 1 04:37:53 localhost podman[236886]: time="2026-02-01T09:37:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:37:53 localhost podman[236886]: @ - - [01/Feb/2026:09:37:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:37:54 localhost podman[236886]: @ - - [01/Feb/2026:09:37:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17282 "" "Go-http-client/1.1" Feb 1 04:37:55 localhost nova_compute[274651]: 2026-02-01 09:37:55.528 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:55 localhost nova_compute[274651]: 2026-02-01 09:37:55.530 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:37:55 localhost nova_compute[274651]: 2026-02-01 09:37:55.531 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:37:55 localhost nova_compute[274651]: 2026-02-01 09:37:55.531 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:55 localhost nova_compute[274651]: 2026-02-01 09:37:55.543 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:37:55 localhost nova_compute[274651]: 2026-02-01 09:37:55.544 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:37:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
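[Editor's note] The `podman[236886]` access-log lines show a Go HTTP client (plausibly podman_exporter, whose config above mounts /run/podman/podman.sock) polling the libpod REST API for container lists and stats. The same endpoint can be queried by hand over the socket; the sketch below is an assumption-laden illustration, not logged tooling: it assumes the socket path from the podman_exporter volume mount above and the `/v4.9.3/...` API path from the access log. `requests` does not speak Unix sockets natively, hence the small `http.client` subclass:

```python
import http.client
import json
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """HTTPConnection that carries HTTP over a Unix-domain socket."""
    def __init__(self, path):
        super().__init__("localhost")  # host is only used for the Host: header
        self.unix_path = path

    def connect(self):
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        s.connect(self.unix_path)
        self.sock = s

# Socket path and API route as they appear in the log above (assumed reachable).
conn = UnixHTTPConnection("/run/podman/podman.sock")
conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
resp = conn.getresponse()
for ctr in json.loads(resp.read()):
    print(ctr["Names"], ctr["State"])
```

Run as root (the socket is root-owned here), this lists the same container set the 200-status GETs above were fetching.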
Feb 1 04:37:55 localhost podman[276192]: 2026-02-01 09:37:55.716096425 +0000 UTC m=+0.080167961 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127) Feb 1 04:37:55 localhost podman[276192]: 2026-02-01 09:37:55.732428954 +0000 UTC m=+0.096500530 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Feb 1 04:37:55 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:38:00 localhost nova_compute[274651]: 2026-02-01 09:38:00.545 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:00 localhost nova_compute[274651]: 2026-02-01 09:38:00.546 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:00 localhost nova_compute[274651]: 2026-02-01 09:38:00.546 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:00 localhost nova_compute[274651]: 2026-02-01 09:38:00.546 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:00 localhost nova_compute[274651]: 2026-02-01 09:38:00.547 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:00 localhost nova_compute[274651]: 2026-02-01 09:38:00.549 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:01 localhost openstack_network_exporter[239441]: ERROR 09:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:38:01 localhost openstack_network_exporter[239441]: Feb 1 04:38:01 localhost openstack_network_exporter[239441]: ERROR 09:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:38:01 localhost openstack_network_exporter[239441]: Feb 1 04:38:05 localhost nova_compute[274651]: 2026-02-01 09:38:05.550 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:05 localhost nova_compute[274651]: 2026-02-01 09:38:05.552 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:05 localhost nova_compute[274651]: 2026-02-01 09:38:05.552 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:05 localhost nova_compute[274651]: 2026-02-01 09:38:05.552 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:05 localhost nova_compute[274651]: 2026-02-01 09:38:05.581 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:05 localhost nova_compute[274651]: 2026-02-01 09:38:05.581 
274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:38:06 localhost podman[276211]: 2026-02-01 09:38:06.723455637 +0000 UTC m=+0.086319610 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:38:06 localhost podman[276211]: 2026-02-01 09:38:06.760769486 +0000 UTC m=+0.123633449 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:38:06 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:38:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49614 DF PROTO=TCP SPT=45052 DPT=9102 SEQ=105664569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A16920000000001030307) Feb 1 04:38:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49615 DF PROTO=TCP SPT=45052 DPT=9102 SEQ=105664569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A1AB80000000001030307) Feb 1 04:38:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2165 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A1DB80000000001030307) Feb 1 04:38:10 localhost nova_compute[274651]: 2026-02-01 09:38:10.583 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:10 localhost nova_compute[274651]: 2026-02-01 09:38:10.585 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:10 localhost nova_compute[274651]: 2026-02-01 09:38:10.586 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:10 localhost nova_compute[274651]: 2026-02-01 09:38:10.586 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:10 localhost nova_compute[274651]: 2026-02-01 09:38:10.608 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:10 localhost nova_compute[274651]: 2026-02-01 09:38:10.609 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49616 DF PROTO=TCP SPT=45052 DPT=9102 SEQ=105664569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A22B80000000001030307) Feb 1 04:38:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18963 DF PROTO=TCP SPT=41540 DPT=9102 SEQ=2811211714 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A25B90000000001030307) Feb 1 04:38:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:38:12 localhost systemd[1]: tmp-crun.U3ryTD.mount: Deactivated successfully. 
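[Editor's note] The kernel `DROPPING:` lines are netfilter LOG output (the prefix comes from the logging rule) and all follow one pattern in this excerpt: TCP SYNs from 192.168.122.10 to 192.168.122.106:9102 arriving on br-ex, retransmitted and dropped each time, which is why the same SPT/SEQ pairs recur with incrementing IP IDs. A small tally script, keyed to the field layout shown above and offered only as a triage aid:

```python
import re
import sys
from collections import Counter

# Field layout of the netfilter LOG lines above, e.g.
#   DROPPING: IN=br-ex ... SRC=192.168.122.10 DST=192.168.122.106 ... PROTO=TCP SPT=43610 DPT=9102 ...
# \b guards keep SRC=/PROTO= from matching inside MACSRC=/MACPROTO=.
DROP_RE = re.compile(
    r"DROPPING: .*?\bSRC=(\S+) DST=(\S+).*?\bPROTO=(\S+) SPT=(\d+) DPT=(\d+)"
)

drops = Counter()
for line in sys.stdin:
    for m in DROP_RE.finditer(line):
        src, dst, proto, spt, dpt = m.groups()
        drops[(src, dst, proto, dpt)] += 1

for (src, dst, proto, dpt), n in drops.most_common():
    print(f"{n:4d}  {proto} {src} -> {dst}:{dpt}")
```

On this excerpt every hit lands on 9102/tcp from a single source, i.e. one scraper repeatedly retrying a port that the firewall on this node is consistently refusing, rather than a broad scan.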
Feb 1 04:38:12 localhost podman[276232]: 2026-02-01 09:38:12.699784077 +0000 UTC m=+0.069096353 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:38:12 localhost podman[276232]: 2026-02-01 09:38:12.735788947 +0000 UTC m=+0.105101243 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent) Feb 1 04:38:12 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:38:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:38:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:38:14 localhost systemd[1]: tmp-crun.63DcQT.mount: Deactivated successfully. Feb 1 04:38:14 localhost podman[276250]: 2026-02-01 09:38:14.742015688 +0000 UTC m=+0.099778973 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:38:14 localhost podman[276250]: 2026-02-01 09:38:14.75342922 +0000 UTC m=+0.111192475 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:38:14 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:38:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49617 DF PROTO=TCP SPT=45052 DPT=9102 SEQ=105664569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A32790000000001030307) Feb 1 04:38:14 localhost podman[276251]: 2026-02-01 09:38:14.830646987 +0000 UTC m=+0.185466704 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:38:14 localhost podman[276251]: 2026-02-01 09:38:14.894233614 +0000 UTC m=+0.249053311 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 1 04:38:14 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:38:15 localhost nova_compute[274651]: 2026-02-01 09:38:15.609 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:15 localhost nova_compute[274651]: 2026-02-01 09:38:15.611 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:15 localhost nova_compute[274651]: 2026-02-01 09:38:15.612 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:15 localhost nova_compute[274651]: 2026-02-01 09:38:15.612 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:15 localhost nova_compute[274651]: 2026-02-01 09:38:15.636 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:15 localhost nova_compute[274651]: 2026-02-01 09:38:15.637 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:18 localhost sshd[276298]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:38:18 localhost systemd-logind[759]: New session 61 of user zuul. Feb 1 04:38:18 localhost systemd[1]: Started Session 61 of User zuul. 
Feb 1 04:38:19 localhost python3[276320]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 1 04:38:19 localhost subscription-manager[276321]: Unregistered machine with identity: 7a89b532-8c8f-4d02-bd1e-9a3ae674e86b Feb 1 04:38:20 localhost nova_compute[274651]: 2026-02-01 09:38:20.638 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:20 localhost nova_compute[274651]: 2026-02-01 09:38:20.641 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:20 localhost nova_compute[274651]: 2026-02-01 09:38:20.641 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:20 localhost nova_compute[274651]: 2026-02-01 09:38:20.641 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:20 localhost nova_compute[274651]: 2026-02-01 09:38:20.668 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:20 localhost nova_compute[274651]: 2026-02-01 09:38:20.668 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:38:22 localhost podman[276323]: 2026-02-01 09:38:22.709334175 +0000 UTC m=+0.068378582 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, version=9.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 1 04:38:22 localhost podman[276323]: 2026-02-01 09:38:22.75450724 +0000 UTC m=+0.113551687 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, architecture=x86_64, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git) Feb 1 04:38:22 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:38:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49618 DF PROTO=TCP SPT=45052 DPT=9102 SEQ=105664569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A53B80000000001030307) Feb 1 04:38:23 localhost podman[236886]: time="2026-02-01T09:38:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:38:23 localhost podman[236886]: @ - - [01/Feb/2026:09:38:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:38:24 localhost podman[236886]: @ - - [01/Feb/2026:09:38:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17292 "" "Go-http-client/1.1" Feb 1 04:38:25 localhost nova_compute[274651]: 2026-02-01 09:38:25.670 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:25 localhost nova_compute[274651]: 2026-02-01 09:38:25.672 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:25 localhost nova_compute[274651]: 2026-02-01 09:38:25.672 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:25 localhost nova_compute[274651]: 2026-02-01 09:38:25.672 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:25 
localhost nova_compute[274651]: 2026-02-01 09:38:25.698 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:25 localhost nova_compute[274651]: 2026-02-01 09:38:25.699 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:26 localhost sshd[276343]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:38:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:38:26 localhost podman[276345]: 2026-02-01 09:38:26.609255578 +0000 UTC m=+0.093148396 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:38:26 localhost podman[276345]: 2026-02-01 09:38:26.624468234 +0000 UTC m=+0.108361032 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:38:26 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:38:30 localhost nova_compute[274651]: 2026-02-01 09:38:30.700 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:30 localhost nova_compute[274651]: 2026-02-01 09:38:30.702 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:30 localhost nova_compute[274651]: 2026-02-01 09:38:30.702 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:30 localhost nova_compute[274651]: 2026-02-01 09:38:30.702 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:30 localhost nova_compute[274651]: 2026-02-01 09:38:30.733 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:30 localhost nova_compute[274651]: 2026-02-01 09:38:30.735 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:31 localhost openstack_network_exporter[239441]: ERROR 09:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:38:31 localhost openstack_network_exporter[239441]: Feb 1 04:38:31 localhost openstack_network_exporter[239441]: ERROR 09:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:38:31 localhost openstack_network_exporter[239441]: Feb 1 04:38:35 localhost nova_compute[274651]: 2026-02-01 09:38:35.736 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:35 localhost nova_compute[274651]: 2026-02-01 09:38:35.737 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:35 localhost nova_compute[274651]: 2026-02-01 09:38:35.738 
274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:35 localhost nova_compute[274651]: 2026-02-01 09:38:35.738 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:35 localhost nova_compute[274651]: 2026-02-01 09:38:35.755 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:35 localhost nova_compute[274651]: 2026-02-01 09:38:35.755 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:38:37 localhost podman[276365]: 2026-02-01 09:38:37.362040061 +0000 UTC m=+0.092421183 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:38:37 localhost podman[276365]: 2026-02-01 09:38:37.374416982 +0000 UTC m=+0.104798144 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:38:37 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
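[editor's note] The ovsdbapp DEBUG lines above trace a steady 5-second cycle: ~5000 ms poller timeout, "idle 5002 ms, sending inactivity probe", entering IDLE, then a [POLLIN] on the socket and back to ACTIVE. The sketch below mirrors that probe state machine so the transitions are easier to follow; it is a minimal illustration, not the real ovs/reconnect.py implementation, and the 5000 ms interval is assumed from the idle times logged here.

import time

PROBE_INTERVAL_MS = 5000  # assumed default; matches the ~5002 ms idle seen above

class ProbeConn:
    def __init__(self):
        self.state = "ACTIVE"
        self.last_activity = time.monotonic()

    def idle_ms(self):
        return (time.monotonic() - self.last_activity) * 1000.0

    def run(self, send_probe):
        # Called on each poller wakeup, like the run() lines in the log.
        if self.state == "ACTIVE" and self.idle_ms() >= PROBE_INTERVAL_MS:
            send_probe()                 # "sending inactivity probe"
            self.state = "IDLE"          # "entering IDLE"
        elif self.state == "IDLE" and self.idle_ms() >= 2 * PROBE_INTERVAL_MS:
            # No reply to the probe within another interval: give up.
            raise TimeoutError("inactivity probe unanswered")

    def received(self):
        # Any inbound data (the [POLLIN] wakeups above) counts as activity.
        self.last_activity = time.monotonic()
        self.state = "ACTIVE"            # "entering ACTIVE"

In the log the probe is always answered within ~30-40 ms, which is why the connection oscillates IDLE -> ACTIVE every five seconds and never disconnects.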
Feb 1 04:38:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44924 DF PROTO=TCP SPT=55452 DPT=9102 SEQ=3558353216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A8BC20000000001030307) Feb 1 04:38:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44925 DF PROTO=TCP SPT=55452 DPT=9102 SEQ=3558353216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A8FB80000000001030307) Feb 1 04:38:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49619 DF PROTO=TCP SPT=45052 DPT=9102 SEQ=105664569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A93B80000000001030307) Feb 1 04:38:39 localhost nova_compute[274651]: 2026-02-01 09:38:39.740 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:39 localhost nova_compute[274651]: 2026-02-01 09:38:39.741 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:39 localhost nova_compute[274651]: 2026-02-01 09:38:39.741 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:38:39 localhost nova_compute[274651]: 2026-02-01 09:38:39.742 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.019 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.020 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.021 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.021 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.542 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.558 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.558 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.559 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.560 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.560 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.560 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost 
nova_compute[274651]: 2026-02-01 09:38:40.561 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.561 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.562 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.562 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.591 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.591 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.592 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.592 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.593 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:38:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44926 DF PROTO=TCP SPT=55452 DPT=9102 SEQ=3558353216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A97B80000000001030307) Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.757 274655 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.759 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.760 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.760 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.802 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:40 localhost nova_compute[274651]: 2026-02-01 09:38:40.803 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.065 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.126 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.127 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.344 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.345 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12064MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.346 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.346 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.406 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.406 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.406 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.456 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:38:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:38:41.700 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:38:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:38:41.700 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:38:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:38:41.702 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:38:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2166 DF PROTO=TCP SPT=43610 DPT=9102 SEQ=1966692501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0A9BB90000000001030307) Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.911 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.918 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.937 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': 
{'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.940 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:38:41 localhost nova_compute[274651]: 2026-02-01 09:38:41.940 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:38:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:38:43 localhost podman[276518]: 2026-02-01 09:38:43.727810391 +0000 UTC m=+0.081454654 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:38:43 localhost podman[276518]: 2026-02-01 09:38:43.736363427 +0000 UTC m=+0.090007690 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:38:43 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:38:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44927 DF PROTO=TCP SPT=55452 DPT=9102 SEQ=3558353216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0AA7780000000001030307) Feb 1 04:38:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:38:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:38:45 localhost systemd[1]: tmp-crun.uZ5OaD.mount: Deactivated successfully. 
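[editor's note] The resource-tracker audit above shells out twice to "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" while computing the free_disk figure for the hypervisor view. A rough sketch of that extraction follows; the "stats" keys are the standard ceph df JSON schema (stats.total_bytes / stats.total_avail_bytes), but exactly which fields nova's RBD image backend reads is an assumption here, not a quote from the driver.

import json
import subprocess

def ceph_capacity_gib(conf="/etc/ceph/ceph.conf", client="openstack"):
    # Same command the resource tracker runs in the log above.
    out = subprocess.run(
        ["ceph", "df", "--format=json", "--id", client, "--conf", conf],
        check=True, capture_output=True, text=True).stdout
    stats = json.loads(out)["stats"]
    gib = 1024 ** 3
    return stats["total_bytes"] / gib, stats["total_avail_bytes"] / gib

# total, avail = ceph_capacity_gib()
# 'avail' would correspond to the free_disk=41.83... GB reported above.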
Feb 1 04:38:45 localhost podman[276536]: 2026-02-01 09:38:45.740165158 +0000 UTC m=+0.094997190 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Feb 1 04:38:45 localhost nova_compute[274651]: 2026-02-01 09:38:45.804 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:45 localhost podman[276535]: 2026-02-01 09:38:45.815053854 +0000 UTC m=+0.173036551 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:38:45 localhost podman[276535]: 2026-02-01 09:38:45.830491257 +0000 UTC m=+0.188474004 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:38:45 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:38:45 localhost podman[276536]: 2026-02-01 09:38:45.845470076 +0000 UTC m=+0.200302098 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:38:45 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:38:50 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
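[editor's note] The kernel "DROPPING:" entries scattered through this window are netfilter log output (a log-prefix rule in the host firewall) showing repeated SYNs from 192.168.122.10 to port 9102 being dropped on br-ex. The KEY=value fields are standard; the small parser below tallies them for triage. The log path in the commented usage is an assumption, and only keys visible in these lines are relied on.

import re
from collections import Counter

PAIR = re.compile(r"([A-Z]+)=(\S+)")

def parse_drop(line):
    # "... DROPPING: IN=br-ex OUT= SRC=192.168.122.10 ... DPT=9102 ..."
    payload = line.split("DROPPING:", 1)[1]
    # Empty-valued keys like "OUT=" are simply skipped by the regex.
    return dict(PAIR.findall(payload))

drops = Counter()
# for line in open("/var/log/messages"):      # assumed log location
#     if "DROPPING:" in line:
#         f = parse_drop(line)
#         drops[(f.get("SRC"), f.get("DPT"))] += 1
# In this capture every drop is ('192.168.122.10', '9102'): the retransmitted
# SYNs (same SEQ, climbing timestamps in the TCP options) of a blocked scrape.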
Feb 1 04:38:50 localhost nova_compute[274651]: 2026-02-01 09:38:50.809 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:50 localhost nova_compute[274651]: 2026-02-01 09:38:50.811 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:38:50 localhost nova_compute[274651]: 2026-02-01 09:38:50.811 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:38:50 localhost nova_compute[274651]: 2026-02-01 09:38:50.811 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:50 localhost nova_compute[274651]: 2026-02-01 09:38:50.840 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:50 localhost nova_compute[274651]: 2026-02-01 09:38:50.841 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:38:52 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:4a:fd:db MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44928 DF PROTO=TCP SPT=55452 DPT=9102 SEQ=3558353216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A1D0AC7B80000000001030307) Feb 1 04:38:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:38:53 localhost podman[276639]: 2026-02-01 09:38:53.077038446 +0000 UTC m=+0.081910488 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, distribution-scope=public, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855) Feb 1 04:38:53 localhost podman[276639]: 2026-02-01 09:38:53.089528881 +0000 UTC m=+0.094400863 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Feb 1 04:38:53 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:38:53 localhost sshd[276660]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:38:53 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 1 04:38:53 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 1 04:38:53 localhost systemd-logind[759]: New session 62 of user tripleo-admin. 
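[editor's note] Each "Started /usr/bin/podman healthcheck run <id>" / "health_status=healthy" / "exec_died" / "Deactivated successfully" quartet above is a systemd transient unit firing a container's configured healthcheck. The same check can be run by hand; this wrapper is a sketch, with the container name taken from the log above and the documented exit-code convention (0 = healthy).

import subprocess

def healthy(container):
    # "podman healthcheck run" exits 0 when the configured test passes.
    r = subprocess.run(["podman", "healthcheck", "run", container])
    return r.returncode == 0

# healthy("ceilometer_agent_compute") -> True corresponds to the
# health_status=healthy label podman records in the events above.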
Feb 1 04:38:53 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 1 04:38:53 localhost systemd[1]: Starting User Manager for UID 1003... Feb 1 04:38:53 localhost systemd[276664]: Queued start job for default target Main User Target. Feb 1 04:38:53 localhost systemd[276664]: Created slice User Application Slice. Feb 1 04:38:53 localhost systemd[276664]: Started Mark boot as successful after the user session has run 2 minutes. Feb 1 04:38:53 localhost systemd-journald[47041]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Feb 1 04:38:53 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 1 04:38:53 localhost systemd[276664]: Started Daily Cleanup of User's Temporary Directories. Feb 1 04:38:53 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:38:53 localhost systemd[276664]: Reached target Paths. Feb 1 04:38:53 localhost systemd[276664]: Reached target Timers. Feb 1 04:38:53 localhost systemd[276664]: Starting D-Bus User Message Bus Socket... Feb 1 04:38:53 localhost systemd[276664]: Starting Create User's Volatile Files and Directories... Feb 1 04:38:53 localhost systemd[276664]: Finished Create User's Volatile Files and Directories. Feb 1 04:38:53 localhost systemd[276664]: Listening on D-Bus User Message Bus Socket. Feb 1 04:38:53 localhost systemd[276664]: Reached target Sockets. Feb 1 04:38:53 localhost systemd[276664]: Reached target Basic System. Feb 1 04:38:53 localhost systemd[276664]: Reached target Main User Target. Feb 1 04:38:53 localhost systemd[276664]: Startup finished in 159ms. Feb 1 04:38:53 localhost systemd[1]: Started User Manager for UID 1003. Feb 1 04:38:53 localhost systemd[1]: Started Session 62 of User tripleo-admin. Feb 1 04:38:53 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:38:53 localhost podman[236886]: time="2026-02-01T09:38:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:38:53 localhost podman[236886]: @ - - [01/Feb/2026:09:38:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149814 "" "Go-http-client/1.1" Feb 1 04:38:54 localhost podman[236886]: @ - - [01/Feb/2026:09:38:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17297 "" "Go-http-client/1.1" Feb 1 04:38:54 localhost python3[276807]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 1 04:38:55 localhost sshd[276925]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:38:55 localhost python3[276953]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 1 04:38:55 localhost systemd[1]: Stopping Netfilter Tables... Feb 1 04:38:55 localhost systemd[1]: nftables.service: Deactivated successfully. Feb 1 04:38:55 localhost systemd[1]: Stopped Netfilter Tables. Feb 1 04:38:55 localhost systemd[1]: Starting Netfilter Tables... Feb 1 04:38:55 localhost systemd[1]: Finished Netfilter Tables. Feb 1 04:38:55 localhost nova_compute[274651]: 2026-02-01 09:38:55.841 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:38:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
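[editor's note] The ansible-ansible.builtin.blockinfile task above rewrites /etc/nftables/edpm-rules.nft: it fences the ceph firewall rules between managed-block markers and places them just before the "# Lock down INPUT chains" line (insertbefore), after which nftables is restarted. The sketch below approximates that marker logic; the marker strings follow the module's "# {mark} ANSIBLE MANAGED BLOCK" template with the custom marker_begin/marker_end from the log, and the append-on-no-match fallback is the module's documented behavior, not something shown in this log.

import re

BEGIN = "# BEGIN ceph firewall rules ANSIBLE MANAGED BLOCK"
END = "# END ceph firewall rules ANSIBLE MANAGED BLOCK"
ANCHOR = re.compile(r"^# Lock down INPUT chains")

def place_block(lines, block_lines):
    # Drop any previous managed block, then insert before the anchor line.
    out, skipping = [], False
    for ln in lines:
        if ln == BEGIN:
            skipping = True
            continue
        if ln == END:
            skipping = False
            continue
        if not skipping:
            out.append(ln)
    managed = [BEGIN, *block_lines, END]
    for i, ln in enumerate(out):
        if ANCHOR.match(ln):
            return out[:i] + managed + out[i:]
    return out + managed  # no anchor match: append, as blockinfile does

The block_lines themselves are the nft rules visible (with #012 newline escapes) in the task invocation above, e.g. 'add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"'.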
Feb 1 04:38:57 localhost systemd[1]: tmp-crun.8lS6za.mount: Deactivated successfully. Feb 1 04:38:57 localhost podman[276977]: 2026-02-01 09:38:57.73617669 +0000 UTC m=+0.092390733 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 1 04:38:57 localhost podman[276977]: 2026-02-01 09:38:57.750376205 +0000 UTC m=+0.106590288 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 1 04:38:57 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:39:00 localhost nova_compute[274651]: 2026-02-01 09:39:00.845 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:39:01 localhost openstack_network_exporter[239441]: ERROR 09:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:39:01 localhost openstack_network_exporter[239441]: Feb 1 04:39:01 localhost openstack_network_exporter[239441]: ERROR 09:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:39:01 localhost openstack_network_exporter[239441]: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.524 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.525 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.545 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a62a0b11-2371-4e41-a1f2-30ced95289c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:39:03.525894', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd8565aba-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.76464427, 'message_signature': '2eda32e2a47ea45983037ba33cc3bb73866d18f819910818cb496aa4a8e028f9'}]}, 'timestamp': '2026-02-01 09:39:03.546058', '_unique_id': '929854bbea0f402888d0da984b20dcf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.547 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.548 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.576 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.577 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfac5cdd-3a6d-449b-a0ac-6cfff9eaa165', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.548937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd85b34b8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '8d49205b6f72bca9bfb3092ae8eafd4097bb895ddecd2e82de084e5786e876b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.548937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd85b4cd2-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': 'd7354e924448b21a3480a02429d2b6121d3b89a5a099554d963bc88c673580cb'}]}, 'timestamp': '2026-02-01 09:39:03.578384', '_unique_id': 'c8d164b045634845a716a765efbb1f1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
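Every traceback in this capture bottoms out the same way: the amqp socket connect gets ECONNREFUSED, and kombu's _reraise_as_library_errors wraps it as kombu.exceptions.OperationalError before oslo.messaging logs it. A minimal sketch of that wrapping, assuming kombu is installed and nothing is listening on the target port; the broker URL is a placeholder, since the transport_url this agent was configured with is not visible in the excerpt:

    # Placeholder broker URL: the log does not record the real transport_url.
    import kombu

    conn = kombu.Connection("amqp://guest:guest@127.0.0.1:5672//", connect_timeout=2)
    try:
        # Same call path the traceback above shows:
        # ensure_connection -> _ensure_connection -> retry_over_time
        # -> establish_connection -> socket.connect
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        # The raw ConnectionRefusedError (errno 111) is re-raised as kombu's
        # library-level OperationalError, which is what oslo.messaging logs.
        print(type(exc).__name__, exc)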
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.581 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.582 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c284eda-cfbc-4757-aef8-48788afcc68c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.581421', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd85bda76-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': 'a229a17eed5eefc2101f368cfac8609f5953e40aa6004e6640bf3ccbff7881c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.581421', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd85bf060-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '9767bae86178784b07e9de8cf93075beb4a8d7feed6a4af9b0f751e1651595c9'}]}, 'timestamp': '2026-02-01 09:39:03.582554', '_unique_id': 'edd6c61d2a914e6c8429ce8f356028c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
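Errno 111 means the TCP handshake itself was rejected, i.e. nothing was accepting connections on the broker endpoint at that moment, which is why every pollster's send fails identically. A stdlib-only probe for the same condition; host and port here are assumptions (5672 is merely RabbitMQ's default), as the configured endpoint does not appear in the excerpt:

    import errno
    import socket

    def probe(host: str, port: int, timeout: float = 2.0) -> str:
        # A raw TCP connect distinguishes the cases that matter here:
        # "refused" (nothing listening) vs. timeout/unreachable (network path).
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(timeout)
        try:
            s.connect((host, port))
            return "open"
        except ConnectionRefusedError:
            return f"refused (errno {errno.ECONNREFUSED})"  # the [Errno 111] case
        except OSError as exc:
            return f"error: {exc}"
        finally:
            s.close()

    print(probe("127.0.0.1", 5672))  # placeholder endpoint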
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.588 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 5904 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46d17f56-aacc-4d04-bad2-0c3d0f493b7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 5904, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.584971', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd85ceb5a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '2013b880d9c788d11c0e417b3c35decd3fbffc26c2bbb7ed567cc5b44b24ef91'}]}, 'timestamp': '2026-02-01 09:39:03.589047', '_unique_id': '9a45a3f050e8414292fc2963f7fecd34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
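The dropped notifications all share one envelope: event_type telemetry.polling carrying a payload['samples'] list whose entries have counter_name, counter_type (gauge, cumulative, or delta), counter_unit, counter_volume, and resource_id. A short sketch that summarizes such an envelope; the literal below is an abbreviated copy of the network.incoming.bytes.delta payload above:

    # Abbreviated copy of one envelope from the log above.
    envelope = {
        "event_type": "telemetry.polling",
        "payload": {
            "samples": [
                {"counter_name": "network.incoming.bytes.delta",
                 # 'delta' is per-interval; 'cumulative' only grows;
                 # 'gauge' is a point-in-time reading (e.g. memory.usage in MB).
                 "counter_type": "delta",
                 "counter_unit": "B",
                 "counter_volume": 5904,
                 "resource_id": "instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46"},
            ]
        },
    }

    for s in envelope["payload"]["samples"]:
        print(f"{s['counter_name']} ({s['counter_type']}, {s['counter_unit']}): "
              f"{s['counter_volume']} on {s['resource_id']}")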
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:39:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost 
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ad17214-af0d-48dd-95bc-e98bf11d239c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.591452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd85e9388-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.810922289, 'message_signature': '9c99afbd3cc2d546bc7b57a9ed56fe223455a8497e63d4aa21737385e2d51e18'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.591452', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd85eaaa8-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.810922289, 'message_signature': '960d083883e1bfea9f4681535f547c78e18ad266d1bb6dd98807c9378512eee3'}]}, 'timestamp': '2026-02-01 09:39:03.600434', '_unique_id': '8fa0026e46b94993b95ad7412419a556'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
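A quick arithmetic check on the allocation samples: both vda and vdb report 1073741824 B, which is exactly 1 GiB and consistent with the m1.small flavor shown in the payloads (disk: 1, ephemeral: 1, Nova flavor sizes being in GiB):

    # 1073741824 B == 1 GiB, matching the flavor's disk=1 / ephemeral=1 sizing.
    GIB = 1024 ** 3
    assert 1073741824 == 1 * GIB
    print(1073741824 / GIB, "GiB")  # -> 1.0 GiB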
line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.601 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.602 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42b8a066-ef6f-468e-a956-23ef018bb44c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.603122', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd85f2a46-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '0b1f358a37a3e2bed8431be9c265d56703be09c7d12e387bf3402b940e77be47'}]}, 'timestamp': '2026-02-01 09:39:03.603749', '_unique_id': 'cdc0ddf4db8244659a701c971652efaa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.604 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.606 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.606 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.606 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95814735-8433-44ab-a647-582eeba06c45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.606256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd85fa444-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': 'db6fde57eeb7924ee3092eda310087ddacc7c3ac9241afc1e1d5b512fc0727f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.606256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd85fb970-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': 'ad41ddd2561a8edc611864716134e45119b286a4ee2b28c16f80151e002ddcc4'}]}, 'timestamp': '2026-02-01 09:39:03.607358', '_unique_id': 'afadf4c65b224d9492a644b3b48597ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2156fab-9ae6-47cf-a08e-dec10eccb3d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.610038', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd860349a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.810922289, 'message_signature': '3b51b15f059339d6033cd84152840e4694747f5566b9f562086733b6b96c835a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.610038', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd8604552-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.810922289, 'message_signature': '21338df00d8e2833c46e18889cda6002ea97068aa05890bc1473990045d5446e'}]}, 'timestamp': '2026-02-01 09:39:03.610920', '_unique_id': '9b8cb2195c6946c08e34e70f5fcd7f8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.613 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 10500000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c664d2e-14f3-4aab-9332-bde80bfd2fd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10500000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:39:03.613166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd860ae20-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.76464427, 'message_signature': '288f68a1d90ecb74e9e0254750a5e7e4accd2340f337ec7c55e0d78fcec05036'}]}, 'timestamp': '2026-02-01 09:39:03.613618', '_unique_id': '1ab2f599fcd04b588b9e81268126a6be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.616 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Payload={'message_id': '848f1a40-1c0d-4106-8474-a83349e91239', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.616077', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd8612120-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': 'e6f3968e351ccd98621b95f302b356ecf1a8e0f8246f2b3d54388fed1c38d6dd'}]}, 'timestamp': '2026-02-01 09:39:03.616576', '_unique_id': '56e95b4a53ee4f51a9a756b00d148237'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.617 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '1a905282-bf3a-4071-8d72-7c1468c37564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.618743', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd8618af2-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '58ee3ab69b6dbcfbadaec638353299bcbc5744279e47aea88f1708cea4c72844'}]}, 'timestamp': '2026-02-01 09:39:03.619314', '_unique_id': '2fa2dcd56ef5434d88f033d4be001157'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost 
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.621 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '5910954d-280b-48db-b3c6-5821c4ae5414', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.621596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd861f762-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '27ca69e97e5ec4ff1a119ef3030346f8aaed5380d54b2c1368410c211aa5dea6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.621596', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd86209a0-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': 'd12ab16de239160ee6a5821763b2b5d02eec3124698533ea1ef07e082662df60'}]}, 'timestamp': '2026-02-01 09:39:03.622498', '_unique_id': '3cfc0e3220dc4a2d9ef745db5e431af9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.624 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '6c049f3b-788f-4581-90bf-f933ea67ab81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.624799', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd86275d4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '5582e0c8535e42ff766b54c0a08a154333b803008a796725db4fede062114809'}]}, 'timestamp': '2026-02-01 09:39:03.625300', '_unique_id': '58cf4493453f4ad9bcec3a13dbc6bb05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:39:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost 
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.627 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.627 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'b077287e-320f-47c1-a1f1-6ca0b273c522', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.628100', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd862f978-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '67fbd056dcd84dbfde95d276ee8333e303ceb51ed27dec59af618cf318a8199a'}]}, 'timestamp': '2026-02-01 09:39:03.628680', '_unique_id': '2e04d71c446b4211a231136f7d0a8fcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
_ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.629 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.630 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
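Every traceback in this stretch bottoms out in the same frame: amqp's transport calling self.sock.connect(sa) and getting ECONNREFUSED, meaning nothing is accepting connections on the broker endpoint the agent was configured with. A minimal sketch (not part of the captured log) for checking that endpoint from the compute node; the host and port below are placeholders, the real values come from transport_url in the agent's ceilometer.conf, and 5672 is only RabbitMQ's default:

# Minimal sketch, not from the log: TCP-level probe of the AMQP endpoint.
# BROKER_HOST/BROKER_PORT are hypothetical; take them from transport_url.
import socket

BROKER_HOST = "rabbit.example.com"  # placeholder broker hostname
BROKER_PORT = 5672                  # RabbitMQ default port; confirm locally

try:
    with socket.create_connection((BROKER_HOST, BROKER_PORT), timeout=5):
        print("broker port is accepting TCP connections")
except ConnectionRefusedError:
    # This is the [Errno 111] in the traceback above: the host answered,
    # but no process is listening on the port (broker down or not bound).
    print("connection refused, matching the log")
except OSError as exc:
    print(f"different failure mode (DNS, routing, firewall): {exc}")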
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a822cbe0-a7e2-49b7-9d3b-487c20841cad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.630224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd8634770-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '9e8702e2235ca64c35875a831e5acef16cb62379ab9d4ae58899c808cc16639f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.630224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd863536e-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '508bdc5485c1332e5fcf2f3f63e9f0ee4ba06178b84271df3294cd6ab1a8ab43'}]}, 'timestamp': '2026-02-01 09:39:03.630873', '_unique_id': '7575d7cc6d6c4612bb1d24def92c0bd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20ccb45b-24c5-45db-9f7b-f80b43e94878', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.632376', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd8639b76-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.810922289, 'message_signature': '713863dce85708db19c72fcf076668c5587c676812de6061694f207fef112e9b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.632376', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd863a7c4-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.810922289, 'message_signature': '3014b0ff0ede5b8e414feccf5c696b51a6c2171ec32698d548d8ead6824850a0'}]}, 'timestamp': '2026-02-01 09:39:03.633068', '_unique_id': 'db3e2dd5376b4233a4b2fe7471d36726'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.634 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
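At the library layer, the stacks show oslo.messaging's rabbit driver calling kombu's Connection.ensure_connection(), which re-raises the socket-level ECONNREFUSED as kombu.exceptions.OperationalError, exactly the terminal error on each failed publish here. A minimal sketch (not part of the captured log) that exercises the same call; the URL is a placeholder for the agent's real transport_url:

# Minimal sketch, not from the log: reproduce the failing connect via kombu,
# the same ensure_connection() path seen in impl_rabbit.py in the stacks.
# The URL is hypothetical; substitute the agent's transport_url.
from kombu import Connection
from kombu.exceptions import OperationalError

url = "amqp://guest:guest@rabbit.example.com:5672//"  # placeholder

conn = Connection(url, connect_timeout=5)
try:
    conn.ensure_connection(max_retries=1)
    print("broker reachable via kombu")
except OperationalError as exc:
    # Matches the log: [Errno 111] surfaces as OperationalError
    print(f"kombu.exceptions.OperationalError: {exc}")
finally:
    conn.release()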
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.635 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d333f35-2668-4f76-8516-b7a24d1eb0bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.634619', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd863f35a-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': 'b3bb4b6fc9f240595b703031993b3aabc3ce3b1d880e918ef457087c06945d74'}]}, 'timestamp': '2026-02-01 09:39:03.635065', '_unique_id': '05718562000e457eb60a018ac7d59149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.636 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 8828 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da4fae21-e445-4da4-b2bb-aa35b19c7e47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 8828, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.636532', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd8643dec-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '79a1c51445e233408a026fb30f6098ee9560a24c82d6c19d01194113bd9b9ba3'}]}, 'timestamp': '2026-02-01 09:39:03.636898', '_unique_id': 'a30dad5336a54001b034778d105c958f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
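The pollsters keep running while every publish fails, so the journal accumulates one "Could not send notification" record per sample batch. A minimal sketch (not part of the captured log) for tallying those failures per counter from a saved copy of this journal; the file name is a placeholder:

# Minimal sketch, not from the log: count failed publishes per counter_name
# in a saved copy of this journal. "messages.log" is a hypothetical path.
import re
from collections import Counter

failed = Counter()
with open("messages.log", encoding="utf-8") as fh:
    for line in fh:
        if "Could not send notification" not in line:
            continue
        m = re.search(r"'counter_name': '([^']+)'", line)
        failed[m.group(1) if m else "<no payload on line>"] += 1

for counter, n in failed.most_common():
    print(f"{n:4d}  {counter}")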
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa673c33-a6c2-40fd-94a8-ca38deaf0d8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:39:03.638329', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd86483f6-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '21752b80938a31a919e4b38b7213ed673236f60a62a41ba807052450b5ec34a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:39:03.638329', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd8648fcc-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.768443695, 'message_signature': '501b0d47edb2631e2137a9c151a8527ac78e458d83ade9e4bcedddc0cf1a04fc'}]}, 'timestamp': '2026-02-01 09:39:03.638973', '_unique_id': '4ae9d0354bd244879bd6b9258f612e2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.639 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.640 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c42d705d-843e-4cc7-9a20-e95879023be7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.640490', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd864d89c-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '0cbc1834e853560a0e9d1cd92acfcf7815d601ba0694eba26e4e11422fc402e9'}]}, 'timestamp': '2026-02-01 09:39:03.640861', '_unique_id': 'af6193b4065946918f41ba67dda3e73f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:39:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.642 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'acebd5ac-38ee-49da-98ae-8e62aabf0b12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:39:03.642422', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'd8652432-ff51-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10837.804506967, 'message_signature': '0d60efe7623b0b42b8c53dd825e8101109dde2eeaeea00b37764ced1ee495ff1'}]}, 'timestamp': '2026-02-01 09:39:03.642786', '_unique_id': 'ed9c142085994dde9afabc8371fb61a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:39:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:39:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:39:03.643 12 ERROR oslo_messaging.notify.messaging Feb 1 04:39:05 localhost nova_compute[274651]: 2026-02-01 09:39:05.848 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:39:07 localhost systemd[1]: tmp-crun.wlnO2g.mount: Deactivated successfully. Feb 1 04:39:07 localhost podman[277050]: 2026-02-01 09:39:07.717320179 +0000 UTC m=+0.073378392 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:39:07 localhost podman[277050]: 2026-02-01 09:39:07.727109992 +0000 UTC m=+0.083168315 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:39:07 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:39:10 localhost nova_compute[274651]: 2026-02-01 09:39:10.851 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:39:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:39:14 localhost podman[277127]: 2026-02-01 09:39:14.733113297 +0000 UTC m=+0.092153395 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 1 04:39:14 localhost podman[277127]: 2026-02-01 09:39:14.768459868 +0000 UTC m=+0.127499996 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible) Feb 1 04:39:14 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:39:15 localhost nova_compute[274651]: 2026-02-01 09:39:15.854 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:39:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:39:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:39:16 localhost systemd[1]: tmp-crun.rzxVhu.mount: Deactivated successfully. 
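Every ceilometer traceback above bottoms out in the same place: the AMQP socket connect is refused, and kombu re-raises it as OperationalError. A minimal sketch to reproduce the check outside the agent follows; the broker URL is an assumption, since this log never shows the transport_url that the agent's oslo.messaging config points at.

```python
# Hedged sketch: probe the AMQP broker the way the oslo.messaging rabbit
# driver does. BROKER_URL is hypothetical -- substitute the transport_url
# from the agent's configuration.
from kombu import Connection

BROKER_URL = "amqp://guest:guest@controller:5672//"  # assumed endpoint

try:
    with Connection(BROKER_URL, connect_timeout=5) as conn:
        # ensure_connection() is the same kombu call seen in the frames
        # above; max_retries=1 fails fast instead of retrying.
        conn.ensure_connection(max_retries=1)
        print("AMQP broker reachable")
except Exception as exc:
    # kombu wraps the socket-level ConnectionRefusedError in
    # kombu.exceptions.OperationalError, exactly as logged.
    print(f"AMQP broker unreachable: {exc!r}")
```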
Feb 1 04:39:16 localhost podman[277145]: 2026-02-01 09:39:16.747607158 +0000 UTC m=+0.097574587 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:39:16 localhost podman[277146]: 2026-02-01 09:39:16.78932524 +0000 UTC m=+0.135542696 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:39:16 localhost podman[277145]: 2026-02-01 09:39:16.812513775 +0000 UTC m=+0.162481204 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:39:16 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:39:16 localhost podman[277146]: 2026-02-01 09:39:16.868446803 +0000 UTC m=+0.214664239 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:39:16 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
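Each "Started /usr/bin/podman healthcheck run <id>" record above is a transient systemd unit invoking the container's configured healthcheck; the exec_died record and the health_status= field reflect its exit. The sketch below replays the same command by hand, using the container_name= values from these records.

```python
# Sketch: invoke "podman healthcheck run" (the exact command the
# transient units above run) and report success from the exit status.
import subprocess

def run_healthcheck(name: str) -> bool:
    """Exit code 0 means podman recorded the container as healthy."""
    result = subprocess.run(
        ["podman", "healthcheck", "run", name],
        capture_output=True, text=True,
    )
    return result.returncode == 0

for name in ("podman_exporter", "ovn_metadata_agent",
             "node_exporter", "ovn_controller"):
    print(name, "healthy" if run_healthcheck(name) else "unhealthy")
```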
Feb 1 04:39:17 localhost podman[277269]:
Feb 1 04:39:17 localhost podman[277269]: 2026-02-01 09:39:17.619364715 +0000 UTC m=+0.086182245 container create 281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_noether, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4)
Feb 1 04:39:17 localhost systemd[1]: Started libpod-conmon-281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264.scope.
Feb 1 04:39:17 localhost systemd[1]: Started libcrun container.
Feb 1 04:39:17 localhost podman[277269]: 2026-02-01 09:39:17.583675326 +0000 UTC m=+0.050492916 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:39:17 localhost podman[277269]: 2026-02-01 09:39:17.693698665 +0000 UTC m=+0.160516185 container init 281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_noether, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.41.4, architecture=x86_64)
Feb 1 04:39:17 localhost podman[277269]: 2026-02-01 09:39:17.703085836 +0000 UTC m=+0.169903356 container start 281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_noether, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, distribution-scope=public, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, ceph=True, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 04:39:17 localhost podman[277269]: 2026-02-01 09:39:17.703578681 +0000 UTC m=+0.170396241 container attach 281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_noether, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, distribution-scope=public)
Feb 1 04:39:17 localhost upbeat_noether[277285]: 167 167
Feb 1 04:39:17 localhost systemd[1]: libpod-281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264.scope: Deactivated successfully.
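The anonymous rhceph container above lives for only a few milliseconds and prints "167 167" before it is removed just below, which is consistent with cephadm probing the ceph user's uid/gid inside the image before deploying the MDS that starts further down (167:167 is the ceph user on RHEL-family images). The exact probe command is an assumption; a stat on /var/lib/ceph inside the image should yield the same pair.

```python
# Hedged sketch of an equivalent probe; the real command cephadm runs
# is not shown in this log, only its "167 167" output.
import subprocess

IMAGE = "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"

out = subprocess.run(
    ["podman", "run", "--rm", IMAGE, "stat", "-c", "%u %g", "/var/lib/ceph"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # expected: "167 167"
```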
Feb 1 04:39:17 localhost podman[277269]: 2026-02-01 09:39:17.70922361 +0000 UTC m=+0.176041160 container died 281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_noether, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, version=7, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.) Feb 1 04:39:17 localhost systemd[1]: var-lib-containers-storage-overlay-cef98227127bf3e967cb1da67dd3b26ac6a0b7ae31b701a047e3e5b4f8f685c7-merged.mount: Deactivated successfully. Feb 1 04:39:17 localhost podman[277290]: 2026-02-01 09:39:17.823637372 +0000 UTC m=+0.100497025 container remove 281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_noether, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, io.buildah.version=1.41.4) Feb 1 04:39:17 localhost systemd[1]: libpod-conmon-281d7e81e98066b0dcaa4db9bd23618b04de64905117cc198917a352de73c264.scope: Deactivated successfully. Feb 1 04:39:17 localhost systemd[1]: Reloading. Feb 1 04:39:18 localhost systemd-sysv-generator[277337]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 1 04:39:18 localhost systemd-rc-local-generator[277331]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: Reloading.
Feb 1 04:39:18 localhost systemd-sysv-generator[277376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:39:18 localhost systemd-rc-local-generator[277370]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:39:18 localhost systemd[1]: Starting Ceph mds.mds.np0005604212.tkdkxt for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 1 04:39:19 localhost podman[277436]:
Feb 1 04:39:19 localhost podman[277436]: 2026-02-01 09:39:19.165712396 +0000 UTC m=+0.081951489 container create 7129047be6faf387f266be4dc405a28cd413c9f5a200afa359957efd17c3d5c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604212-tkdkxt, ceph=True, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, release=1764794109, maintainer=Guillaume Abrioux , io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Feb 1 04:39:19 localhost systemd[1]: tmp-crun.lEYgh1.mount: Deactivated successfully.
Feb 1 04:39:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f6cc0f5a29445c664fd33b7c1c43e72868df27d06f1ab0f002db0c5727b5f9a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 04:39:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f6cc0f5a29445c664fd33b7c1c43e72868df27d06f1ab0f002db0c5727b5f9a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 04:39:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f6cc0f5a29445c664fd33b7c1c43e72868df27d06f1ab0f002db0c5727b5f9a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 04:39:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f6cc0f5a29445c664fd33b7c1c43e72868df27d06f1ab0f002db0c5727b5f9a/merged/var/lib/ceph/mds/ceph-mds.np0005604212.tkdkxt supports timestamps until 2038 (0x7fffffff)
Feb 1 04:39:19 localhost podman[277436]: 2026-02-01 09:39:19.132301663 +0000 UTC m=+0.048540806 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:39:19 localhost podman[277436]: 2026-02-01 09:39:19.236365836 +0000 UTC m=+0.152604919 container init 7129047be6faf387f266be4dc405a28cd413c9f5a200afa359957efd17c3d5c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604212-tkdkxt, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, maintainer=Guillaume Abrioux )
Feb 1 04:39:19 localhost systemd[1]: tmp-crun.abq4Ur.mount: Deactivated successfully.
Feb 1 04:39:19 localhost podman[277436]: 2026-02-01 09:39:19.250086916 +0000 UTC m=+0.166326009 container start 7129047be6faf387f266be4dc405a28cd413c9f5a200afa359957efd17c3d5c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604212-tkdkxt, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph)
Feb 1 04:39:19 localhost bash[277436]: 7129047be6faf387f266be4dc405a28cd413c9f5a200afa359957efd17c3d5c2
Feb 1 04:39:19 localhost systemd[1]: Started Ceph mds.mds.np0005604212.tkdkxt for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 1 04:39:19 localhost ceph-mds[277455]: set uid:gid to 167:167 (ceph:ceph)
Feb 1 04:39:19 localhost ceph-mds[277455]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Feb 1 04:39:19 localhost ceph-mds[277455]: main not setting numa affinity
Feb 1 04:39:19 localhost ceph-mds[277455]: pidfile_write: ignore empty --pid-file
Feb 1 04:39:19 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604212-tkdkxt[277451]: starting mds.mds.np0005604212.tkdkxt at
Feb 1 04:39:19 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Updating MDS map to version 8 from mon.1
Feb 1 04:39:19 localhost systemd[1]: session-61.scope: Deactivated successfully.
Feb 1 04:39:19 localhost systemd-logind[759]: Session 61 logged out. Waiting for processes to exit.
Feb 1 04:39:19 localhost systemd-logind[759]: Removed session 61.
Feb 1 04:39:19 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Updating MDS map to version 9 from mon.1
Feb 1 04:39:19 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Monitors have assigned me to become a standby.
Feb 1 04:39:20 localhost nova_compute[274651]: 2026-02-01 09:39:20.858 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:39:21 localhost podman[277601]: 2026-02-01 09:39:21.019538949 +0000 UTC m=+0.109675350 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, vcs-type=git, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64)
Feb 1 04:39:21 localhost podman[277601]: 2026-02-01 09:39:21.120557959 +0000 UTC m=+0.210694800 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=1764794109, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:39:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:39:23 localhost podman[277723]: 2026-02-01 09:39:23.738270502 +0000 UTC m=+0.092414373 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1769056855, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:39:23 localhost podman[277723]: 2026-02-01 09:39:23.778696015 +0000 UTC m=+0.132839885 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Feb 1 04:39:23 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:39:23 localhost podman[236886]: time="2026-02-01T09:39:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:39:23 localhost podman[236886]: @ - - [01/Feb/2026:09:39:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152020 "" "Go-http-client/1.1"
Feb 1 04:39:24 localhost podman[236886]: @ - - [01/Feb/2026:09:39:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17779 "" "Go-http-client/1.1"
Feb 1 04:39:25 localhost nova_compute[274651]: 2026-02-01 09:39:25.861 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:25 localhost nova_compute[274651]: 2026-02-01 09:39:25.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:25 localhost nova_compute[274651]: 2026-02-01 09:39:25.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:39:25 localhost nova_compute[274651]: 2026-02-01 09:39:25.865 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:25 localhost nova_compute[274651]: 2026-02-01 09:39:25.882 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:39:25 localhost nova_compute[274651]: 2026-02-01 09:39:25.882 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:39:28 localhost podman[277743]: 2026-02-01 09:39:28.725786554 +0000 UTC m=+0.087550617 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:39:28 localhost podman[277743]: 2026-02-01 09:39:28.742544117 +0000 UTC m=+0.104308240 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:39:28 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:39:30 localhost nova_compute[274651]: 2026-02-01 09:39:30.883 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:30 localhost nova_compute[274651]: 2026-02-01 09:39:30.886 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:30 localhost nova_compute[274651]: 2026-02-01 09:39:30.886 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:39:30 localhost nova_compute[274651]: 2026-02-01 09:39:30.887 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:30 localhost nova_compute[274651]: 2026-02-01 09:39:30.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:39:30 localhost nova_compute[274651]: 2026-02-01 09:39:30.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:31 localhost openstack_network_exporter[239441]: ERROR 09:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:39:31 localhost openstack_network_exporter[239441]:
Feb 1 04:39:31 localhost openstack_network_exporter[239441]: ERROR 09:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:39:31 localhost openstack_network_exporter[239441]:
Feb 1 04:39:35 localhost nova_compute[274651]: 2026-02-01 09:39:35.923 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:35 localhost nova_compute[274651]: 2026-02-01 09:39:35.926 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:35 localhost nova_compute[274651]: 2026-02-01 09:39:35.926 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:39:35 localhost nova_compute[274651]: 2026-02-01 09:39:35.927 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:35 localhost nova_compute[274651]: 2026-02-01 09:39:35.951 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:39:35 localhost nova_compute[274651]: 2026-02-01 09:39:35.952 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:39:38 localhost podman[277761]: 2026-02-01 09:39:38.720787498 +0000 UTC m=+0.081089504 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:39:38 localhost podman[277761]: 2026-02-01 09:39:38.728560481 +0000 UTC m=+0.088862487 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:39:38 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.464 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.465 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.488 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.489 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.489 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.789 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.790 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.790 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:39:39 localhost nova_compute[274651]: 2026-02-01 09:39:39.790 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.279 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.306 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.307 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.307 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.308 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.308 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.309 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.309 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.310 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.310 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.311 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.332 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.332 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.333 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.333 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.333 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.793 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.849 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.849 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.953 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.955 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.955 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.955 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.997 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:39:40 localhost nova_compute[274651]: 2026-02-01 09:39:40.997 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.064 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.065 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12042MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.065 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.066 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.146 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.146 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.147 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.195 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.641 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.649 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.671 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.674 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 1 04:39:41 localhost nova_compute[274651]: 2026-02-01 09:39:41.674 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:39:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:39:41.701 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:39:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:39:41.702 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:39:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:39:41.703 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:39:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:39:45 localhost podman[277828]: 2026-02-01 09:39:45.704492913 +0000 UTC m=+0.066460685 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Feb 1 04:39:45 localhost podman[277828]: 2026-02-01 09:39:45.707857724 +0000 UTC m=+0.069825526 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 1 04:39:45 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:39:45 localhost nova_compute[274651]: 2026-02-01 09:39:45.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:46 localhost nova_compute[274651]: 2026-02-01 09:39:46.000 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:39:46 localhost nova_compute[274651]: 2026-02-01 09:39:46.000 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:39:46 localhost nova_compute[274651]: 2026-02-01 09:39:46.000 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:46 localhost nova_compute[274651]: 2026-02-01 09:39:46.047 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:39:46 localhost nova_compute[274651]: 2026-02-01 09:39:46.048 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:39:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:39:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:39:47 localhost podman[277916]: 2026-02-01 09:39:47.743209032 +0000 UTC m=+0.085800985 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:39:47 localhost podman[277916]: 2026-02-01 09:39:47.779898932 +0000 UTC m=+0.122490865 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:39:47 localhost systemd[1]: tmp-crun.57G32w.mount: Deactivated successfully. Feb 1 04:39:47 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:39:47 localhost podman[277915]: 2026-02-01 09:39:47.80783956 +0000 UTC m=+0.152426553 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:39:47 localhost podman[277915]: 2026-02-01 09:39:47.846441127 +0000 UTC m=+0.191028120 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:39:47 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
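Each healthcheck round above follows the same podman lifecycle: systemd starts a transient <container-id>.service that runs "podman healthcheck run", podman emits a "container health_status" event carrying the full config_data labels, then an "exec_died" event for the probe process, and the unit deactivates. A hedged sketch for pulling the container name and verdict out of one of these event lines; the field names (name=, health_status=) come from the log itself, the regex is illustrative:

import re

LINE = ("Feb 1 04:39:47 localhost podman[277915]: 2026-02-01 09:39:47 container "
        "health_status 385648ad0ece (image=quay.io/prometheus/node-exporter@sha256:39c6, "
        "name=node_exporter, health_status=healthy, config_id=node_exporter)")

def parse_health(line):
    # \bname= avoids matching container_name= ('_' and 'n' are both word
    # characters, so there is no word boundary between them).
    name = re.search(r"\bname=([^,)]+)", line)
    status = re.search(r"\bhealth_status=([^,)]+)", line)
    return (name and name.group(1), status and status.group(1))

print(parse_health(LINE))  # ('node_exporter', 'healthy')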
Feb 1 04:39:51 localhost nova_compute[274651]: 2026-02-01 09:39:51.049 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:39:51 localhost nova_compute[274651]: 2026-02-01 09:39:51.051 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:39:51 localhost nova_compute[274651]: 2026-02-01 09:39:51.051 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:39:51 localhost nova_compute[274651]: 2026-02-01 09:39:51.051 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:39:51 localhost nova_compute[274651]: 2026-02-01 09:39:51.087 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:39:51 localhost nova_compute[274651]: 2026-02-01 09:39:51.088 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:39:52 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Updating MDS map to version 11 from mon.1 Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.11 handle_mds_map i am now mds.0.11 Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.11 handle_mds_map state change up:standby --> up:replay Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.11 replay_start Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.11 waiting for osdmap 79 (which blocklists prior instance) Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.cache creating system inode with ino:0x100 Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.cache creating system inode with ino:0x1 Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.11 Finished replaying journal Feb 1 04:39:52 localhost ceph-mds[277455]: mds.0.11 making mds journal writeable Feb 1 04:39:53 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Updating MDS map to version 12 from mon.1 Feb 1 04:39:53 localhost ceph-mds[277455]: mds.0.11 handle_mds_map i am now mds.0.11 Feb 1 04:39:53 localhost ceph-mds[277455]: mds.0.11 handle_mds_map state change up:replay --> up:reconnect Feb 1 04:39:53 localhost ceph-mds[277455]: mds.0.11 reconnect_start Feb 1 04:39:53 localhost ceph-mds[277455]: mds.0.11 reopen_log Feb 1 04:39:53 localhost ceph-mds[277455]: mds.0.11 reconnect_done Feb 1 04:39:53 localhost podman[236886]: time="2026-02-01T09:39:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:39:53 localhost podman[236886]: @ - - [01/Feb/2026:09:39:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152020 "" "Go-http-client/1.1" Feb 1 04:39:54 localhost podman[236886]: @ - - [01/Feb/2026:09:39:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17776 "" "Go-http-client/1.1" Feb 1 04:39:54 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Updating MDS map to version 13 from mon.1 Feb 1 04:39:54 localhost ceph-mds[277455]: mds.0.11 handle_mds_map i am now mds.0.11 Feb 1 04:39:54 localhost ceph-mds[277455]: mds.0.11 handle_mds_map state change up:reconnect --> up:rejoin Feb 1 04:39:54 localhost 
ceph-mds[277455]: mds.0.11 rejoin_start Feb 1 04:39:54 localhost ceph-mds[277455]: mds.0.11 rejoin_joint_start Feb 1 04:39:54 localhost ceph-mds[277455]: mds.0.11 rejoin_done Feb 1 04:39:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:39:54 localhost systemd[1]: tmp-crun.TBHmfn.mount: Deactivated successfully. Feb 1 04:39:54 localhost podman[277975]: 2026-02-01 09:39:54.733181847 +0000 UTC m=+0.091905688 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, version=9.7, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1769056855, io.openshift.tags=minimal rhel9) Feb 1 04:39:54 localhost podman[277975]: 2026-02-01 09:39:54.749433175 +0000 UTC m=+0.108156996 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter, version=9.7, config_id=openstack_network_exporter, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Feb 1 04:39:54 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:39:55 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt Updating MDS map to version 14 from mon.1 Feb 1 04:39:55 localhost ceph-mds[277455]: mds.0.11 handle_mds_map i am now mds.0.11 Feb 1 04:39:55 localhost ceph-mds[277455]: mds.0.11 handle_mds_map state change up:rejoin --> up:active Feb 1 04:39:55 localhost ceph-mds[277455]: mds.0.11 recovery_done -- successful recovery! Feb 1 04:39:55 localhost ceph-mds[277455]: mds.0.11 active_start Feb 1 04:39:55 localhost ceph-mds[277455]: mds.0.11 cluster recovered. Feb 1 04:39:55 localhost systemd[1]: session-62.scope: Deactivated successfully. Feb 1 04:39:55 localhost systemd[1]: session-62.scope: Consumed 1.255s CPU time. Feb 1 04:39:55 localhost systemd-logind[759]: Session 62 logged out. Waiting for processes to exit. Feb 1 04:39:55 localhost systemd-logind[759]: Removed session 62. Feb 1 04:39:56 localhost nova_compute[274651]: 2026-02-01 09:39:56.089 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:39:56 localhost nova_compute[274651]: 2026-02-01 09:39:56.091 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:39:56 localhost nova_compute[274651]: 2026-02-01 09:39:56.091 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:39:56 localhost nova_compute[274651]: 2026-02-01 09:39:56.091 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:39:56 localhost nova_compute[274651]: 2026-02-01 09:39:56.125 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:39:56 localhost nova_compute[274651]: 2026-02-01 09:39:56.126 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:39:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
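The ceph-mds lines from MDS map version 11 through 14 walk the standard MDS rank recovery ladder: up:standby -> up:replay -> up:reconnect -> up:rejoin -> up:active, ending in "recovery_done -- successful recovery!". A small sketch that checks an observed transition list against that ladder; the ladder order is taken from the log above, the checker itself is illustrative:

LADDER = ["up:standby", "up:replay", "up:reconnect", "up:rejoin", "up:active"]

def check(transitions):
    # Each handle_mds_map state change must move exactly one rung up the ladder.
    for old, new in transitions:
        assert LADDER.index(new) == LADDER.index(old) + 1, (old, new)

check([("up:standby", "up:replay"),      # map v11
       ("up:replay", "up:reconnect"),    # map v12
       ("up:reconnect", "up:rejoin"),    # map v13
       ("up:rejoin", "up:active")])      # map v14
print("recovery sequence matches the log")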
Feb 1 04:39:59 localhost podman[278000]: 2026-02-01 09:39:59.7243805 +0000 UTC m=+0.087655650 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:39:59 localhost podman[278000]: 2026-02-01 09:39:59.765407591 +0000 UTC m=+0.128682751 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:39:59 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:39:59 localhost ceph-mds[277455]: mds.pinger is_rank_lagging: rank=0 was never sent ping request. Feb 1 04:39:59 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604212-tkdkxt[277451]: 2026-02-01T09:39:59.843+0000 7f47dd854640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request. Feb 1 04:40:01 localhost nova_compute[274651]: 2026-02-01 09:40:01.127 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:01 localhost nova_compute[274651]: 2026-02-01 09:40:01.129 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:01 localhost nova_compute[274651]: 2026-02-01 09:40:01.129 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:01 localhost nova_compute[274651]: 2026-02-01 09:40:01.130 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:01 localhost nova_compute[274651]: 2026-02-01 09:40:01.157 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:01 localhost nova_compute[274651]: 2026-02-01 09:40:01.158 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:01 localhost openstack_network_exporter[239441]: ERROR 09:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:40:01 localhost openstack_network_exporter[239441]: Feb 1 04:40:01 localhost openstack_network_exporter[239441]: ERROR 09:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:40:01 localhost openstack_network_exporter[239441]: Feb 1 04:40:04 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Feb 1 04:40:05 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 1 04:40:05 localhost systemd[276664]: Activating special unit Exit the Session... Feb 1 04:40:05 localhost systemd[276664]: Stopped target Main User Target. Feb 1 04:40:05 localhost systemd[276664]: Stopped target Basic System. Feb 1 04:40:05 localhost systemd[276664]: Stopped target Paths. Feb 1 04:40:05 localhost systemd[276664]: Stopped target Sockets. Feb 1 04:40:05 localhost systemd[276664]: Stopped target Timers. Feb 1 04:40:05 localhost systemd[276664]: Stopped Mark boot as successful after the user session has run 2 minutes. 
Feb 1 04:40:05 localhost systemd[276664]: Stopped Daily Cleanup of User's Temporary Directories. Feb 1 04:40:05 localhost systemd[276664]: Closed D-Bus User Message Bus Socket. Feb 1 04:40:05 localhost systemd[276664]: Stopped Create User's Volatile Files and Directories. Feb 1 04:40:05 localhost systemd[276664]: Removed slice User Application Slice. Feb 1 04:40:05 localhost systemd[276664]: Reached target Shutdown. Feb 1 04:40:05 localhost systemd[276664]: Finished Exit the Session. Feb 1 04:40:05 localhost systemd[276664]: Reached target Exit the Session. Feb 1 04:40:05 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 1 04:40:05 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 1 04:40:05 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 1 04:40:05 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 1 04:40:05 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 1 04:40:05 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 1 04:40:05 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 1 04:40:05 localhost systemd[1]: user-1003.slice: Consumed 1.649s CPU time. Feb 1 04:40:06 localhost nova_compute[274651]: 2026-02-01 09:40:06.159 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:06 localhost nova_compute[274651]: 2026-02-01 09:40:06.163 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:06 localhost nova_compute[274651]: 2026-02-01 09:40:06.163 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:06 localhost nova_compute[274651]: 2026-02-01 09:40:06.163 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:06 localhost nova_compute[274651]: 2026-02-01 09:40:06.207 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:06 localhost nova_compute[274651]: 2026-02-01 09:40:06.208 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:07 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Feb 1 04:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
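The openstack_network_exporter ERRORs above (and repeated at 09:40:31) come from ovs-appctl's dpif-netdev/pmd-perf-show and dpif-netdev/pmd-rxq-show commands, which apply only to the userspace (netdev) datapath; on a host running the kernel datapath they fail with exactly this "please specify an existing datapath" message, so they read as noise rather than a fault. A hedged probe for whether PMD statistics exist on a host, assuming only that ovs-appctl is on PATH:

import subprocess

def pmd_stats_available():
    # dpif-netdev/pmd-* appctl commands exist only for the userspace datapath;
    # with a kernel datapath ovs-appctl fails the way the exporter logs above.
    try:
        result = subprocess.run(
            ["ovs-appctl", "dpif-netdev/pmd-perf-show"],
            capture_output=True, text=True, check=False)
    except FileNotFoundError:
        return False  # no ovs-appctl installed at all
    return result.returncode == 0

print(pmd_stats_available())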
Feb 1 04:40:09 localhost podman[278023]: 2026-02-01 09:40:09.734317922 +0000 UTC m=+0.091428524 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:40:09 localhost podman[278023]: 2026-02-01 09:40:09.744350922 +0000 UTC m=+0.101461494 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:40:09 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:40:11 localhost nova_compute[274651]: 2026-02-01 09:40:11.208 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:11 localhost nova_compute[274651]: 2026-02-01 09:40:11.210 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:11 localhost nova_compute[274651]: 2026-02-01 09:40:11.210 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:11 localhost nova_compute[274651]: 2026-02-01 09:40:11.211 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:11 localhost nova_compute[274651]: 2026-02-01 09:40:11.253 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:11 localhost nova_compute[274651]: 2026-02-01 09:40:11.253 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:40:16 localhost podman[278101]: 2026-02-01 09:40:16.221467575 +0000 UTC m=+0.090762348 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:40:16 localhost nova_compute[274651]: 2026-02-01 09:40:16.255 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms 
timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:16 localhost nova_compute[274651]: 2026-02-01 09:40:16.257 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:16 localhost nova_compute[274651]: 2026-02-01 09:40:16.257 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:16 localhost nova_compute[274651]: 2026-02-01 09:40:16.257 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:16 localhost podman[278101]: 2026-02-01 09:40:16.258692151 +0000 UTC m=+0.127986924 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:40:16 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:40:16 localhost nova_compute[274651]: 2026-02-01 09:40:16.283 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:16 localhost nova_compute[274651]: 2026-02-01 09:40:16.284 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
Feb 1 04:40:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:40:18 localhost podman[278120]: 2026-02-01 09:40:18.727447707 +0000 UTC m=+0.083637441 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 1 04:40:18 localhost podman[278119]: 2026-02-01 09:40:18.811755528 +0000 UTC m=+0.170774199 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:40:18 localhost podman[278119]: 2026-02-01 09:40:18.848476908 +0000 UTC m=+0.207495559 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:40:18 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:40:18 localhost podman[278120]: 2026-02-01 09:40:18.899488483 +0000 UTC m=+0.255678217 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:40:18 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
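The node_exporter systemd collector above is deliberately narrowed with --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service, so only EDPM, Open vSwitch/OVN, libvirt, and rsyslog units are scraped. node_exporter anchors these include patterns, which Python's fullmatch approximates; a quick check of which unit names would pass (the sample unit names are illustrative, and the anchoring behaviour is an assumption about node_exporter):

import re

UNIT_INCLUDE = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")

for unit in ["ovsdb-server.service", "virtqemud.service",
             "rsyslog.service", "sshd.service", "openvswitch.service"]:
    # fullmatch mirrors an anchored ^(...)$ regex.
    print(unit, bool(UNIT_INCLUDE.fullmatch(unit)))
# -> True, True, True, False, True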
Feb 1 04:40:21 localhost nova_compute[274651]: 2026-02-01 09:40:21.285 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:21 localhost nova_compute[274651]: 2026-02-01 09:40:21.287 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:21 localhost nova_compute[274651]: 2026-02-01 09:40:21.287 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:21 localhost nova_compute[274651]: 2026-02-01 09:40:21.287 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:21 localhost nova_compute[274651]: 2026-02-01 09:40:21.307 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:21 localhost nova_compute[274651]: 2026-02-01 09:40:21.308 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:23 localhost podman[236886]: time="2026-02-01T09:40:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:40:23 localhost podman[236886]: @ - - [01/Feb/2026:09:40:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152020 "" "Go-http-client/1.1" Feb 1 04:40:24 localhost podman[236886]: @ - - [01/Feb/2026:09:40:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17785 "" "Go-http-client/1.1" Feb 1 04:40:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:40:25 localhost podman[278165]: 2026-02-01 09:40:25.72088902 +0000 UTC m=+0.081383813 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.7) Feb 1 04:40:25 localhost podman[278165]: 2026-02-01 09:40:25.736371172 +0000 UTC m=+0.096865955 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, version=9.7, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git) Feb 1 04:40:25 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:40:26 localhost nova_compute[274651]: 2026-02-01 09:40:26.309 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:26 localhost nova_compute[274651]: 2026-02-01 09:40:26.311 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:26 localhost nova_compute[274651]: 2026-02-01 09:40:26.312 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:26 localhost nova_compute[274651]: 2026-02-01 09:40:26.312 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:26 localhost nova_compute[274651]: 2026-02-01 09:40:26.346 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:26 localhost nova_compute[274651]: 2026-02-01 09:40:26.347 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
Feb 1 04:40:30 localhost podman[278185]: 2026-02-01 09:40:30.73435213 +0000 UTC m=+0.089732487 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute) Feb 1 04:40:30 localhost podman[278185]: 2026-02-01 09:40:30.747414899 +0000 UTC m=+0.102795266 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 1 04:40:30 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:40:31 localhost nova_compute[274651]: 2026-02-01 09:40:31.348 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:31 localhost nova_compute[274651]: 2026-02-01 09:40:31.350 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:31 localhost nova_compute[274651]: 2026-02-01 09:40:31.350 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:31 localhost nova_compute[274651]: 2026-02-01 09:40:31.351 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:31 localhost nova_compute[274651]: 2026-02-01 09:40:31.377 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:31 localhost nova_compute[274651]: 2026-02-01 09:40:31.378 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:31 localhost openstack_network_exporter[239441]: ERROR 09:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:40:31 localhost openstack_network_exporter[239441]: Feb 1 04:40:31 localhost openstack_network_exporter[239441]: ERROR 09:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:40:31 localhost openstack_network_exporter[239441]: Feb 1 04:40:34 localhost nova_compute[274651]: 2026-02-01 09:40:34.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:34 localhost nova_compute[274651]: 2026-02-01 09:40:34.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 1 04:40:34 localhost nova_compute[274651]: 2026-02-01 09:40:34.288 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 1 04:40:34 localhost nova_compute[274651]: 2026-02-01 09:40:34.289 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:34 localhost nova_compute[274651]: 2026-02-01 09:40:34.289 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 1 04:40:34 localhost nova_compute[274651]: 2026-02-01 09:40:34.305 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:36 localhost nova_compute[274651]: 2026-02-01 09:40:36.379 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:36 localhost nova_compute[274651]: 2026-02-01 09:40:36.382 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:36 localhost nova_compute[274651]: 2026-02-01 09:40:36.382 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:36 localhost nova_compute[274651]: 2026-02-01 09:40:36.382 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:36 localhost nova_compute[274651]: 2026-02-01 09:40:36.411 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:36 localhost nova_compute[274651]: 2026-02-01 09:40:36.411 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:37 localhost nova_compute[274651]: 2026-02-01 09:40:37.312 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.797 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock 
"refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.798 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.798 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:40:38 localhost nova_compute[274651]: 2026-02-01 09:40:38.799 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.219 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.235 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.235 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.235 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.236 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.253 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.254 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.254 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.255 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.255 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.737 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.853 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:40:39 localhost nova_compute[274651]: 2026-02-01 09:40:39.853 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.066 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.067 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=12023MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.068 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.068 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.194 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.195 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.195 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.257 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.290 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.290 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.344 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.373 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: 
COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.411 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
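
The `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` subprocess logged above (and again a few lines below, returning 0 in ~0.5 s) is how the libvirt driver sizes the RBD-backed disk pool during the resource audit. A sketch of the same probe; the command line is taken verbatim from the log, but the JSON field names ("stats", "total_bytes", "total_avail_bytes") are an assumption based on recent Ceph releases:

    import json
    import subprocess

    def ceph_capacity_gib(conf="/etc/ceph/ceph.conf", user="openstack"):
        # Same command the resource tracker runs via oslo.concurrency.
        out = subprocess.check_output(
            ["ceph", "df", "--format=json", "--id", user, "--conf", conf])
        stats = json.loads(out)["stats"]  # cluster-wide totals (assumed schema)
        gib = 1024 ** 3
        return stats["total_bytes"] / gib, stats["total_avail_bytes"] / gib

Those totals feed the phys_disk/free_disk figures reported in the "Hypervisor/Node resource view" and "Final resource view" lines nearby.
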
Feb 1 04:40:40 localhost podman[278281]: 2026-02-01 09:40:40.556930636 +0000 UTC m=+0.077518255 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:40:40 localhost podman[278281]: 2026-02-01 09:40:40.569256572 +0000 UTC m=+0.089844181 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:40:40 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
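
The inventory dictionaries above are worth reading against the "Final resource view" line: placement capacity is (total - reserved) * allocation_ratio, so the 8 physical vCPUs with allocation_ratio 16.0 advertise 128 schedulable VCPU units, while the hypervisor view's free_vcpus=7 is the unratioed figure (8 total minus 1 allocated). A worked check using only the numbers from the log:

    # Inventory reported above for provider a04bda90-8ccd-4104-8518-038544ff1327.
    inventory = {
        "VCPU":      {"total": 8,     "reserved": 0,   "allocation_ratio": 16.0},
        "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 41,    "reserved": 1,   "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        # Standard placement capacity formula.
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)
    # VCPU 128.0, MEMORY_MB 15226.0, DISK_GB 40.0 -- versus the unratioed
    # host view above: free_vcpus=7, used_ram=1024MB, used_disk=2GB.
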
Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.922 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.929 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.952 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.956 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.956 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.990 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.991 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.991 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.992 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:40 localhost nova_compute[274651]: 2026-02-01 09:40:40.992 
274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:40:41 localhost nova_compute[274651]: 2026-02-01 09:40:41.413 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:41 localhost nova_compute[274651]: 2026-02-01 09:40:41.415 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:41 localhost nova_compute[274651]: 2026-02-01 09:40:41.416 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:41 localhost nova_compute[274651]: 2026-02-01 09:40:41.416 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:41 localhost nova_compute[274651]: 2026-02-01 09:40:41.458 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:41 localhost nova_compute[274651]: 2026-02-01 09:40:41.460 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:40:41.701 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:40:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:40:41.702 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:40:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:40:41.703 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:40:42 localhost nova_compute[274651]: 2026-02-01 09:40:42.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:40:42 localhost podman[278404]: Feb 1 04:40:42 localhost podman[278404]: 2026-02-01 09:40:42.394042221 +0000 UTC m=+0.077062302 container create d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_diffie, io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, 
url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph) Feb 1 04:40:42 localhost systemd[1]: Started libpod-conmon-d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0.scope. Feb 1 04:40:42 localhost systemd[1]: Started libcrun container. Feb 1 04:40:42 localhost podman[278404]: 2026-02-01 09:40:42.36186822 +0000 UTC m=+0.044888331 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:40:42 localhost podman[278404]: 2026-02-01 09:40:42.467183591 +0000 UTC m=+0.150203662 container init d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_diffie, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:40:42 localhost podman[278404]: 2026-02-01 09:40:42.477745863 +0000 UTC m=+0.160765954 container start d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_diffie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True) Feb 1 04:40:42 localhost podman[278404]: 2026-02-01 09:40:42.480053404 +0000 UTC m=+0.163073525 container attach d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_diffie, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:40:42 localhost silly_diffie[278421]: 167 167 Feb 1 04:40:42 localhost systemd[1]: libpod-d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0.scope: Deactivated successfully. 
Feb 1 04:40:42 localhost podman[278404]: 2026-02-01 09:40:42.483725446 +0000 UTC m=+0.166745517 container died d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_diffie, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_BRANCH=main) Feb 1 04:40:42 localhost podman[278426]: 2026-02-01 09:40:42.583261561 +0000 UTC m=+0.085788637 container remove d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_diffie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, release=1764794109, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph) Feb 1 04:40:42 localhost systemd[1]: libpod-conmon-d74d7220952600553395133e5273d98aca5791d205ea70139634aa6659e5fda0.scope: Deactivated successfully. Feb 1 04:40:42 localhost systemd[1]: Reloading. Feb 1 04:40:42 localhost systemd-sysv-generator[278470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:40:42 localhost systemd-rc-local-generator[278467]: /etc/rc.d/rc.local is not marked executable, skipping. 
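
The silly_diffie container above shows the usual cephadm pattern: pull the rhceph image, run a throwaway container that prints a single line ("167 167", the ceph user's uid and gid), and remove it, all within about a second. The exact command cephadm ran is not captured in the log, so the reconstruction below is purely illustrative; only the image name and the "167 167" output are taken from the log:

    import subprocess

    IMAGE = "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"

    def probe_ceph_uid_gid(image=IMAGE):
        # Hypothetical stand-in for cephadm's short-lived probe container.
        out = subprocess.check_output(
            ["podman", "run", "--rm", image, "bash", "-c",
             'echo "$(id -u ceph) $(id -g ceph)"'],
            text=True)
        return out.strip()  # expected: "167 167"
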
Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:42 localhost systemd[1]: var-lib-containers-storage-overlay-0d072e61d47ba345343a209078db087dd18109c4a3021968a79d491edcccb771-merged.mount: Deactivated successfully. Feb 1 04:40:43 localhost systemd[1]: Reloading. Feb 1 04:40:43 localhost systemd-rc-local-generator[278509]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:40:43 localhost systemd-sysv-generator[278515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:40:43 localhost systemd[1]: Starting Ceph mgr.np0005604212.oynhpm for 33fac0b9-80c7-560f-918a-c92d3021ca1e... 
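
The repeated "Failed to parse service type, ignoring: notify-reload" lines during both reloads are harmless version skew: the libvirt modular daemons ship unit files declaring Type=notify-reload, which this host's systemd predates (the type was added in systemd v253), so the directive is ignored and the units run as plain notify services; the MemoryLimit= line is the same story for a deprecated directive. A small audit sketch to list affected units (assumes read access to /usr/lib/systemd/system):

    from pathlib import Path

    def units_with_directive(directive, root="/usr/lib/systemd/system"):
        # Flag unit files using a directive this systemd may not support,
        # e.g. "Type=notify-reload" or "MemoryLimit=".
        hits = []
        for unit in Path(root).glob("*.service"):
            try:
                if directive in unit.read_text(errors="ignore"):
                    hits.append(unit.name)
            except OSError:
                continue
        return sorted(hits)

    # e.g. units_with_directive("Type=notify-reload") should list the
    # virt*d services named in the warnings above.
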
Feb 1 04:40:43 localhost podman[278573]: Feb 1 04:40:43 localhost podman[278573]: 2026-02-01 09:40:43.722282966 +0000 UTC m=+0.066446077 container create 2a0e8e2148165aa8154ff575e0657da0d61273d7fe6b18803343158471a37d5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1764794109) Feb 1 04:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4b9f55347caa21444ae46d1c5db6bda3c1e1e0909a2b5c262fb84ae93eadb4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 1 04:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4b9f55347caa21444ae46d1c5db6bda3c1e1e0909a2b5c262fb84ae93eadb4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 04:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4b9f55347caa21444ae46d1c5db6bda3c1e1e0909a2b5c262fb84ae93eadb4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 04:40:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4b9f55347caa21444ae46d1c5db6bda3c1e1e0909a2b5c262fb84ae93eadb4/merged/var/lib/ceph/mgr/ceph-np0005604212.oynhpm supports timestamps until 2038 (0x7fffffff) Feb 1 04:40:43 localhost podman[278573]: 2026-02-01 09:40:43.787285959 +0000 UTC m=+0.131449100 container init 2a0e8e2148165aa8154ff575e0657da0d61273d7fe6b18803343158471a37d5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1764794109, io.buildah.version=1.41.4, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, 
url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 1 04:40:43 localhost podman[278573]: 2026-02-01 09:40:43.700809571 +0000 UTC m=+0.044972672 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:40:43 localhost podman[278573]: 2026-02-01 09:40:43.802090871 +0000 UTC m=+0.146254012 container start 2a0e8e2148165aa8154ff575e0657da0d61273d7fe6b18803343158471a37d5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64) Feb 1 04:40:43 localhost bash[278573]: 2a0e8e2148165aa8154ff575e0657da0d61273d7fe6b18803343158471a37d5b Feb 1 04:40:43 localhost systemd[1]: Started Ceph mgr.np0005604212.oynhpm for 33fac0b9-80c7-560f-918a-c92d3021ca1e. Feb 1 04:40:43 localhost ceph-mgr[278591]: set uid:gid to 167:167 (ceph:ceph) Feb 1 04:40:43 localhost ceph-mgr[278591]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Feb 1 04:40:43 localhost ceph-mgr[278591]: pidfile_write: ignore empty --pid-file Feb 1 04:40:43 localhost ceph-mgr[278591]: mgr[py] Loading python module 'alerts' Feb 1 04:40:43 localhost ceph-mgr[278591]: mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 1 04:40:43 localhost ceph-mgr[278591]: mgr[py] Loading python module 'balancer' Feb 1 04:40:43 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:43.970+0000 7f91f75bc140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 1 04:40:44 localhost ceph-mgr[278591]: mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 1 04:40:44 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:44.036+0000 7f91f75bc140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 1 04:40:44 localhost ceph-mgr[278591]: mgr[py] Loading python module 'cephadm' Feb 1 04:40:44 localhost systemd[1]: tmp-crun.IbSYhC.mount: Deactivated successfully. 
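
The "-1 mgr[py] Module <name> has missing NOTIFY_TYPES member" lines that accompany each module load are warnings, not failures: Reef-era ceph-mgr expects a module to declare which cluster notifications it consumes so the manager can skip delivering the rest, and modules without the attribute still load but draw this complaint. An illustrative, simplified shape of the declaration, using stand-ins for ceph's mgr_module classes (names assumed, not the real API):

    from enum import Enum

    class NotifyType(str, Enum):
        # Stand-in for ceph's mgr_module.NotifyType enum.
        mon_map = "mon_map"
        osd_map = "osd_map"
        pg_summary = "pg_summary"

    class ExampleModule:
        # Stand-in for a MgrModule subclass. Declaring NOTIFY_TYPES lets
        # ceph-mgr deliver only the subscribed notifications; omitting it
        # produces the "missing NOTIFY_TYPES member" warning seen above.
        NOTIFY_TYPES = [NotifyType.osd_map, NotifyType.pg_summary]

        def notify(self, notify_type, notify_id):
            pass  # handle one of the subscribed notification types
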
Feb 1 04:40:44 localhost ceph-mgr[278591]: mgr[py] Loading python module 'crash' Feb 1 04:40:44 localhost ceph-mgr[278591]: mgr[py] Module crash has missing NOTIFY_TYPES member Feb 1 04:40:44 localhost ceph-mgr[278591]: mgr[py] Loading python module 'dashboard' Feb 1 04:40:44 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:44.669+0000 7f91f75bc140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'devicehealth' Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'diskprediction_local' Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:45.228+0000 7f91f75bc140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: from numpy import show_config as show_numpy_config Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:45.397+0000 7f91f75bc140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'influx' Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Module influx has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'insights' Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:45.460+0000 7f91f75bc140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'iostat' Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Module iostat has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'k8sevents' Feb 1 04:40:45 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:45.585+0000 7f91f75bc140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Feb 1 04:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:40:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5806 writes, 25K keys, 5806 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5806 writes, 781 syncs, 7.43 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 
00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 105 writes, 348 keys, 105 commit groups, 1.0 writes per commit group, ingest: 0.56 MB, 0.00 MB/s#012Interval WAL: 105 writes, 41 syncs, 2.56 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'localpool' Feb 1 04:40:45 localhost ceph-mgr[278591]: mgr[py] Loading python module 'mds_autoscaler' Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'mirroring' Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'nfs' Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'orchestrator' Feb 1 04:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:46.346+0000 7f91f75bc140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost nova_compute[274651]: 2026-02-01 09:40:46.461 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:46 localhost nova_compute[274651]: 2026-02-01 09:40:46.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:46 localhost nova_compute[274651]: 2026-02-01 09:40:46.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:46 localhost nova_compute[274651]: 2026-02-01 09:40:46.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'osd_perf_query' Feb 1 04:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:46.495+0000 7f91f75bc140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost nova_compute[274651]: 2026-02-01 09:40:46.499 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:46 localhost nova_compute[274651]: 2026-02-01 09:40:46.500 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:46.564+0000 7f91f75bc140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'osd_support' Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'pg_autoscaler' Feb 1 04:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:46.624+0000 7f91f75bc140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'progress' Feb 1 04:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:46.701+0000 7f91f75bc140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost podman[278621]: 2026-02-01 09:40:46.727928657 +0000 UTC m=+0.084401875 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:40:46 localhost podman[278621]: 2026-02-01 09:40:46.73655606 +0000 UTC m=+0.093029278 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:40:46 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Module progress has missing NOTIFY_TYPES member Feb 1 04:40:46 localhost ceph-mgr[278591]: mgr[py] Loading python module 'prometheus' Feb 1 04:40:46 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:46.760+0000 7f91f75bc140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Loading python module 'rbd_support' Feb 1 04:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:47.053+0000 7f91f75bc140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Loading python module 'restful' Feb 1 04:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:47.132+0000 7f91f75bc140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Loading python module 'rgw' Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Loading python module 'rook' Feb 1 04:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:47.459+0000 7f91f75bc140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Module rook has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Loading python module 'selftest' Feb 1 04:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:47.921+0000 7f91f75bc140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 1 04:40:47 localhost ceph-mgr[278591]: mgr[py] Loading python module 'snap_schedule' Feb 1 04:40:47 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:47.984+0000 7f91f75bc140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'stats' Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python 
module 'status' Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Module status has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'telegraf' Feb 1 04:40:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:48.177+0000 7f91f75bc140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'telemetry' Feb 1 04:40:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:48.237+0000 7f91f75bc140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'test_orchestrator' Feb 1 04:40:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:48.367+0000 7f91f75bc140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost sshd[278640]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:48.515+0000 7f91f75bc140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'volumes' Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'zabbix' Feb 1 04:40:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:48.706+0000 7f91f75bc140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:40:48.765+0000 7f91f75bc140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 1 04:40:48 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571d57471e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Feb 1 04:40:48 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1614340691 Feb 1 04:40:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:40:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:40:49 localhost systemd[1]: tmp-crun.V9hDoQ.mount: Deactivated successfully. 
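The ceph-mgr module loader above emits one "has missing NOTIFY_TYPES member" warning per Python module, and most warnings appear twice: once from the ceph-mgr[278591] daemon and once mirrored under the container unit's logger (ceph-33fac0b9-…-mgr-np0005604212-oynhpm[278587]). A small sketch for collapsing that noise into a per-module count when reading the journal; the regex is an assumption derived from the message text above, not a Ceph-provided tool:

    import re
    import sys
    from collections import Counter

    # Matches the warning exactly as ceph-mgr prints it in this log.
    PATTERN = re.compile(r"mgr\[py\] Module (\S+) has missing NOTIFY_TYPES member")

    def summarize(stream):
        counts = Counter()
        for line in stream:
            for module in PATTERN.findall(line):
                counts[module] += 1
        return counts

    if __name__ == "__main__":
        for module, hits in sorted(summarize(sys.stdin).items()):
            print(f"{module}: {hits}")

Fed this window of the journal, most modules report a count of 2 (daemon line plus container mirror), confirming the duplication is double logging rather than repeated failures.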
Feb 1 04:40:49 localhost podman[278661]: 2026-02-01 09:40:49.073783135 +0000 UTC m=+0.131210541 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:40:49 localhost podman[278660]: 2026-02-01 09:40:49.081174121 +0000 UTC m=+0.146155108 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:40:49 localhost podman[278660]: 2026-02-01 09:40:49.116392986 +0000 UTC m=+0.181374003 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:40:49 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:40:49 localhost podman[278661]: 2026-02-01 09:40:49.140531961 +0000 UTC m=+0.197959397 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 1 04:40:49 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
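Each EDPM container health check above follows the same four-step cycle: systemd starts a transient "/usr/bin/podman healthcheck run <id>" unit, podman logs a container health_status event (health_status=healthy here for ovn_metadata_agent, node_exporter, and ovn_controller), an exec_died event follows when the check command exits, and the transient unit deactivates. A hedged sketch for pairing those journal events per container; it assumes image= and name= open the attribute list, as they do in every event above:

    import re
    from collections import defaultdict

    # Captures "container <event> <64-hex id> (image=..., name=<name>, ...".
    EVENT = re.compile(
        r"container (?P<event>health_status|exec_died) (?P<cid>[0-9a-f]{64}) "
        r"\(image=[^,]+, name=(?P<name>[^,)]+)"
    )

    def healthcheck_runs(lines):
        runs = defaultdict(list)
        for line in lines:
            m = EVENT.search(line)
            if m:
                runs[(m["cid"][:12], m["name"])].append(m["event"])
        return dict(runs)

A container whose events alternate health_status then exec_died, each followed by a systemd "Deactivated successfully", is completing its checks normally, which is all this window shows.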
Feb 1 04:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:40:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7200.1 total, 600.0 interval
Cumulative writes: 4955 writes, 22K keys, 4955 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 4955 writes, 713 syncs, 6.95 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 59 writes, 172 keys, 59 commit groups, 1.0 writes per commit group, ingest: 0.16 MB, 0.00 MB/s
Interval WAL: 59 writes, 28 syncs, 2.11 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 1 04:40:50 localhost systemd[1]: tmp-crun.Zb4zdt.mount: Deactivated successfully. Feb 1 04:40:50 localhost podman[278819]: 2026-02-01 09:40:50.088093288 +0000 UTC m=+0.094036419 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, vcs-type=git, release=1764794109, build-date=2025-12-08T17:28:53Z) Feb 1 04:40:50 localhost podman[278819]: 2026-02-01 09:40:50.221539487 +0000 UTC m=+0.227482628 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_CLEAN=True, release=1764794109, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, description=Red Hat Ceph
Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=) Feb 1 04:40:51 localhost nova_compute[274651]: 2026-02-01 09:40:51.501 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:51 localhost nova_compute[274651]: 2026-02-01 09:40:51.504 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:51 localhost nova_compute[274651]: 2026-02-01 09:40:51.505 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:51 localhost nova_compute[274651]: 2026-02-01 09:40:51.505 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:51 localhost nova_compute[274651]: 2026-02-01 09:40:51.527 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:51 localhost nova_compute[274651]: 2026-02-01 09:40:51.528 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:53 localhost podman[236886]: time="2026-02-01T09:40:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:40:53 localhost podman[236886]: @ - - [01/Feb/2026:09:40:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154214 "" "Go-http-client/1.1" Feb 1 04:40:54 localhost podman[236886]: @ - - [01/Feb/2026:09:40:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18271 "" "Go-http-client/1.1" Feb 1 04:40:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
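The #033[00m trailing each nova_compute ovsdbapp record is rsyslog's control-character escaping (#<three octal digits>) applied to the ANSI reset sequence ESC[00m, apparently appended by the client's colorized log formatter; the same convention renders embedded newlines as #012. A minimal decoder, assuming that escaping scheme:

    import re

    OCTAL = re.compile(r"#([0-7]{3})")    # rsyslog: control char -> #<octal>
    ANSI = re.compile(r"\x1b\[[0-9;]*m")  # terminal color codes, e.g. ESC[00m

    def unescape(record: str) -> str:
        decoded = OCTAL.sub(lambda m: chr(int(m.group(1), 8)), record)
        return ANSI.sub("", decoded)      # drop the reset so text reads clean

    print(unescape("sending inactivity probe run .../reconnect.py:117#033[00m"))

Note the octal pattern will also decode a literal "#012" that was meant verbatim; for a journal like this one that trade-off is harmless.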
Feb 1 04:40:55 localhost podman[279097]: 2026-02-01 09:40:55.884323179 +0000 UTC m=+0.095686630 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, release=1769056855, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 1 04:40:55 localhost podman[279097]: 2026-02-01 09:40:55.899500292 +0000 UTC m=+0.110863773 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:40:55 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
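The ovsdbapp connection to tcp:127.0.0.1:6640 (above, and again below) cycles on a roughly five-second beat: the poller wakes after a ~5000 ms timeout, reconnect notes the link has been idle ~5003-5004 ms and sends an inactivity probe, the state machine enters IDLE, and the [POLLIN] reply drives it back to ACTIVE. A toy model of that probe logic, not the ovs.reconnect API; names and timing here are illustrative only:

    PROBE_INTERVAL_MS = 5000  # approximates the ~5003-5004 ms idle gaps logged

    class ToyReconnect:
        def __init__(self):
            self.state = "ACTIVE"
            self.idle_ms = 0

        def tick(self, elapsed_ms, received_data=False):
            if received_data:            # [POLLIN]: the peer answered
                self.state = "ACTIVE"    # "entering ACTIVE"
                self.idle_ms = 0
                return
            self.idle_ms += elapsed_ms
            if self.state == "ACTIVE" and self.idle_ms >= PROBE_INTERVAL_MS:
                print("sending inactivity probe")
                self.state = "IDLE"      # "entering IDLE" until data returns

    conn = ToyReconnect()
    conn.tick(5003)                      # idle past the interval -> probe
    conn.tick(37, received_data=True)    # echo arrives -> ACTIVE again

Because every probe here gets an answer, the cycle is routine keepalive traffic rather than a connectivity problem.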
Feb 1 04:40:56 localhost nova_compute[274651]: 2026-02-01 09:40:56.529 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:56 localhost nova_compute[274651]: 2026-02-01 09:40:56.531 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:40:56 localhost nova_compute[274651]: 2026-02-01 09:40:56.531 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:40:56 localhost nova_compute[274651]: 2026-02-01 09:40:56.531 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:40:56 localhost nova_compute[274651]: 2026-02-01 09:40:56.561 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:40:56 localhost nova_compute[274651]: 2026-02-01 09:40:56.562 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:01 localhost openstack_network_exporter[239441]: ERROR 09:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:41:01 localhost openstack_network_exporter[239441]: Feb 1 04:41:01 localhost openstack_network_exporter[239441]: ERROR 09:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:41:01 localhost openstack_network_exporter[239441]: Feb 1 04:41:01 localhost nova_compute[274651]: 2026-02-01 09:41:01.607 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:41:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
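The ceilometer tracebacks that follow repeat one failure per polled sample: the TCP connect to the messaging broker is refused, and kombu converts the socket-level ConnectionRefusedError into its library-level OperationalError inside _reraise_as_library_errors, keeping the original as the direct cause. A minimal standard-library reproduction of that chaining; OperationalError below is a stand-in for the kombu class, and port 5672 is an assumption (RabbitMQ's conventional AMQP port):

    import socket
    from contextlib import contextmanager

    class OperationalError(ConnectionError):
        pass  # stand-in for kombu.exceptions.OperationalError

    @contextmanager
    def reraise_as_library_errors():
        try:
            yield
        except OSError as exc:
            # Same pattern as the traceback: "raise ... from exc" makes the
            # socket error the __cause__ ("the direct cause of the following").
            raise OperationalError(str(exc)) from exc

    def connect(host="127.0.0.1", port=5672):
        with reraise_as_library_errors():
            socket.create_connection((host, port), timeout=1).close()

    try:
        connect()
    except OperationalError as err:
        print(err, "| direct cause:", type(err.__cause__).__name__)

With no broker listening, this prints "[Errno 111] Connection refused | direct cause: ConnectionRefusedError", matching the chained exception pair logged below.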
Feb 1 04:41:01 localhost podman[279740]: 2026-02-01 09:41:01.728158613 +0000 UTC m=+0.088302015 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Feb 1 04:41:01 localhost podman[279740]: 2026-02-01 09:41:01.740296622 +0000 UTC m=+0.100440074 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:41:01 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.526 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.526 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.532 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4647b7c3-98ec-4d28-b319-d4316510633f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.526729', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fdaf2ec-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': 'db1fc756181c1e17ab7f2fb140d9083b189489bee57e7de93af8023a445d58c8'}]}, 'timestamp': '2026-02-01 09:41:03.533315', '_unique_id': 'e608b16f88e649389ee74b90fed44af6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.535 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.536 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.536 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3a80bb06-756e-42da-8986-0f4008213627', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.536549', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fdb8cde-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': '33505976614bdb2ff6bb4369b0fa6e4bed3c5528ba3d8f090d01d381f0689c68'}]}, 'timestamp': '2026-02-01 09:41:03.537171', '_unique_id': '828e9d494a354289a4efebccf42c9f11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.538 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.539 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.539 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.570 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.571 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
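The two-layer exception chain recorded above (a raw ConnectionRefusedError reported as the direct cause of kombu.exceptions.OperationalError) comes from kombu's _reraise_as_library_errors context manager, visible in the frames: the socket-level error from amqp is re-raised as a library-level error via `raise ConnectionError(str(exc)) from exc`, where that ConnectionError is bound to kombu.exceptions.OperationalError. A minimal sketch, assuming the kombu package is installed and nothing is listening on the AMQP default port on the local machine, reproduces the same chain outside the agent:

```python
# Minimal reproduction sketch -- assumes kombu is installed and nothing is
# listening on localhost:5672 (the AMQP default port); the URL is an assumption.
import kombu

conn = kombu.Connection("amqp://guest:guest@localhost:5672//")
try:
    # Allow one retry with no backoff so the failure surfaces immediately;
    # the agent above goes through the same ensure_connection() ->
    # _ensure_connection() -> retry_over_time() path shown in its traceback.
    conn.ensure_connection(max_retries=1, interval_start=0, interval_step=0)
except kombu.exceptions.OperationalError as exc:
    # Because kombu re-raised "from exc", the original socket error is
    # preserved as the direct cause, exactly as the log chains it.
    print(type(exc).__name__, exc)        # e.g. OperationalError [Errno 111] ...
    print(type(exc.__cause__).__name__)   # ConnectionRefusedError
```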
Payload={'message_id': '9148aebd-429b-4cd4-b3cb-a57560010708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.539735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1fe0bdee-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '5c2a8e6586f6477c314aa7cef6d8ceaf219ed13e93252604858e4cf4cfaa818b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.539735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1fe0d34c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '6906f0a36fc3431c26bfcc6b7c9c8358a18346dc25142f0c7ec3dde35db54cdf'}]}, 'timestamp': '2026-02-01 09:41:03.571668', '_unique_id': '8d117222feb14ca99ce6d9c7f3d5adba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.572 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.574 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.574 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.574 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2ca1142c-2831-40f4-875a-dede8e421c73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.574557', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fe156d2-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': '4abd4595681ce22a02058700acae8bc4ebf296e102b40bc4b2f3def849dc91c6'}]}, 'timestamp': '2026-02-01 09:41:03.575086', '_unique_id': '4e86fcceecc941768e0c9af4fdc20dc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:41:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.576 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.577 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
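On the ceilometer side, the upper half of each traceback (notify/messaging.py -> transport.py -> amqpdriver.py) is oslo.messaging's notifier publishing a telemetry.polling event at priority SAMPLE to the notifications topic. When _send_notification() fails, the notify driver logs the "Could not send notification" error together with the full payload instead of raising, which is why the agent keeps polling and each sample batch produces its own copy of this traceback. A hedged sketch of the same publishing path; the transport URL, topic list, and payload are illustrative assumptions, not values taken from this host:

```python
# Sketch of the publishing path from the upper half of the traceback; the
# transport URL, topics and payload here are illustrative assumptions.
from oslo_config import cfg
import oslo_messaging

transport = oslo_messaging.get_notification_transport(
    cfg.CONF, url="rabbit://guest:guest@localhost:5672/")
notifier = oslo_messaging.Notifier(
    transport,
    publisher_id="ceilometer.polling",   # matches publisher_id in the payloads
    driver="messagingv2",
    topics=["notifications"])            # the topic named in each error line

# sample() publishes at priority SAMPLE, the priority shown in the payload
# dumps; with the broker refusing connections, the messaging notify driver
# logs the failure and returns, so the caller's polling loop keeps running.
notifier.sample({}, "telemetry.polling", {"samples": []})
```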
Payload={'message_id': '210db102-6a5c-417f-80e1-29e04d7edaa3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.577247', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fe1bf50-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': '43871754d2d14ac53423ee974609208de8510087e95b4a5d8753469cb6b1bcd4'}]}, 'timestamp': '2026-02-01 09:41:03.577760', '_unique_id': 'c3188aba36a147ada5eded53bf548b07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:41:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.578 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1aa1f0ef-2087-48f0-9b2a-4e6507056d75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.580039', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fe22cb0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': 'c39d40156fc6d16769f8d37a5b1847d3ad6c1e9ed5743f683ab09663abcdc5db'}]}, 'timestamp': '2026-02-01 09:41:03.580521', '_unique_id': '0c2eae8e1e1d4e5d81fabfd393430070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:41:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.581 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.582 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.594 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.595 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
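At the bottom of every chain the failure is a plain TCP refusal from self.sock.connect(sa) in amqp/transport.py, so broker reachability can be checked independently of the whole messaging stack. A quick probe sketch; 127.0.0.1:5672 is an assumption based on the AMQP default, since these log lines do not show the broker address the agent is configured with:

```python
# TCP-level probe of the broker port; host and port are assumptions (AMQP
# defaults), as the broker address does not appear in these log lines.
import errno
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.settimeout(2)
rc = sock.connect_ex(("127.0.0.1", 5672))   # returns an errno instead of raising
sock.close()
if rc == 0:
    print("broker port accepts TCP connections")
else:
    # A refused connect yields 111 (ECONNREFUSED), matching the tracebacks.
    print(f"connect failed with errno {rc} ({errno.errorcode.get(rc, 'unknown')})")
```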
Payload={'message_id': '9b4424b3-b7ae-46a5-9506-fce82c34555f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.582898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1fe462fa-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.802391524, 'message_signature': 'bc48f76a3dde087b2cb3654368d0ee0784baa77c74f15588f217198371318441'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.582898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1fe4765a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.802391524, 'message_signature': '9711d3ffe175c9f4bf303ca827d476ee4bf42301d4482b8faf61d7e09c76fee8'}]}, 'timestamp': '2026-02-01 09:41:03.595489', '_unique_id': '421cdc1e61094d4cb1366bac0edddb5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.596 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.597 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.598 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
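Every one of these tracebacks bottoms out in the same frame: amqp/transport.py calling self.sock.connect(sa) and getting errno 111 back, meaning nothing is accepting TCP connections at the broker address. A minimal sketch of that failure mode, assuming a hypothetical host/port where no broker is listening (the address below is illustrative, not taken from this log):

```python
import errno
import socket

# Hypothetical endpoint; the real address comes from the agent's transport_url.
HOST, PORT = "127.0.0.1", 5672  # 5672 is the conventional AMQP port

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print("broker reachable")
except ConnectionRefusedError as exc:
    # The same error the amqp transport propagates up through kombu.
    assert exc.errno == errno.ECONNREFUSED  # [Errno 111] on Linux
    print(f"connection refused: {exc}")
```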
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e813ae5c-1743-410c-ab49-31fc857f4d8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:41:03.598273', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1fe7820a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.834191114, 'message_signature': '6cb28a5a0f8073734cc05c9ca14daab32a6650abd40565b71100cb7905aca39d'}]}, 'timestamp': '2026-02-01 09:41:03.615464', '_unique_id': 'e73694bafa8c44368d05c6c2f09b1645'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 11140000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3035b89-aec1-4ed3-ae11-3157a31da2e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11140000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:41:03.617718', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1fe7ec90-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.834191114, 'message_signature': 'c6dea2de1305d7ef9218d3c8cbe8c3e79a48a45e2fdb7ef023d8ce94f79143a5'}]}, 'timestamp': '2026-02-01 09:41:03.618219', '_unique_id': 'b8a0b9d0ba064631a6d6ae71593fed33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
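The separator "The above exception was the direct cause of the following exception:" in the traceback above is Python's explicit exception chaining: kombu's _reraise_as_library_errors catches the socket-level error and re-raises it as its own OperationalError via raise ... from exc, which is why the log reports kombu.exceptions.OperationalError rather than the raw ConnectionRefusedError. A minimal sketch of that wrapping pattern (the names below are illustrative, not kombu's actual implementation):

```python
from contextlib import contextmanager

class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

@contextmanager
def reraise_as_library_errors():
    # Mirrors the shape of kombu's _reraise_as_library_errors: low-level
    # transport errors are wrapped so callers only see library exceptions.
    try:
        yield
    except ConnectionRefusedError as exc:
        raise OperationalError(str(exc)) from exc  # sets __cause__

try:
    with reraise_as_library_errors():
        raise ConnectionRefusedError(111, "Connection refused")
except OperationalError as err:
    # __cause__ is what the traceback prints as "the direct cause".
    print(type(err.__cause__).__name__, "->", type(err).__name__)
```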
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
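The retry_over_time and _ensure_connection frames in the traceback above show kombu re-driving the connect attempt before giving up and letting the wrapped error surface. A rough sketch of that retry shape, assuming a fixed attempt count and pause (kombu's real helper also supports callbacks, per-step interval growth, and timeouts):

```python
import time

def retry_over_time(fun, catch, max_retries=3, interval=1.0):
    """Call fun(), retrying on the given exception types with a fixed pause.

    Illustrative only; not kombu's actual signature or behavior.
    """
    for attempt in range(max_retries + 1):
        try:
            return fun()
        except catch:
            if attempt == max_retries:
                raise  # out of retries: let the caller wrap and log it
            time.sleep(interval)

# Usage sketch: retry a connect that keeps being refused, then propagate.
# retry_over_time(lambda: socket.create_connection(addr), ConnectionRefusedError)
```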
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23e22ac6-5b08-4408-886f-e241268096bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.620460', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1fe857a2-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '0d38ddeac6dbfcc412cc2aac5bebeb9e9de6ac1af3a7d69e58330a4359b3c2c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.620460', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1fe86a6c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '3b4760cde7cc79247961a5dced0ba493c00befd0851c876fc6f6b1b80f119d9f'}]}, 'timestamp': '2026-02-01 09:41:03.621402', '_unique_id': '6f43ca832c104b42bcec71595c1599d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.623 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
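Each "Could not send notification to notifications." record is ceilometer handing a sample batch to an oslo.messaging Notifier, which drives the amqpdriver/impl_rabbit frames seen above. A rough sketch of that producer side, assuming a reachable broker; the transport URL, context, and payload here are placeholders, not values from this deployment:

```python
from oslo_config import cfg
import oslo_messaging

# Placeholder URL; the real one comes from the agent's configuration.
transport = oslo_messaging.get_notification_transport(
    cfg.CONF, url="rabbit://guest:guest@controller:5672/")

notifier = oslo_messaging.Notifier(
    transport,
    publisher_id="ceilometer.polling",
    driver="messagingv2",
    topics=["notifications"])

# sample() matches the SAMPLE priority shown in the payloads above; if the
# broker is down, this call is where OperationalError would surface.
notifier.sample({}, "telemetry.polling", {"samples": []})
```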
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9320bcc1-93fd-49a7-84ec-c7ded4873732', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.623699', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fe8d60a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': '8f2e4bbe2d85c52e445ac92faaffbc4184d15f121761bd788e18b16627ec55c2'}]}, 'timestamp': '2026-02-01 09:41:03.624219', '_unique_id': '88f2b868df804e5ab577b56c155ba574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '073eba0b-9834-42de-9373-7837c360adab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.626434', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fe940d6-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': '4c14a139e4cc24de11ce3e9f298d50e60585ea92ffb95f414831b3208475b82e'}]}, 'timestamp': '2026-02-01 09:41:03.626920', '_unique_id': '370f2fc415b2464694c3bbb7c43f1a75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6905f91e-99d8-43ce-8a3d-bc4c5e004579', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.629175', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1fe9abca-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '14ce56fd801acda6d335e867a09e65b3013b40eeefaa345ae59c630ed3b0799a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.629175', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1fe9be1c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '0c79c675160cddd0c937ca3f00542412620f090c53f9c4c127e41b4d67324aae'}]}, 'timestamp': '2026-02-01 09:41:03.630126', '_unique_id': 'a4ce953d64874bdd98acbeaeb9e541e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ccc2438-5edb-46d1-82c1-170bb298ea72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.632553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1fea2fb4-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.802391524, 'message_signature': 'e6a647e51798107657b7a7db193997a5dd2850a326284953320f49da48d03533'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.632553', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1fea41ca-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.802391524, 'message_signature': '0f6d044ffde8d86bd6e9132f95c4e7c3147fd7345014808e11280823ef17beca'}]}, 'timestamp': '2026-02-01 09:41:03.633459', '_unique_id': '513c69549234441ba6d496dcf4aa8fae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.634 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.635 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45b4b4c6-5c1d-4e0a-bf45-c95160e38772', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.635834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1feab146-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '25282a49dac8a7358fe43d54b4be5ba4a6652d21d79dce5c6f3e60837fd6a5dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.635834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1feacb9a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '3877cfb98944428512c23dabf05ccd9c6dcb5dd50fa8522722a1ef1d38341569'}]}, 'timestamp': '2026-02-01 09:41:03.637021', '_unique_id': '5cd2c1cc66b54cb8b787a2b111633416'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.638 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '991f5dae-4453-4e0f-9336-313b0ac3057d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.639252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1feb353a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '96bbb3d3dfb7467cd38455b829e03d4103732268040eccf48fce2e334b467034'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.639252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1feb45f2-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '3caff318d89730802444561d2b0f3ff020af45eac085d4220d5b477f0ac595e9'}]}, 'timestamp': '2026-02-01 09:41:03.640154', '_unique_id': '153ad16676a344ff827e2ac10e18febe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging yield
09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
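The paired tracebacks above show the complete failure path: the ceilometer notifier hands the sample batch to oslo.messaging, the rabbit driver asks its connection pool for a fresh connection, kombu tries to establish one, and the underlying socket connect() to the AMQP broker is refused with errno 111. kombu's _reraise_as_library_errors context manager then re-raises the socket-level ConnectionRefusedError as kombu.exceptions.OperationalError, which is what the notifier ultimately logs. A minimal sketch that reproduces the same exception chain when nothing is listening on the broker port; the URL, credentials and port below are hypothetical, since this excerpt never prints the agent's actual transport_url:

    # Minimal sketch (assumes kombu is installed): reproduce the error chain
    # above when no AMQP broker is listening on the given (hypothetical) URL.
    import kombu

    conn = kombu.Connection("amqp://guest:guest@localhost:5672//",
                            connect_timeout=2)
    try:
        # Fail fast instead of retrying forever; kombu re-raises the
        # socket-level ConnectionRefusedError as OperationalError, exactly
        # as in the traceback above.
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        print(f"broker unreachable: {exc}")     # [Errno 111] Connection refused
        print(f"caused by: {exc.__cause__!r}")  # the original ConnectionRefusedError
    finally:
        conn.release()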
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b04e974-3d44-4b7e-89eb-9e3415363d36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.642449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1febb208-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.802391524, 'message_signature': 'a200963f578d18f68b38d445a4ef70b2d197fdc24602af32735b8b149c7b0491'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.642449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1febc40a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.802391524, 'message_signature': '910d3b598660e4e7b1e531c1de3bf6e9f2022add8cd0f1dd99e0c67d021a8470'}]}, 'timestamp': '2026-02-01 09:41:03.643349', '_unique_id': '00a845713d2442c980b6bea5fbd4cf4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[identical kombu/oslo.messaging traceback repeated here; see the first occurrence above]
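Every notification in this excerpt fails with the same errno 111, which means the TCP handshake itself is rejected: nothing is accepting connections on the broker endpoint (RabbitMQ down, or the transport_url pointing at the wrong host or port). Before digging into oslo.messaging, a plain socket probe answers whether the endpoint is reachable at all. The host and port below are placeholders, since the log never shows the configured transport_url:

    # Plain-socket reachability probe for the AMQP endpoint. host/port are
    # placeholders; substitute the values from the agent's transport_url.
    import socket

    def amqp_reachable(host: str, port: int = 5672, timeout: float = 2.0) -> bool:
        try:
            # create_connection() performs the same connect(2) that fails in
            # amqp/transport.py above with [Errno 111] when nothing listens.
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as exc:
            print(f"{host}:{port} unreachable: {exc}")
            return False

    print(amqp_reachable("localhost"))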
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '736b76a4-1a41-4bf8-a5ca-48fef9011264', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.645540', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fec2c38-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': 'f84f4d67cfdebff4485194f50f3cd8df59135c692124f54d6dfde7e3be63fc53'}]}, 'timestamp': '2026-02-01 09:41:03.646078', '_unique_id': '272cc0effde449638b33b8e181edff0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[identical kombu/oslo.messaging traceback repeated here; see the first occurrence above]
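Each failed record carries the complete notification envelope: a message_id, publisher_id 'ceilometer.polling', event_type 'telemetry.polling', priority 'SAMPLE', and a payload holding the list of samples that were dropped. Because the agent logs the envelope as a Python dict repr, the lost samples can be recovered from the journal text itself. A sketch, using an abbreviated stand-in for one of the records above (a real record carries the full sample dicts in the same repr format):

    # Recover dropped samples from a "Could not send notification" record.
    import ast

    line = ("... ERROR oslo_messaging.notify.messaging [-] Could not send "
            "notification to notifications. Payload={'message_id': "
            "'a218a365-24b9-4f66-be8e-ba9a9b432acc', 'event_type': "
            "'telemetry.polling', 'payload': {'samples': [{'counter_name': "
            "'network.incoming.bytes.delta', 'counter_unit': 'B', "
            "'counter_volume': 0}]}}: kombu.exceptions.OperationalError: "
            "[Errno 111] Connection refused")

    def extract_payload(record: str) -> dict:
        # The dict repr starts after "Payload=" and ends just before the
        # ": kombu.exceptions.OperationalError" suffix on these records.
        start = record.index("Payload=") + len("Payload=")
        end = record.rindex(": kombu.exceptions.OperationalError")
        return ast.literal_eval(record[start:end])

    for sample in extract_payload(line)["payload"]["samples"]:
        print(sample["counter_name"], sample["counter_volume"], sample["counter_unit"])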
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.647 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.647 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a37b013-7d87-4fcc-b2dc-6377adc238f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.647660', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1fec86ec-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': 'f8b8aed871a91efe38e4815d07b32000a1e39c862ea8f1a283b783dbefae6751'}]}, 'timestamp': '2026-02-01 09:41:03.648281', '_unique_id': 'fe60e8cd1c634cabaf1dd2371e45191e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[identical kombu/oslo.messaging traceback repeated here; see the first occurrence above]
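The samples mix three counter_type values: 'cumulative' totals that only grow (disk.device.read.latency, network.outgoing.packets.error), point-in-time 'gauge' readings (disk.device.capacity), and 'delta' counters (network.incoming/outgoing.bytes.delta) that report the change since the previous poll, which is why this idle instance keeps reporting volume 0. An illustrative sketch of how a delta relates to successive cumulative readings (illustrative only, not ceilometer's actual implementation):

    # Illustrative only: how a 'delta' counter can be derived from successive
    # 'cumulative' readings of the same resource.
    from typing import Dict, Tuple

    _last: Dict[Tuple[str, str], int] = {}  # (resource_id, counter) -> last value

    def delta(resource_id: str, counter: str, cumulative: int) -> int:
        key = (resource_id, counter)
        prev = _last.get(key)
        _last[key] = cumulative
        if prev is None or cumulative < prev:   # first poll, or counter reset
            return 0
        return cumulative - prev

    rid = "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02"
    print(delta(rid, "network.incoming.bytes", 1000))  # 0 (first poll)
    print(delta(rid, "network.incoming.bytes", 1000))  # 0 (idle, as in this log)
    print(delta(rid, "network.incoming.bytes", 1500))  # 500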
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.649 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a218a365-24b9-4f66-be8e-ba9a9b432acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:41:03.649661', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1feccb7a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.74617077, 'message_signature': '014c2d2e544240a4a9a2525cede3c2deb2890c1ef65734bd75ee4dea35075a0e'}]}, 'timestamp': '2026-02-01 09:41:03.650137', '_unique_id': '1e0ae46a0d404983b2200405d339705e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[identical kombu/oslo.messaging traceback repeated here; see the first occurrence above]
Payload={'message_id': '8205c656-f5d4-486c-96a6-eecc1611966d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:41:03.651569', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1fed1300-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '1148b37e4fb1c24573295cdb2613e5a3e63b828a116be30302e61ec1e90641d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:41:03.651569', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1fed1d1e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 10957.759200337, 'message_signature': '4921c43b4ebf0bec34f8435140855c849fac1455fedcbe13c2af02bd36070832'}]}, 'timestamp': '2026-02-01 09:41:03.652128', '_unique_id': '6238c61217814115b1b48caa5c479b82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:41:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:41:03.652 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:41:04 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571d57471e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 1 04:41:06 localhost nova_compute[274651]: 2026-02-01 09:41:06.610 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:06 localhost nova_compute[274651]: 2026-02-01 09:41:06.612 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:06 localhost nova_compute[274651]: 2026-02-01 09:41:06.612 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:41:06 localhost nova_compute[274651]: 2026-02-01 09:41:06.612 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:06 localhost nova_compute[274651]: 2026-02-01 09:41:06.658 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:41:06 localhost nova_compute[274651]: 2026-02-01 09:41:06.659 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:09 localhost systemd[1]: Starting dnf makecache... Feb 1 04:41:10 localhost podman[279838]: Feb 1 04:41:10 localhost podman[279838]: 2026-02-01 09:41:10.155778141 +0000 UTC m=+0.062367883 container create dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_rosalind, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 04:41:10 localhost systemd[1]: Started libpod-conmon-dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05.scope. Feb 1 04:41:10 localhost systemd[1]: Started libcrun container. 
Feb 1 04:41:10 localhost podman[279838]: 2026-02-01 09:41:10.128443577 +0000 UTC m=+0.035033329 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:41:10 localhost podman[279838]: 2026-02-01 09:41:10.231580512 +0000 UTC m=+0.138170244 container init dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_rosalind, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.expose-services=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:41:10 localhost systemd[1]: tmp-crun.IdYVWa.mount: Deactivated successfully. Feb 1 04:41:10 localhost podman[279838]: 2026-02-01 09:41:10.250483449 +0000 UTC m=+0.157073181 container start dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_rosalind, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, RELEASE=main, release=1764794109, vcs-type=git, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_CLEAN=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:41:10 localhost podman[279838]: 2026-02-01 09:41:10.250811799 +0000 UTC m=+0.157401531 container attach dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_rosalind, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured 
and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1764794109, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7) Feb 1 04:41:10 localhost systemd[1]: libpod-dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05.scope: Deactivated successfully. Feb 1 04:41:10 localhost funny_rosalind[279852]: 167 167 Feb 1 04:41:10 localhost podman[279838]: 2026-02-01 09:41:10.258113281 +0000 UTC m=+0.164703063 container died dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_rosalind, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git) Feb 1 04:41:10 localhost dnf[279810]: Updating Subscription Management repositories. Feb 1 04:41:10 localhost dnf[279810]: Unable to read consumer identity Feb 1 04:41:10 localhost dnf[279810]: This system is not registered with an entitlement server. You can use subscription-manager to register. 
Feb 1 04:41:10 localhost podman[279857]: 2026-02-01 09:41:10.339758292 +0000 UTC m=+0.075117702 container remove dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_rosalind, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, build-date=2025-12-08T17:28:53Z, ceph=True, version=7, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph) Feb 1 04:41:10 localhost dnf[279810]: Metadata cache refreshed recently. Feb 1 04:41:10 localhost systemd[1]: libpod-conmon-dc7878845c14ea1acafe147d1a8993164ca56a152eccb206d60bd9abdec75d05.scope: Deactivated successfully. Feb 1 04:41:10 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Feb 1 04:41:10 localhost systemd[1]: Finished dnf makecache. Feb 1 04:41:10 localhost podman[279873]: Feb 1 04:41:10 localhost podman[279873]: 2026-02-01 09:41:10.433559882 +0000 UTC m=+0.055831194 container create 0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_germain, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., RELEASE=main, version=7, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:41:10 localhost systemd[1]: Started libpod-conmon-0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a.scope. Feb 1 04:41:10 localhost systemd[1]: Started libcrun container. 
Feb 1 04:41:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/137c0e318cb359bc29595c0d0be99972b1e51373ffbb77d2b9a02614884a74d1/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/137c0e318cb359bc29595c0d0be99972b1e51373ffbb77d2b9a02614884a74d1/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/137c0e318cb359bc29595c0d0be99972b1e51373ffbb77d2b9a02614884a74d1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/137c0e318cb359bc29595c0d0be99972b1e51373ffbb77d2b9a02614884a74d1/merged/var/lib/ceph/mon/ceph-np0005604212 supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:10 localhost podman[279873]: 2026-02-01 09:41:10.503129494 +0000 UTC m=+0.125400796 container init 0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_germain, GIT_CLEAN=True, version=7, name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109) Feb 1 04:41:10 localhost podman[279873]: 2026-02-01 09:41:10.411735556 +0000 UTC m=+0.034006868 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:41:10 localhost podman[279873]: 2026-02-01 09:41:10.521538146 +0000 UTC m=+0.143809478 container start 0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_germain, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, ceph=True, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64) Feb 1 04:41:10 localhost podman[279873]: 2026-02-01 09:41:10.522200836 +0000 UTC m=+0.144472208 container attach 0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_germain, distribution-scope=public, io.openshift.expose-services=, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7) Feb 1 04:41:10 localhost podman[279873]: 2026-02-01 09:41:10.624941578 +0000 UTC m=+0.247212930 container died 0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_germain, ceph=True, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph) Feb 1 04:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:41:10 localhost systemd[1]: libpod-0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a.scope: Deactivated successfully. 
Feb 1 04:41:10 localhost podman[279916]: 2026-02-01 09:41:10.729342023 +0000 UTC m=+0.085126368 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:41:10 localhost podman[279916]: 2026-02-01 09:41:10.76859919 +0000 UTC m=+0.124383545 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:41:10 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:41:10 localhost podman[279915]: 2026-02-01 09:41:10.845711771 +0000 UTC m=+0.202827756 container remove 0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_germain, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, release=1764794109, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux )
Feb 1 04:41:10 localhost systemd[1]: libpod-conmon-0e2520ca64f33944f0960def69fe46f49f1fe20722761583f11b0a84b2f52c7a.scope: Deactivated successfully.
Feb 1 04:41:10 localhost systemd[1]: Reloading.
Feb 1 04:41:10 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571d5746f20 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 1 04:41:11 localhost systemd-rc-local-generator[279977]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:41:11 localhost systemd-sysv-generator[279982]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: var-lib-containers-storage-overlay-40d2023676145003f3a2c19cb20308a4b6713777810188a9e82ebfce6b9e8ef6-merged.mount: Deactivated successfully.
Feb 1 04:41:11 localhost systemd[1]: Reloading.
Feb 1 04:41:11 localhost nova_compute[274651]: 2026-02-01 09:41:11.660 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:11 localhost nova_compute[274651]: 2026-02-01 09:41:11.664 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:11 localhost nova_compute[274651]: 2026-02-01 09:41:11.664 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:41:11 localhost nova_compute[274651]: 2026-02-01 09:41:11.665 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:11 localhost nova_compute[274651]: 2026-02-01 09:41:11.690 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:41:11 localhost nova_compute[274651]: 2026-02-01 09:41:11.691 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:11 localhost systemd-rc-local-generator[280025]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:41:11 localhost systemd-sysv-generator[280028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:41:11 localhost systemd[1]: Starting Ceph mon.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e... Feb 1 04:41:12 localhost podman[280081]: Feb 1 04:41:12 localhost podman[280081]: 2026-02-01 09:41:12.311065188 +0000 UTC m=+0.066961373 container create 8e57e0bb971232fdfcc63e8a0793c2c5695b574e6782300f0926374dc6f8460f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, distribution-scope=public, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Feb 1 04:41:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/184609069e024ddd5f317f7b50559e6dcd8146dbca3c28ede35f3bafe047c425/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/184609069e024ddd5f317f7b50559e6dcd8146dbca3c28ede35f3bafe047c425/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/184609069e024ddd5f317f7b50559e6dcd8146dbca3c28ede35f3bafe047c425/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/184609069e024ddd5f317f7b50559e6dcd8146dbca3c28ede35f3bafe047c425/merged/var/lib/ceph/mon/ceph-np0005604212 supports timestamps until 2038 (0x7fffffff) Feb 1 04:41:12 localhost podman[280081]: 2026-02-01 09:41:12.362186328 +0000 UTC m=+0.118082513 container init 8e57e0bb971232fdfcc63e8a0793c2c5695b574e6782300f0926374dc6f8460f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, description=Red Hat Ceph 
Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, architecture=x86_64, RELEASE=main) Feb 1 04:41:12 localhost podman[280081]: 2026-02-01 09:41:12.371943035 +0000 UTC m=+0.127839210 container start 8e57e0bb971232fdfcc63e8a0793c2c5695b574e6782300f0926374dc6f8460f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4) Feb 1 04:41:12 localhost bash[280081]: 8e57e0bb971232fdfcc63e8a0793c2c5695b574e6782300f0926374dc6f8460f Feb 1 04:41:12 localhost podman[280081]: 2026-02-01 09:41:12.278577658 +0000 UTC m=+0.034473863 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:41:12 localhost systemd[1]: Started Ceph mon.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e. 
Feb 1 04:41:12 localhost ceph-mon[280099]: set uid:gid to 167:167 (ceph:ceph)
Feb 1 04:41:12 localhost ceph-mon[280099]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Feb 1 04:41:12 localhost ceph-mon[280099]: pidfile_write: ignore empty --pid-file
Feb 1 04:41:12 localhost ceph-mon[280099]: load: jerasure load: lrc
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: RocksDB version: 7.9.2
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Git sha 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: DB SUMMARY
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: DB Session ID: QKHSW2QZX39H1UYTI42X
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: CURRENT file: CURRENT
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: IDENTITY file: IDENTITY
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005604212/store.db dir, Total Num: 0, files:
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005604212/store.db: 000004.log size: 886 ;
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.error_if_exists: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.create_if_missing: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.paranoid_checks: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.env: 0x55905ebb69e0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.fs: PosixFileSystem
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.info_log: 0x55906091ad20
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_file_opening_threads: 16
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.statistics: (nil)
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.use_fsync: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_log_file_size: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.log_file_time_to_roll: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.keep_log_file_num: 1000
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.recycle_log_file_num: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.allow_fallocate: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.allow_mmap_reads: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.allow_mmap_writes: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.use_direct_reads: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.create_missing_column_families: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.db_log_dir:
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.wal_dir:
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.table_cache_numshardbits: 6
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.advise_random_on_open: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.db_write_buffer_size: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.write_buffer_manager: 0x55906092b540
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.use_adaptive_mutex: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.rate_limiter: (nil)
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.wal_recovery_mode: 2
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.enable_thread_tracking: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.enable_pipelined_write: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.unordered_write: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.row_cache: None
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.wal_filter: None
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.allow_ingest_behind: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.two_write_queues: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.manual_wal_flush: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.wal_compression: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.atomic_flush: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.persist_stats_to_disk: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.log_readahead_size: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.file_checksum_gen_factory: Unknown
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.best_efforts_recovery: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.allow_data_in_errors: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.db_host_id: __hostname__
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.enforce_single_del_contracts: true
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_background_jobs: 2
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_background_compactions: -1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_subcompactions: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.avoid_flush_during_shutdown: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.writable_file_max_buffer_size: 1048576
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.delayed_write_rate : 16777216
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_total_wal_size: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.stats_dump_period_sec: 600
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.stats_persist_period_sec: 600
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.stats_history_buffer_size: 1048576
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_open_files: -1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bytes_per_sync: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.wal_bytes_per_sync: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.strict_bytes_per_sync: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_readahead_size: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_background_flushes: -1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Compression algorithms supported:
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kZSTD supported: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kXpressCompression supported: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kBZip2Compression supported: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kLZ4Compression supported: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kZlibCompression supported: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kLZ4HCCompression supported: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: #011kSnappyCompression supported: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005604212/store.db/MANIFEST-000005
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.merge_operator:
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_filter: None
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_filter_factory: None
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.sst_partitioner_factory: None
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.table_factory: BlockBasedTable
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55906091a980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x559060917350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.write_buffer_size: 33554432
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_write_buffer_number: 2
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression: NoCompression
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression: Disabled
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.prefix_extractor: nullptr
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.num_levels: 7
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.min_write_buffer_number_to_merge: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.window_bits: -14
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.level: 32767
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.strategy: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 1 04:41:12
localhost ceph-mon[280099]: rocksdb: Options.compression_opts.enabled: false Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_base: 268435456 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.arena_block_size: 1048576 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: 
Options.table_properties_collectors: Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.inplace_update_support: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.bloom_locality: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.max_successive_merges: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.force_consistency_checks: 1 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.ttl: 2592000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.enable_blob_files: false Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.min_blob_size: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.blob_file_size: 268435456 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005604212/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 299b1ac1-027d-4648-ba6b-b2681374d96d Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938872427727, "job": 1, "event": "recovery_started", "wal_files": [4]} Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938872430132, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938872, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "299b1ac1-027d-4648-ba6b-b2681374d96d", "db_session_id": "QKHSW2QZX39H1UYTI42X", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938872430263, "job": 1, "event": "recovery_finished"} Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55906093ee00 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: DB pointer 0x559060a34000 Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:41:12 localhost ceph-mon[280099]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x559060917350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212 does not exist in monmap, will attempt to join an existing cluster Feb 1 04:41:12 localhost ceph-mon[280099]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] Feb 1 04:41:12 localhost ceph-mon[280099]: starting mon.np0005604212 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005604212 fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(???) 
e0 preinit fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing) e5 sync_obtain_latest_monmap Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5 Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).mds e16 new map Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-01T07:59:04.480309+0000#012modified#0112026-02-01T09:39:55.510678+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01179#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26329}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26329 members: 26329#012[mds.mds.np0005604212.tkdkxt{0:26329} state up:active seq 12 addr [v2:172.18.0.106:6808/1133321306,v1:172.18.0.106:6809/1133321306] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005604215.rwvxvg{-1:16872} state up:standby seq 1 addr [v2:172.18.0.108:6808/2262553558,v1:172.18.0.108:6809/2262553558] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005604213.jdbvyh{-1:16878} state up:standby seq 1 addr [v2:172.18.0.107:6808/3323601884,v1:172.18.0.107:6809/3323601884] compat {c=[1],r=[1],i=[17ff]}] Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).osd e81 crush map has features 3314933000852226048, adjusting msgr requires Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires Feb 1 04:41:12 localhost ceph-mon[280099]: Removing key for mds.mds.np0005604210.yulljq Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005604210.yulljq"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005604210.yulljq"}]': finished Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 
172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Removing daemon mds.mds.np0005604211.ggsxcc from np0005604211.localdomain -- ports [] Feb 1 04:41:12 localhost ceph-mon[280099]: Removing key for mds.mds.np0005604211.ggsxcc Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005604211.ggsxcc"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005604211.ggsxcc"}]': finished Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mgr to host np0005604212.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mgr to host np0005604213.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 
172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mgr to host np0005604215.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Saving service mgr spec with placement label:mgr Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 1 04:41:12 localhost ceph-mon[280099]: Deploying daemon mgr.np0005604212.oynhpm on np0005604212.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 1 04:41:12 localhost ceph-mon[280099]: Deploying daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mon to host np0005604209.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label _admin to host np0005604209.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 
04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 1 04:41:12 localhost ceph-mon[280099]: Deploying daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mon to host np0005604210.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label _admin to host np0005604210.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mon to host np0005604211.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label _admin to host np0005604211.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost 
ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mon to host np0005604212.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label _admin to host np0005604212.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mon to host np0005604213.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label _admin to host np0005604213.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label mon to host np0005604215.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:41:12 localhost ceph-mon[280099]: 
from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Added label _admin to host np0005604215.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: Saving service mon spec with placement label:mon Feb 1 04:41:12 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: Deploying daemon mon.np0005604215 on np0005604215.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: Deploying daemon mon.np0005604213 on np0005604213.localdomain Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604209 calling monitor election Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604211 calling monitor election Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604210 calling monitor election Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604215 calling monitor election Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604209 is new leader, mons np0005604209,np0005604211,np0005604210,np0005604215 in quorum (ranks 0,1,2,3) Feb 1 04:41:12 localhost ceph-mon[280099]: overall HEALTH_OK Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:41:12 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:41:12 localhost ceph-mon[280099]: mon.np0005604212@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Feb 1 04:41:16 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571d5747600 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Feb 1 04:41:16 localhost nova_compute[274651]: 2026-02-01 09:41:16.692 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:41:16 localhost nova_compute[274651]: 2026-02-01 09:41:16.694 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:41:16 localhost nova_compute[274651]: 2026-02-01 09:41:16.695 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:41:16 localhost nova_compute[274651]: 2026-02-01 09:41:16.695 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:16 localhost nova_compute[274651]: 2026-02-01 09:41:16.740 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:41:16 localhost nova_compute[274651]: 2026-02-01 09:41:16.741 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:41:16 localhost podman[280174]: 2026-02-01 09:41:16.898656551 +0000 UTC m=+0.071178422 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 04:41:16 localhost podman[280174]: 2026-02-01 09:41:16.907415558 +0000 UTC m=+0.079937379 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:41:16 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:41:17 localhost podman[280281]: 2026-02-01 09:41:17.686039843 +0000 UTC m=+0.090418039 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, release=1764794109, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:41:17 localhost podman[280281]: 2026-02-01 09:41:17.81320115 +0000 UTC m=+0.217579396 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, vendor=Red Hat, Inc., ceph=True, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109) Feb 1 04:41:18 localhost ceph-mon[280099]: mon.np0005604212@-1(probing) e6 my rank is now 5 (was -1) Feb 1 04:41:18 localhost ceph-mon[280099]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election Feb 1 04:41:18 localhost ceph-mon[280099]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 Feb 1 04:41:18 localhost ceph-mon[280099]: mon.np0005604212@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:41:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:41:19 localhost systemd[1]: tmp-crun.sT4YH5.mount: Deactivated successfully. Feb 1 04:41:19 localhost podman[280403]: 2026-02-01 09:41:19.73080828 +0000 UTC m=+0.089299675 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:41:19 localhost podman[280402]: 2026-02-01 09:41:19.770869021 +0000 UTC m=+0.132299635 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:41:19 localhost podman[280402]: 2026-02-01 09:41:19.78131604 +0000 UTC m=+0.142746654 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:41:19 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:41:19 localhost podman[280403]: 2026-02-01 09:41:19.839151634 +0000 UTC m=+0.197643009 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller) Feb 1 04:41:19 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604209 calling monitor election Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604211 calling monitor election Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604210 calling monitor election Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604215 calling monitor election Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604213 calling monitor 
election
Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604209 is new leader, mons np0005604209,np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3,4)
Feb 1 04:41:21 localhost ceph-mon[280099]: overall HEALTH_OK
Feb 1 04:41:21 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:21 localhost ceph-mon[280099]: mon.np0005604212@5(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:41:21 localhost ceph-mon[280099]: mgrc update_daemon_metadata mon.np0005604212 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005604212.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005604212.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 1 04:41:21 localhost nova_compute[274651]: 2026-02-01 09:41:21.771 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:21 localhost nova_compute[274651]: 2026-02-01 09:41:21.773 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:21 localhost nova_compute[274651]: 2026-02-01 09:41:21.773 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:41:21 localhost nova_compute[274651]: 2026-02-01 09:41:21.774 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:21 localhost nova_compute[274651]: 2026-02-01 09:41:21.774 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:21 localhost nova_compute[274651]: 2026-02-01 09:41:21.777 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604215 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604211 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604210 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604213 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604212 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604215 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604211 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604210 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: Health check failed: 2/6 mons down, quorum np0005604209,np0005604211,np0005604210,np0005604215 (MON_DOWN)
Feb 1 04:41:22 localhost ceph-mon[280099]: overall HEALTH_OK
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604209 calling monitor election
Feb 1 04:41:22 localhost ceph-mon[280099]: mon.np0005604209 is new leader, mons np0005604209,np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4,5)
Feb 1 04:41:22 localhost ceph-mon[280099]: Health check cleared: MON_DOWN (was: 2/6 mons down, quorum np0005604209,np0005604211,np0005604210,np0005604215)
Feb 1 04:41:22 localhost ceph-mon[280099]: Cluster is now healthy
Feb 1 04:41:22 localhost ceph-mon[280099]: overall HEALTH_OK
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:22 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:41:22 localhost ceph-mon[280099]: Updating np0005604209.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:22 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:22 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:22 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:22 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:22 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:23 localhost podman[236886]: time="2026-02-01T09:41:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:41:23 localhost podman[236886]: @ - - [01/Feb/2026:09:41:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:41:24 localhost podman[236886]: @ - - [01/Feb/2026:09:41:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18776 "" "Go-http-client/1.1"
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:24 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:25 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:25 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:25 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:25 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:25 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:41:26 localhost ceph-mon[280099]: Reconfiguring mon.np0005604209 (monmap changed)...
Feb 1 04:41:26 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604209 on np0005604209.localdomain
Feb 1 04:41:26 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:26 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:26 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604209.isqrps", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:41:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:41:26 localhost systemd[1]: tmp-crun.4AKOx1.mount: Deactivated successfully.
Feb 1 04:41:26 localhost podman[280788]: 2026-02-01 09:41:26.739967771 +0000 UTC m=+0.100353251 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=9.7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1769056855, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, config_id=openstack_network_exporter)
Feb 1 04:41:26 localhost podman[280788]: 2026-02-01 09:41:26.755779924 +0000 UTC m=+0.116165364 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Feb 1 04:41:26 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:41:26 localhost nova_compute[274651]: 2026-02-01 09:41:26.778 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:26 localhost nova_compute[274651]: 2026-02-01 09:41:26.780 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:26 localhost nova_compute[274651]: 2026-02-01 09:41:26.780 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:41:26 localhost nova_compute[274651]: 2026-02-01 09:41:26.780 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:26 localhost nova_compute[274651]: 2026-02-01 09:41:26.812 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:41:26 localhost nova_compute[274651]: 2026-02-01 09:41:26.813 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:41:27 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604209.isqrps (monmap changed)...
Feb 1 04:41:27 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604209.isqrps on np0005604209.localdomain
Feb 1 04:41:27 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:27 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:27 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:27 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604209.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:41:28 localhost ceph-mon[280099]: Reconfiguring crash.np0005604209 (monmap changed)...
Feb 1 04:41:28 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604209 on np0005604209.localdomain
Feb 1 04:41:28 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:28 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:28 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:41:29 localhost ceph-mon[280099]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 1 04:41:29 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 1 04:41:29 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:29 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:29 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:41:30 localhost ceph-mon[280099]: Reconfiguring mon.np0005604210 (monmap changed)...
Feb 1 04:41:30 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain
Feb 1 04:41:30 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:30 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps'
Feb 1 04:41:30 localhost ceph-mon[280099]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:41:30 localhost ceph-mon[280099]: mon.np0005604212@5(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 1 04:41:30 localhost ceph-mon[280099]: mon.np0005604212@5(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 1 04:41:30 localhost ceph-mon[280099]: mon.np0005604212@5(peon).osd e82 e82: 6 total, 6 up, 6 in
Feb 1 04:41:30 localhost systemd[1]: session-22.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-17.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-26.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-26.scope: Consumed 3min 22.319s CPU time.
Feb 1 04:41:30 localhost systemd[1]: session-16.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-25.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-24.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 24 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 16 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 22 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 26 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 25 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 17 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd[1]: session-18.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-21.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 21 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd[1]: session-14.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 18 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd[1]: session-19.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 14 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 19 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 22.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 17.
Feb 1 04:41:30 localhost systemd[1]: session-20.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd[1]: session-23.scope: Deactivated successfully.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 23 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Session 20 logged out. Waiting for processes to exit.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 26.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 16.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 25.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 24.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 18.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 21.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 14.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 19.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 20.
Feb 1 04:41:30 localhost systemd-logind[759]: Removed session 23.
Feb 1 04:41:30 localhost sshd[280808]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:41:30 localhost systemd-logind[759]: New session 64 of user ceph-admin.
Feb 1 04:41:30 localhost systemd[1]: Started Session 64 of User ceph-admin.
Feb 1 04:41:31 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 1 04:41:31 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 1 04:41:31 localhost ceph-mon[280099]: from='client.? 172.18.0.103:0/3887042624' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:41:31 localhost ceph-mon[280099]: Activating manager daemon np0005604211.cuflqz
Feb 1 04:41:31 localhost ceph-mon[280099]: from='client.? 172.18.0.103:0/3887042624' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 1 04:41:31 localhost ceph-mon[280099]: Manager daemon np0005604211.cuflqz is now available
Feb 1 04:41:31 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch
Feb 1 04:41:31 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch
Feb 1 04:41:31 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch
Feb 1 04:41:31 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch
Feb 1 04:41:31 localhost openstack_network_exporter[239441]: ERROR 09:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:41:31 localhost openstack_network_exporter[239441]:
Feb 1 04:41:31 localhost openstack_network_exporter[239441]: ERROR 09:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:41:31 localhost openstack_network_exporter[239441]:
Feb 1 04:41:31 localhost nova_compute[274651]: 2026-02-01 09:41:31.812 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:41:31 localhost nova_compute[274651]: 2026-02-01 09:41:31.815 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:41:31 localhost podman[280916]: 2026-02-01 09:41:31.995091042 +0000 UTC m=+0.083632802 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, CEPH_POINT_RELEASE=, release=1764794109, com.redhat.component=rhceph-container, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , vcs-type=git, vendor=Red Hat, Inc., version=7, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:41:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:41:32 localhost systemd[1]: tmp-crun.iv6od7.mount: Deactivated successfully.
Feb 1 04:41:32 localhost podman[280934]: 2026-02-01 09:41:32.115271787 +0000 UTC m=+0.099545547 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 1 04:41:32 localhost podman[280916]: 2026-02-01 09:41:32.144595201 +0000 UTC m=+0.233136961 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.openshift.tags=rhceph ceph, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Feb 1 04:41:32 localhost podman[280934]: 2026-02-01 09:41:32.158078542 +0000 UTC m=+0.142352302 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Feb 1 04:41:32 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:41:32 localhost ceph-mon[280099]: mon.np0005604212@5(peon).osd e82 _set_new_cache_sizes cache_size:1019731223 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:41:32 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:32 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: [01/Feb/2026:09:41:32] ENGINE Bus STARTING
Feb 1 04:41:33 localhost ceph-mon[280099]: [01/Feb/2026:09:41:32] ENGINE Serving on https://172.18.0.105:7150
Feb 1 04:41:33 localhost ceph-mon[280099]: [01/Feb/2026:09:41:32] ENGINE Client ('172.18.0.105', 36928) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:41:33 localhost ceph-mon[280099]: [01/Feb/2026:09:41:32] ENGINE Serving on http://172.18.0.105:8765
Feb 1 04:41:33 localhost ceph-mon[280099]: [01/Feb/2026:09:41:32] ENGINE Bus STARTED
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:33 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604209", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604209", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:41:35 localhost ceph-mon[280099]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 1 04:41:35 localhost ceph-mon[280099]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:41:35 localhost ceph-mon[280099]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:41:35 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:41:35 localhost ceph-mon[280099]: Updating np0005604209.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:35 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:35 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:35 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:35 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:35 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:41:36 localhost ceph-mon[280099]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:36 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:36 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:36 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:36 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:36 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:41:36 localhost nova_compute[274651]: 2026-02-01 09:41:36.817 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:41:37 localhost ceph-mon[280099]: mon.np0005604212@5(peon).osd e82 _set_new_cache_sizes cache_size:1020047763 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604209.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:37 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:38 localhost nova_compute[274651]: 2026-02-01 09:41:38.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:38 localhost nova_compute[274651]: 2026-02-01 09:41:38.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:38 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:41:38 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:41:39 localhost nova_compute[274651]: 2026-02-01 09:41:39.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:39 localhost nova_compute[274651]: 2026-02-01 09:41:39.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:39 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 1 04:41:39 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 1 04:41:39 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:39 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:39 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:41:40 localhost ceph-mon[280099]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 1 04:41:40 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 1 04:41:40 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:40 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:40 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:41:40 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.877 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.877 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.878 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:41:40 localhost nova_compute[274651]: 2026-02-01 09:41:40.878 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.348 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.402 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.402 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.403 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.403 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.404 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.404 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.427 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.427 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.428 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.428 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.429 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:41:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:41:41 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:41:41 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:41:41 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:41:41 localhost systemd[1]: tmp-crun.mDf3rW.mount: Deactivated successfully.
Feb 1 04:41:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:41:41.703 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:41:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:41:41.703 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:41:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:41:41.703 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:41:41 localhost podman[281857]: 2026-02-01 09:41:41.711027865 +0000 UTC m=+0.066002613 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:41:41 localhost podman[281857]: 2026-02-01 09:41:41.721321739 +0000 UTC m=+0.076296487 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:41:41 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.820 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.821 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.822 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.822 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.823 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.824 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.895 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.997 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:41:41 localhost nova_compute[274651]: 2026-02-01 09:41:41.997 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.163 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.165 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11544MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.165 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.165 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:41:42 localhost podman[281933]:
Feb 1 04:41:42 localhost podman[281933]: 2026-02-01 09:41:42.184930547 +0000 UTC m=+0.072285935 container create e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_payne, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1764794109, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=)
Feb 1 04:41:42 localhost systemd[1]: Started libpod-conmon-e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265.scope.
Feb 1 04:41:42 localhost systemd[1]: Started libcrun container.
Feb 1 04:41:42 localhost podman[281933]: 2026-02-01 09:41:42.145423242 +0000 UTC m=+0.032778570 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.256 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:41:42 localhost podman[281933]: 2026-02-01 09:41:42.256924943 +0000 UTC m=+0.144280291 container init e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_payne, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.257 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.257 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:41:42 localhost podman[281933]: 2026-02-01 09:41:42.26468067 +0000 UTC m=+0.152035978 container start e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_payne, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.buildah.version=1.41.4, GIT_BRANCH=main, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-12-08T17:28:53Z, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public)
Feb 1 04:41:42 localhost podman[281933]: 2026-02-01 09:41:42.264932127 +0000 UTC m=+0.152287515 container attach e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_payne, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph)
Feb 1 04:41:42 localhost suspicious_payne[281948]: 167 167
Feb 1 04:41:42 localhost systemd[1]: libpod-e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265.scope: Deactivated successfully.
Feb 1 04:41:42 localhost podman[281933]: 2026-02-01 09:41:42.271662632 +0000 UTC m=+0.159017960 container died e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_payne, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, version=7, RELEASE=main, io.buildah.version=1.41.4, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, io.openshift.expose-services=, release=1764794109, GIT_BRANCH=main, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.294 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:41:42 localhost podman[281953]: 2026-02-01 09:41:42.373823148 +0000 UTC m=+0.088562232 container remove e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_payne, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux )
Feb 1 04:41:42 localhost systemd[1]: libpod-conmon-e3f02dad3789836d12afcaaa052dd1d049cbc201a444b325e3749ea176089265.scope: Deactivated successfully.
Feb 1 04:41:42 localhost ceph-mon[280099]: mon.np0005604212@5(peon).osd e82 _set_new_cache_sizes cache_size:1020054578 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:41:42 localhost ceph-mon[280099]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:41:42 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:41:42 localhost ceph-mon[280099]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:41:42 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:41:42 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:42 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:42 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:41:42 localhost systemd[1]: var-lib-containers-storage-overlay-278f0a3a68fe26162bc10445294ce7641f8958e2845ca32b92e2c70a0cebe046-merged.mount: Deactivated successfully.
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.747 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.755 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.845 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.848 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:41:42 localhost nova_compute[274651]: 2026-02-01 09:41:42.848 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:41:43 localhost podman[282042]:
Feb 1 04:41:43 localhost podman[282042]: 2026-02-01 09:41:43.090451132 +0000 UTC m=+0.080658720 container create 2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_zhukovsky, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main)
Feb 1 04:41:43 localhost systemd[1]: Started libpod-conmon-2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644.scope.
Feb 1 04:41:43 localhost systemd[1]: Started libcrun container.
Feb 1 04:41:43 localhost podman[282042]: 2026-02-01 09:41:43.153184565 +0000 UTC m=+0.143392163 container init 2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_zhukovsky, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , ceph=True, version=7, io.buildah.version=1.41.4, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:41:43 localhost podman[282042]: 2026-02-01 09:41:43.058956521 +0000 UTC m=+0.049164149 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:41:43 localhost podman[282042]: 2026-02-01 09:41:43.16254026 +0000 UTC m=+0.152747848 container start 2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_zhukovsky, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux )
Feb 1 04:41:43 localhost podman[282042]: 2026-02-01 09:41:43.16286276 +0000 UTC m=+0.153070408 container attach 2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_zhukovsky, distribution-scope=public, ceph=True, release=1764794109, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, io.buildah.version=1.41.4, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:41:43 localhost upbeat_zhukovsky[282057]: 167 167
Feb 1 04:41:43 localhost systemd[1]: libpod-2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644.scope: Deactivated successfully.
Feb 1 04:41:43 localhost podman[282042]: 2026-02-01 09:41:43.165763678 +0000 UTC m=+0.155971286 container died 2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_zhukovsky, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-type=git, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, name=rhceph, ceph=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 04:41:43 localhost podman[282062]: 2026-02-01 09:41:43.265642175 +0000 UTC m=+0.087669075 container remove 2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_zhukovsky, vcs-type=git, version=7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 1 04:41:43 localhost systemd[1]: libpod-conmon-2b195f9308aa76f569527ee9635f17dd3fb081d3add0ddcd35c902ebb680e644.scope: Deactivated successfully.
Feb 1 04:41:43 localhost systemd[1]: var-lib-containers-storage-overlay-14be2d403e0d6093b9ac6e5ac886f54ed57607f438ef8c9c3373e8ed0f9a812c-merged.mount: Deactivated successfully.
Feb 1 04:41:43 localhost ceph-mon[280099]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:41:43 localhost ceph-mon[280099]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:41:43 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:43 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:43 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:41:43 localhost nova_compute[274651]: 2026-02-01 09:41:43.715 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:41:43 localhost nova_compute[274651]: 2026-02-01 09:41:43.742 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:41:44 localhost podman[282140]:
Feb 1 04:41:44 localhost podman[282140]: 2026-02-01 09:41:44.142422443 +0000 UTC m=+0.071841992 container create 9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_volhard, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., RELEASE=main, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph)
Feb 1 04:41:44 localhost systemd[1]: Started libpod-conmon-9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254.scope.
Feb 1 04:41:44 localhost systemd[1]: Started libcrun container.
Feb 1 04:41:44 localhost podman[282140]: 2026-02-01 09:41:44.121096542 +0000 UTC m=+0.050516081 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:41:44 localhost podman[282140]: 2026-02-01 09:41:44.22592926 +0000 UTC m=+0.155348799 container init 9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_volhard, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, vcs-type=git, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, RELEASE=main, ceph=True, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:41:44 localhost podman[282140]: 2026-02-01 09:41:44.235570733 +0000 UTC m=+0.164990282 container start 9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_volhard, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, distribution-scope=public, release=1764794109, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=)
Feb 1 04:41:44 localhost podman[282140]: 2026-02-01 09:41:44.236041187 +0000 UTC m=+0.165460766 container attach 9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_volhard, build-date=2025-12-08T17:28:53Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, version=7, vcs-type=git, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 1 04:41:44 localhost quizzical_volhard[282155]: 167 167
Feb 1 04:41:44 localhost systemd[1]: libpod-9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254.scope: Deactivated successfully.
Feb 1 04:41:44 localhost podman[282140]: 2026-02-01 09:41:44.241210795 +0000 UTC m=+0.170630364 container died 9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_volhard, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_CLEAN=True, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4)
Feb 1 04:41:44 localhost podman[282160]: 2026-02-01 09:41:44.344747123 +0000 UTC m=+0.092202323 container remove 9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_volhard, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, release=1764794109, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z)
Feb 1 04:41:44 localhost systemd[1]: libpod-conmon-9d1f476d26737fc4b59589356e0bf7ac4186308fb25c4365403290f47d328254.scope: Deactivated successfully.
Feb 1 04:41:44 localhost systemd[1]: var-lib-containers-storage-overlay-a6d82d747409d0aa4af2d0def567bc7a221060bbcf02da96a374a2e926e415ac-merged.mount: Deactivated successfully.
Feb 1 04:41:44 localhost ceph-mon[280099]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:41:44 localhost ceph-mon[280099]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:41:44 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:44 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:41:44 localhost ceph-mon[280099]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:41:44 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:41:44 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:41:44 localhost ceph-mon[280099]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:41:45 localhost podman[282236]:
Feb 1 04:41:45 localhost podman[282236]: 2026-02-01 09:41:45.200220382 +0000 UTC m=+0.085393976 container create b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_yonath, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph)
Feb 1 04:41:45 localhost systemd[1]: Started libpod-conmon-b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a.scope.
Feb 1 04:41:45 localhost systemd[1]: Started libcrun container.
Feb 1 04:41:45 localhost podman[282236]: 2026-02-01 09:41:45.166314588 +0000 UTC m=+0.051488202 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:41:45 localhost podman[282236]: 2026-02-01 09:41:45.273308331 +0000 UTC m=+0.158481915 container init b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_yonath, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, RELEASE=main, ceph=True, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 04:41:45 localhost podman[282236]: 2026-02-01 09:41:45.28216268 +0000 UTC m=+0.167336274 container start b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_yonath, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Feb 1 04:41:45 localhost podman[282236]: 2026-02-01 09:41:45.283244423 +0000 UTC m=+0.168418007 container attach b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_yonath, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, release=1764794109, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git)
Feb 1 04:41:45 localhost happy_yonath[282251]: 167 167
Feb 1 04:41:45 localhost systemd[1]: libpod-b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a.scope: Deactivated successfully.
Feb 1 04:41:45 localhost podman[282236]: 2026-02-01 09:41:45.289284137 +0000 UTC m=+0.174457721 container died b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_yonath, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.41.4, architecture=x86_64, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 04:41:45 localhost podman[282257]: 2026-02-01 09:41:45.384513271 +0000 UTC m=+0.085583940 container remove b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_yonath, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, release=1764794109, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7)
Feb 1 04:41:45 localhost systemd[1]: libpod-conmon-b1a00f3e854ae8e4fc123e9a1356e02e6c11ab7df2669774600a4ee78464196a.scope: Deactivated successfully.
Feb 1 04:41:45 localhost systemd[1]: var-lib-containers-storage-overlay-070962fa4658817be4ce66c538bd8685085b8fdca5908ef75b3a432cab38fd75-merged.mount: Deactivated successfully.
Feb 1 04:41:46 localhost podman[282327]:
Feb 1 04:41:46 localhost podman[282327]: 2026-02-01 09:41:46.125813958 +0000 UTC m=+0.085754556 container create 2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True)
Feb 1 04:41:46 localhost systemd[1]: Started libpod-conmon-2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58.scope.
Feb 1 04:41:46 localhost systemd[1]: Started libcrun container.
Feb 1 04:41:46 localhost podman[282327]: 2026-02-01 09:41:46.092315806 +0000 UTC m=+0.052256374 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:41:46 localhost podman[282327]: 2026-02-01 09:41:46.205743216 +0000 UTC m=+0.165683804 container init 2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , distribution-scope=public)
Feb 1 04:41:46 localhost podman[282327]: 2026-02-01 09:41:46.214909025 +0000 UTC m=+0.174849563 container start 2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:41:46 localhost podman[282327]: 2026-02-01 09:41:46.215231055 +0000 UTC m=+0.175171613 container attach 2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, GIT_BRANCH=main, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph)
Feb 1 04:41:46 localhost recursing_meninsky[282342]: 167 167
Feb 1 04:41:46 localhost systemd[1]: libpod-2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58.scope: Deactivated successfully.
Feb 1 04:41:46 localhost podman[282327]: 2026-02-01 09:41:46.219796475 +0000 UTC m=+0.179737013 container died 2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, CEPH_POINT_RELEASE=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Feb 1 04:41:46 localhost podman[282347]: 2026-02-01 09:41:46.326688074 +0000 UTC m=+0.096681210 container remove 2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, version=7, io.buildah.version=1.41.4, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , ceph=True) Feb 1 04:41:46 localhost systemd[1]: libpod-conmon-2ed424d119b1ffc3592a480e94e6b270ad4cb1c2c02e6700771b58170aa66d58.scope: Deactivated successfully. Feb 1 04:41:46 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571d5747600 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Feb 1 04:41:46 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Feb 1 04:41:46 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Feb 1 04:41:46 localhost ceph-mon[280099]: mon.np0005604212@5(peon) e7 my rank is now 4 (was 5) Feb 1 04:41:46 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571df204000 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Feb 1 04:41:46 localhost ceph-mon[280099]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election Feb 1 04:41:46 localhost ceph-mon[280099]: paxos.4).electionLogic(28) init, last seen epoch 28 Feb 1 04:41:46 localhost ceph-mon[280099]: mon.np0005604212@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:46 localhost systemd[1]: var-lib-containers-storage-overlay-92f5dabadc983f2e5702cb03b615c7ce8eed777beb855f883d5262bdaea47f18-merged.mount: Deactivated successfully. Feb 1 04:41:46 localhost sshd[282364]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:41:46 localhost nova_compute[274651]: 2026-02-01 09:41:46.825 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:41:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:41:47 localhost systemd[1]: tmp-crun.KpZ48e.mount: Deactivated successfully. 
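[editor's note] The create/init/start/attach/died/remove sequence above for registry.redhat.io/rhceph/rhceph-7-rhel9:latest completes in well under a second and the container's only output is "167 167". This pattern is consistent with cephadm launching a throwaway container to read the uid/gid of the ceph user baked into the image (167 is the conventional ceph uid/gid on RHEL-family images). A minimal sketch of that kind of probe follows; the stat entrypoint and the probed path are assumptions, only the image name comes from the log.

    # Sketch only, not cephadm's actual code: reproduce the one-shot
    # "uid/gid probe" pattern seen in the records above. The entrypoint
    # ("stat") and the probed path (/var/lib/ceph) are assumptions.
    import subprocess

    IMAGE = "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"

    def probe_ceph_ids(image: str = IMAGE) -> tuple[int, int]:
        """Run a throwaway container that prints '<uid> <gid>' and exits."""
        out = subprocess.run(
            ["podman", "run", "--rm", "--entrypoint", "stat",
             image, "-c", "%u %g", "/var/lib/ceph"],
            capture_output=True, text=True, check=True,
        ).stdout.split()
        return int(out[0]), int(out[1])  # expected: (167, 167) for rhceph images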
Feb 1 04:41:47 localhost podman[282366]: 2026-02-01 09:41:47.757126346 +0000 UTC m=+0.110889292 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:41:47 localhost podman[282366]: 2026-02-01 09:41:47.792601978 +0000 UTC m=+0.146364904 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Feb 1 04:41:47 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:41:50 localhost ceph-mds[277455]: mds.beacon.mds.np0005604212.tkdkxt missed beacon ack from the monitors Feb 1 04:41:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:41:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:41:50 localhost systemd[1]: tmp-crun.eiR3GA.mount: Deactivated successfully. Feb 1 04:41:50 localhost systemd[1]: tmp-crun.wxh8er.mount: Deactivated successfully. Feb 1 04:41:50 localhost podman[282384]: 2026-02-01 09:41:50.699809956 +0000 UTC m=+0.060004981 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 1 04:41:50 localhost podman[282383]: 2026-02-01 09:41:50.7655086 +0000 UTC m=+0.124630152 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:41:50 localhost podman[282384]: 2026-02-01 09:41:50.77338772 +0000 UTC m=+0.133582765 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:41:50 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
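[editor's note] Each "Started /usr/bin/podman healthcheck run <id>" line above is a transient systemd unit firing the container's configured healthcheck (the 'healthcheck' block in config_data: a script mounted from /var/lib/openstack/healthchecks/... and run as /openstack/healthcheck). The "container health_status ... health_status=healthy" event and the exec_died record that follows are podman reporting the result, after which the unit deactivates. A minimal sketch mirroring what the unit does:

    # Sketch: "podman healthcheck run" executes the configured test inside
    # the container and reports health via its exit status; it also updates
    # the recorded state that the health_status=healthy fields above reflect.
    import subprocess

    def run_healthcheck(container: str) -> bool:
        rc = subprocess.run(["podman", "healthcheck", "run", container],
                            capture_output=True).returncode
        return rc == 0  # exit 0 -> check passed

    if __name__ == "__main__":
        print(run_healthcheck("ovn_metadata_agent"))  # container name from the log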
Feb 1 04:41:50 localhost podman[282383]: 2026-02-01 09:41:50.825410077 +0000 UTC m=+0.184531629 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:41:50 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:41:51 localhost ceph-mon[280099]: paxos.4).electionLogic(29) init, last seen epoch 29, mid-election, bumping Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604212@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604212@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604212@4(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:41:51 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)... Feb 1 04:41:51 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:41:51 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain Feb 1 04:41:51 localhost ceph-mon[280099]: Remove daemons mon.np0005604209 Feb 1 04:41:51 localhost ceph-mon[280099]: Reconfiguring mon.np0005604212 (monmap changed)... Feb 1 04:41:51 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:41:51 localhost ceph-mon[280099]: Safe to remove mon.np0005604209: new quorum should be ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213', 'np0005604212'] (from ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213', 'np0005604212']) Feb 1 04:41:51 localhost ceph-mon[280099]: Removing monitor np0005604209 from monmap... 
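[editor's note] The "Safe to remove mon.np0005604209: new quorum should be [...]" check above precedes the "mon rm" dispatch: a monmap of n monitors needs floor(n/2)+1 of them for quorum, and removing np0005604209 (which was already out of the quorum list) leaves 5 of 5 — comfortably a majority. The rank shuffle logged earlier ("my rank is now 4 (was 5)") and the back-to-back elections that follow are the expected fallout of shrinking the monmap. An illustrative check using the names from these records:

    # Illustrative only: the floor(n/2)+1 majority rule applied to the
    # monitor names in the log records above.
    def majority(n: int) -> int:
        return n // 2 + 1

    monmap = ["np0005604209", "np0005604211", "np0005604210",
              "np0005604215", "np0005604213", "np0005604212"]
    quorum = ["np0005604211", "np0005604210", "np0005604215",
              "np0005604213", "np0005604212"]

    remove = "np0005604209"
    new_map = [m for m in monmap if m != remove]
    new_quorum = [m for m in quorum if m != remove]
    # Safe when the surviving quorum still covers a majority of the new map:
    assert len(new_quorum) >= majority(len(new_map))  # 5 >= 3 -> safe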
Feb 1 04:41:51 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon rm", "name": "np0005604209"} : dispatch Feb 1 04:41:51 localhost ceph-mon[280099]: Removing daemon mon.np0005604209 from np0005604209.localdomain -- ports [] Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604211 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604212 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604215 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604210 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604213 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3) Feb 1 04:41:51 localhost ceph-mon[280099]: overall HEALTH_OK Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604211 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604210 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604213 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604215 calling monitor election Feb 1 04:41:51 localhost ceph-mon[280099]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4) Feb 1 04:41:51 localhost ceph-mon[280099]: overall HEALTH_OK Feb 1 04:41:51 localhost nova_compute[274651]: 2026-02-01 09:41:51.860 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:41:51 localhost nova_compute[274651]: 2026-02-01 09:41:51.862 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:41:51 localhost nova_compute[274651]: 2026-02-01 09:41:51.862 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:41:51 localhost nova_compute[274651]: 2026-02-01 09:41:51.863 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:51 localhost nova_compute[274651]: 2026-02-01 09:41:51.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:51 localhost nova_compute[274651]: 2026-02-01 09:41:51.866 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:41:52 localhost podman[282482]: Feb 1 04:41:52 localhost podman[282482]: 2026-02-01 09:41:52.205121072 +0000 UTC m=+0.071463350 container create 64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_lichterman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Feb 1 04:41:52 localhost systemd[1]: Started libpod-conmon-64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c.scope. Feb 1 04:41:52 localhost systemd[1]: Started libcrun container. Feb 1 04:41:52 localhost podman[282482]: 2026-02-01 09:41:52.274284621 +0000 UTC m=+0.140626899 container init 64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_lichterman, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_BRANCH=main) Feb 1 04:41:52 localhost podman[282482]: 2026-02-01 09:41:52.178535612 +0000 UTC m=+0.044877870 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:41:52 localhost podman[282482]: 2026-02-01 09:41:52.283267385 +0000 UTC m=+0.149609623 container start 64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_lichterman, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main) Feb 1 04:41:52 localhost podman[282482]: 2026-02-01 09:41:52.283471461 +0000 UTC m=+0.149813799 container attach 64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_lichterman, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7) Feb 1 04:41:52 localhost romantic_lichterman[282497]: 167 167 Feb 1 04:41:52 localhost systemd[1]: libpod-64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c.scope: Deactivated successfully. 
Feb 1 04:41:52 localhost podman[282482]: 2026-02-01 09:41:52.292078654 +0000 UTC m=+0.158420982 container died 64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_lichterman, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:41:52 localhost podman[282503]: 2026-02-01 09:41:52.360068877 +0000 UTC m=+0.063529328 container remove 64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_lichterman, version=7, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, release=1764794109, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True) Feb 1 04:41:52 localhost systemd[1]: libpod-conmon-64f9c3a00bb32eec30ed46b7ed28afac9311a928f490b442f346c4c9a513882c.scope: Deactivated successfully. 
Feb 1 04:41:52 localhost ceph-mon[280099]: mon.np0005604212@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054728 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:41:52 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain Feb 1 04:41:52 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:52 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:52 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:52 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:41:52 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:41:53 localhost systemd[1]: var-lib-containers-storage-overlay-2d4014481595e5ba5a3666b00c2f2691f3c12e2d554777c066f887272f8b21fc-merged.mount: Deactivated successfully. Feb 1 04:41:53 localhost ceph-mon[280099]: Removed label mon from host np0005604209.localdomain Feb 1 04:41:53 localhost ceph-mon[280099]: Reconfiguring crash.np0005604213 (monmap changed)... Feb 1 04:41:53 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain Feb 1 04:41:53 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:53 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:53 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 1 04:41:53 localhost podman[236886]: time="2026-02-01T09:41:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:41:53 localhost podman[236886]: @ - - [01/Feb/2026:09:41:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:41:54 localhost podman[236886]: @ - - [01/Feb/2026:09:41:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18767 "" "Go-http-client/1.1" Feb 1 04:41:54 localhost ceph-mon[280099]: Reconfiguring osd.0 (monmap changed)... Feb 1 04:41:54 localhost ceph-mon[280099]: Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:41:54 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:54 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:54 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:54 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 1 04:41:55 localhost ceph-mon[280099]: Removed label mgr from host np0005604209.localdomain Feb 1 04:41:55 localhost ceph-mon[280099]: Reconfiguring osd.3 (monmap changed)... 
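[editor's note] The podman[236886] records above ('@ - - "GET /v4.9.3/libpod/containers/json?..."') are the podman system service answering libpod REST calls over its unix socket; the "@" is the socket peer in the access-log format. A stdlib-only sketch of the same call follows; /run/podman/podman.sock is the usual rootful socket path and an assumption for this host, and the query string is trimmed from the one in the log.

    # Sketch of the call behind the access-log lines above, stdlib only.
    # /run/podman/podman.sock is an assumption; adjust for rootless setups.
    import http.client, json, socket

    class UDSConnection(http.client.HTTPConnection):
        def __init__(self, path: str):
            super().__init__("localhost")
            self._path = path
        def connect(self) -> None:
            self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            self.sock.connect(self._path)

    conn = UDSConnection("/run/podman/podman.sock")
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
    containers = json.loads(conn.getresponse().read())
    print(len(containers), "containers")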
Feb 1 04:41:55 localhost ceph-mon[280099]: Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:41:55 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:55 localhost ceph-mon[280099]: Removed label _admin from host np0005604209.localdomain Feb 1 04:41:55 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:55 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:55 localhost ceph-mon[280099]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... Feb 1 04:41:55 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:41:55 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:41:55 localhost ceph-mon[280099]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:41:56 localhost nova_compute[274651]: 2026-02-01 09:41:56.868 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:41:56 localhost nova_compute[274651]: 2026-02-01 09:41:56.870 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:41:56 localhost nova_compute[274651]: 2026-02-01 09:41:56.870 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:41:56 localhost nova_compute[274651]: 2026-02-01 09:41:56.870 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:56 localhost nova_compute[274651]: 2026-02-01 09:41:56.908 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:41:56 localhost nova_compute[274651]: 2026-02-01 09:41:56.909 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:57 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... 
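[editor's note] The nova_compute/ovsdbapp DEBUG lines above trace the OVS reconnect state machine on the tcp:127.0.0.1:6640 OVSDB session: after roughly 5 s with no traffic it sends an inactivity probe and enters IDLE; the next POLLIN (the probe reply) returns it to ACTIVE. If no reply arrived within the probe interval the connection would be dropped and re-established. A deliberately simplified model of that timer logic follows; the real implementation is ovs/reconnect.py (the path in the log), and the 5000 ms interval is the usual default, assumed here.

    # Simplified model of the inactivity-probe cycle logged above; the real
    # state machine lives in ovs/reconnect.py. 5000 ms is an assumed default.
    PROBE_INTERVAL_MS = 5000

    def tick(state: str, now_ms: int, last_rx_ms: int, probe_sent_ms):
        """Return (new_state, action) for one timer check."""
        if state == "ACTIVE" and now_ms - last_rx_ms >= PROBE_INTERVAL_MS:
            return "IDLE", "send inactivity probe"        # "entering IDLE"
        if state == "IDLE" and probe_sent_ms is not None \
                and now_ms - probe_sent_ms >= PROBE_INTERVAL_MS:
            return "BACKOFF", "drop connection, reconnect"
        return state, None

    def on_rx(state: str) -> str:
        # Any inbound data (the POLLIN above) revives the session.
        return "ACTIVE"                                   # "entering ACTIVE"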
Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:41:57 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:57 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:41:57 localhost ceph-mon[280099]: mon.np0005604212@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:41:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:41:57 localhost systemd[1]: tmp-crun.1hA8g6.mount: Deactivated successfully. Feb 1 04:41:57 localhost podman[282521]: 2026-02-01 09:41:57.736816347 +0000 UTC m=+0.090318337 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1769056855, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, vcs-type=git, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:41:57 localhost podman[282521]: 2026-02-01 09:41:57.776550198 +0000 UTC m=+0.130052218 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1769056855, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7) Feb 1 04:41:57 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:41:58 localhost ceph-mon[280099]: Reconfiguring mon.np0005604213 (monmap changed)... Feb 1 04:41:58 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain Feb 1 04:41:58 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:58 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:58 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:41:58 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:41:59 localhost ceph-mon[280099]: Reconfiguring crash.np0005604215 (monmap changed)... Feb 1 04:41:59 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain Feb 1 04:41:59 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:59 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:41:59 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 1 04:42:00 localhost ceph-mon[280099]: Reconfiguring osd.2 (monmap changed)... Feb 1 04:42:00 localhost ceph-mon[280099]: Reconfiguring daemon osd.2 on np0005604215.localdomain Feb 1 04:42:00 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:00 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:00 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 1 04:42:01 localhost ceph-mon[280099]: Reconfiguring osd.5 (monmap changed)... 
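[editor's note] The long run of paired records above — "Reconfiguring <daemon> (monmap changed)..." followed by "Reconfiguring daemon <daemon> on <host>" — is cephadm redeploying the configuration of every daemon (crash, osd, mds, mgr, mon) after the monmap shrank, each pair bracketed by mgr-issued "auth get" / "auth get-or-create" dispatches. A throwaway log-analysis sketch to tally that churn per host; "messages.log" is a placeholder for wherever this journal extract is saved.

    # Log-analysis sketch, nothing more: count cephadm's post-monmap-change
    # reconfigurations per host from a saved journal extract.
    import re
    from collections import Counter

    PAT = re.compile(r"Reconfiguring daemon (\S+) on (\S+)")

    per_host = Counter()
    with open("messages.log") as fh:       # placeholder file name
        for line in fh:
            m = PAT.search(line)
            if m:
                per_host[m.group(2)] += 1

    for host, n in per_host.most_common():
        print(f"{host}: {n} daemons reconfigured")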
Feb 1 04:42:01 localhost ceph-mon[280099]: Reconfiguring daemon osd.5 on np0005604215.localdomain Feb 1 04:42:01 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:01 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:01 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:42:01 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:42:01 localhost openstack_network_exporter[239441]: ERROR 09:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:42:01 localhost openstack_network_exporter[239441]: Feb 1 04:42:01 localhost openstack_network_exporter[239441]: ERROR 09:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:42:01 localhost openstack_network_exporter[239441]: Feb 1 04:42:01 localhost nova_compute[274651]: 2026-02-01 09:42:01.911 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:42:02 localhost ceph-mon[280099]: mon.np0005604212@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:42:02 localhost ceph-mon[280099]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)... Feb 1 04:42:02 localhost ceph-mon[280099]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain Feb 1 04:42:02 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:02 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:02 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:02 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
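[editor's note] The openstack_network_exporter ERRORs above ("call(dpif-netdev/pmd-perf-show): please specify an existing datapath") come from appctl calls that only apply to the userspace (dpif-netdev/DPDK) datapath; on a host running the kernel datapath there is no netdev datapath to query, so the PMD collectors fail each scrape while the rest of the exporter keeps working. A hedged sketch that treats this error as "no PMD datapath" rather than a hard failure:

    # Sketch of tolerating the PMD-only appctl call seen above: on
    # kernel-datapath hosts "dpif-netdev/pmd-rxq-show" has no datapath to
    # act on, which is exactly the error the exporter logs.
    import subprocess

    def pmd_rxq_show():
        res = subprocess.run(["ovs-appctl", "dpif-netdev/pmd-rxq-show"],
                             capture_output=True, text=True)
        if res.returncode != 0:
            # e.g. "please specify an existing datapath"
            return None
        return res.stdout

    stats = pmd_rxq_show()
    print("no PMD datapath on this host" if stats is None else stats)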
Feb 1 04:42:02 localhost podman[282539]: 2026-02-01 09:42:02.72547464 +0000 UTC m=+0.084287633 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:42:02 localhost podman[282539]: 2026-02-01 09:42:02.73796455 +0000 UTC m=+0.096777573 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:42:02 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:42:03 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)... Feb 1 04:42:03 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:42:03 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:03 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:03 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:42:04 localhost ceph-mon[280099]: Reconfiguring mon.np0005604215 (monmap changed)... Feb 1 04:42:04 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain Feb 1 04:42:04 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:04 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:04 localhost podman[282634]: Feb 1 04:42:04 localhost podman[282634]: 2026-02-01 09:42:04.782423009 +0000 UTC m=+0.057648150 container create 644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_sammet, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container) Feb 1 04:42:04 localhost systemd[1]: Started libpod-conmon-644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830.scope. Feb 1 04:42:04 localhost systemd[1]: Started libcrun container. 
Feb 1 04:42:04 localhost podman[282634]: 2026-02-01 09:42:04.759484599 +0000 UTC m=+0.034709730 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:04 localhost podman[282634]: 2026-02-01 09:42:04.862746688 +0000 UTC m=+0.137971889 container init 644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_sammet, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Feb 1 04:42:04 localhost systemd[1]: tmp-crun.Q7mFfv.mount: Deactivated successfully.
Feb 1 04:42:04 localhost podman[282634]: 2026-02-01 09:42:04.877610162 +0000 UTC m=+0.152835303 container start 644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_sammet, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, GIT_CLEAN=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z)
Feb 1 04:42:04 localhost podman[282634]: 2026-02-01 09:42:04.877973262 +0000 UTC m=+0.153198443 container attach 644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_sammet, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, vendor=Red Hat, Inc.)
Feb 1 04:42:04 localhost romantic_sammet[282649]: 167 167
Feb 1 04:42:04 localhost systemd[1]: libpod-644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830.scope: Deactivated successfully.
Feb 1 04:42:04 localhost podman[282634]: 2026-02-01 09:42:04.880252762 +0000 UTC m=+0.155477933 container died 644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_sammet, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-12-08T17:28:53Z, ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=)
Feb 1 04:42:04 localhost podman[282654]: 2026-02-01 09:42:04.986498122 +0000 UTC m=+0.093300967 container remove 644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_sammet, GIT_CLEAN=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, name=rhceph, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=)
Feb 1 04:42:04 localhost systemd[1]: libpod-conmon-644b0cf5666c213897ff647160691f643a75014826b9108f4fa88a3205d56830.scope: Deactivated successfully.
Feb 1 04:42:05 localhost podman[282676]:
Feb 1 04:42:05 localhost podman[282676]: 2026-02-01 09:42:05.250048189 +0000 UTC m=+0.083546469 container create 18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_williams, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container)
Feb 1 04:42:05 localhost systemd[1]: Started libpod-conmon-18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246.scope.
Feb 1 04:42:05 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a6648b1b1af399b209c1bbda119e9a3c45710b5f99323560813e5a3d025e6b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 1 04:42:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a6648b1b1af399b209c1bbda119e9a3c45710b5f99323560813e5a3d025e6b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 1 04:42:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a6648b1b1af399b209c1bbda119e9a3c45710b5f99323560813e5a3d025e6b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 1 04:42:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02a6648b1b1af399b209c1bbda119e9a3c45710b5f99323560813e5a3d025e6b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 1 04:42:05 localhost podman[282676]: 2026-02-01 09:42:05.217760774 +0000 UTC m=+0.051259054 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:05 localhost podman[282676]: 2026-02-01 09:42:05.319066874 +0000 UTC m=+0.152565094 container init 18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_williams, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1764794109, version=7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z)
Feb 1 04:42:05 localhost podman[282676]: 2026-02-01 09:42:05.330130191 +0000 UTC m=+0.163628441 container start 18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_williams, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, version=7, io.buildah.version=1.41.4, GIT_BRANCH=main, ceph=True, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:42:05 localhost podman[282676]: 2026-02-01 09:42:05.33043044 +0000 UTC m=+0.163928680 container attach 18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_williams, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, architecture=x86_64, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:42:05 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:05 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:05 localhost systemd[1]: var-lib-containers-storage-overlay-1ca01dca20dfa213a740177386c81ab27e029941e9a4d379de7227ecdbcc91aa-merged.mount: Deactivated successfully.
Feb 1 04:42:06 localhost sweet_williams[282691]: [
Feb 1 04:42:06 localhost sweet_williams[282691]: {
Feb 1 04:42:06 localhost sweet_williams[282691]: "available": false,
Feb 1 04:42:06 localhost sweet_williams[282691]: "ceph_device": false,
Feb 1 04:42:06 localhost sweet_williams[282691]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 1 04:42:06 localhost sweet_williams[282691]: "lsm_data": {},
Feb 1 04:42:06 localhost sweet_williams[282691]: "lvs": [],
Feb 1 04:42:06 localhost sweet_williams[282691]: "path": "/dev/sr0",
Feb 1 04:42:06 localhost sweet_williams[282691]: "rejected_reasons": [
Feb 1 04:42:06 localhost sweet_williams[282691]: "Insufficient space (<5GB)",
Feb 1 04:42:06 localhost sweet_williams[282691]: "Has a FileSystem"
Feb 1 04:42:06 localhost sweet_williams[282691]: ],
Feb 1 04:42:06 localhost sweet_williams[282691]: "sys_api": {
Feb 1 04:42:06 localhost sweet_williams[282691]: "actuators": null,
Feb 1 04:42:06 localhost sweet_williams[282691]: "device_nodes": "sr0",
Feb 1 04:42:06 localhost sweet_williams[282691]: "human_readable_size": "482.00 KB",
Feb 1 04:42:06 localhost sweet_williams[282691]: "id_bus": "ata",
Feb 1 04:42:06 localhost sweet_williams[282691]: "model": "QEMU DVD-ROM",
Feb 1 04:42:06 localhost sweet_williams[282691]: "nr_requests": "2",
Feb 1 04:42:06 localhost sweet_williams[282691]: "partitions": {},
Feb 1 04:42:06 localhost sweet_williams[282691]: "path": "/dev/sr0",
Feb 1 04:42:06 localhost sweet_williams[282691]: "removable": "1",
Feb 1 04:42:06 localhost sweet_williams[282691]: "rev": "2.5+",
Feb 1 04:42:06 localhost sweet_williams[282691]: "ro": "0",
Feb 1 04:42:06 localhost sweet_williams[282691]: "rotational": "1",
Feb 1 04:42:06 localhost sweet_williams[282691]: "sas_address": "",
Feb 1 04:42:06 localhost sweet_williams[282691]: "sas_device_handle": "",
Feb 1 04:42:06 localhost sweet_williams[282691]: "scheduler_mode": "mq-deadline",
Feb 1 04:42:06 localhost sweet_williams[282691]: "sectors": 0,
Feb 1 04:42:06 localhost sweet_williams[282691]: "sectorsize": "2048",
Feb 1 04:42:06 localhost sweet_williams[282691]: "size": 493568.0,
Feb 1 04:42:06 localhost sweet_williams[282691]: "support_discard": "0",
Feb 1 04:42:06 localhost sweet_williams[282691]: "type": "disk",
Feb 1 04:42:06 localhost sweet_williams[282691]: "vendor": "QEMU"
Feb 1 04:42:06 localhost sweet_williams[282691]: }
Feb 1 04:42:06 localhost sweet_williams[282691]: }
Feb 1 04:42:06 localhost sweet_williams[282691]: ]
Feb 1 04:42:06 localhost systemd[1]: libpod-18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246.scope: Deactivated successfully.
Feb 1 04:42:06 localhost systemd[1]: libpod-18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246.scope: Consumed 1.043s CPU time.
Feb 1 04:42:06 localhost podman[282676]: 2026-02-01 09:42:06.354212122 +0000 UTC m=+1.187710402 container died 18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_williams, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z)
Feb 1 04:42:06 localhost systemd[1]: var-lib-containers-storage-overlay-02a6648b1b1af399b209c1bbda119e9a3c45710b5f99323560813e5a3d025e6b-merged.mount: Deactivated successfully.
Feb 1 04:42:06 localhost podman[284764]: 2026-02-01 09:42:06.433993764 +0000 UTC m=+0.071957244 container remove 18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_williams, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:42:06 localhost systemd[1]: libpod-conmon-18348672ea619f2e88fb75756b049ec81f7807f4f967ecc868fda917a775c246.scope: Deactivated successfully.
Feb 1 04:42:06 localhost nova_compute[274651]: 2026-02-01 09:42:06.913 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:06 localhost nova_compute[274651]: 2026-02-01 09:42:06.919 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:07 localhost ceph-mon[280099]: mon.np0005604212@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:42:08 localhost ceph-mon[280099]: Removing np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Removing np0005604209.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:08 localhost ceph-mon[280099]: Added label _no_schedule to host np0005604209.localdomain
Feb 1 04:42:08 localhost ceph-mon[280099]: Removing np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:08 localhost ceph-mon[280099]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604209.localdomain
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:08 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:09 localhost ceph-mon[280099]: Removing daemon crash.np0005604209 from np0005604209.localdomain -- ports []
Feb 1 04:42:09 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:09 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"} : dispatch
Feb 1 04:42:09 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"} : dispatch
Feb 1 04:42:09 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"}]': finished
Feb 1 04:42:10 localhost ceph-mon[280099]: Removed host np0005604209.localdomain
Feb 1 04:42:10 localhost ceph-mon[280099]: Removing key for client.crash.np0005604209.localdomain
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"} : dispatch
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"} : dispatch
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"}]': finished
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:10 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:11 localhost sshd[285135]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:42:11 localhost systemd[1]: Created slice User Slice of UID 1003.
Feb 1 04:42:11 localhost systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 1 04:42:11 localhost systemd-logind[759]: New session 65 of user tripleo-admin.
Feb 1 04:42:11 localhost systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 1 04:42:11 localhost systemd[1]: Starting User Manager for UID 1003...
Feb 1 04:42:11 localhost systemd[285139]: Queued start job for default target Main User Target.
Feb 1 04:42:11 localhost systemd[285139]: Created slice User Application Slice.
Feb 1 04:42:11 localhost systemd[285139]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 1 04:42:11 localhost systemd[285139]: Started Daily Cleanup of User's Temporary Directories.
Feb 1 04:42:11 localhost systemd[285139]: Reached target Paths.
Feb 1 04:42:11 localhost systemd[285139]: Reached target Timers.
Feb 1 04:42:11 localhost systemd[285139]: Starting D-Bus User Message Bus Socket...
Feb 1 04:42:11 localhost systemd[285139]: Starting Create User's Volatile Files and Directories...
Feb 1 04:42:11 localhost systemd[285139]: Listening on D-Bus User Message Bus Socket.
Feb 1 04:42:11 localhost systemd[285139]: Reached target Sockets.
Feb 1 04:42:11 localhost systemd[285139]: Finished Create User's Volatile Files and Directories.
Feb 1 04:42:11 localhost systemd[285139]: Reached target Basic System.
Feb 1 04:42:11 localhost systemd[285139]: Reached target Main User Target.
Feb 1 04:42:11 localhost systemd[285139]: Startup finished in 156ms.
Feb 1 04:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:42:11 localhost systemd[1]: Started User Manager for UID 1003.
Feb 1 04:42:11 localhost systemd[1]: Started Session 65 of User tripleo-admin.
Feb 1 04:42:11 localhost podman[285154]: 2026-02-01 09:42:11.877604432 +0000 UTC m=+0.092211892 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:42:11 localhost podman[285154]: 2026-02-01 09:42:11.912561439 +0000 UTC m=+0.127168909 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:42:11 localhost nova_compute[274651]: 2026-02-01 09:42:11.916 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:11 localhost nova_compute[274651]: 2026-02-01 09:42:11.920 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:11 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:42:12 localhost ceph-mon[280099]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 1 04:42:12 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 1 04:42:12 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:12 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:12 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:12 localhost ceph-mon[280099]: Reconfiguring mon.np0005604210 (monmap changed)...
Feb 1 04:42:12 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:12 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain
Feb 1 04:42:12 localhost ceph-mon[280099]: mon.np0005604212@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:42:12 localhost python3[285304]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 1 04:42:13 localhost python3[285450]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:13 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:13 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:13 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:14 localhost python3[285595]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 04:42:14 localhost ceph-mon[280099]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 1 04:42:14 localhost ceph-mon[280099]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 1 04:42:14 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:14 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:14 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:14 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:15 localhost ceph-mon[280099]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:42:15 localhost ceph-mon[280099]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:42:15 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:15 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:15 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:15 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:16 localhost ceph-mon[280099]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:42:16 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:42:16 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:16 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:16 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:16 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:16 localhost podman[285668]:
Feb 1 04:42:16 localhost podman[285668]: 2026-02-01 09:42:16.87293601 +0000 UTC m=+0.076741841 container create 779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_euclid, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, build-date=2025-12-08T17:28:53Z, release=1764794109, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.41.4, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, distribution-scope=public, maintainer=Guillaume Abrioux )
Feb 1 04:42:16 localhost systemd[1]: Started libpod-conmon-779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698.scope.
Feb 1 04:42:16 localhost nova_compute[274651]: 2026-02-01 09:42:16.917 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:16 localhost nova_compute[274651]: 2026-02-01 09:42:16.921 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:16 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:16 localhost podman[285668]: 2026-02-01 09:42:16.842506362 +0000 UTC m=+0.046312263 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:16 localhost podman[285668]: 2026-02-01 09:42:16.949562197 +0000 UTC m=+0.153368028 container init 779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_euclid, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, architecture=x86_64, build-date=2025-12-08T17:28:53Z, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-type=git)
Feb 1 04:42:16 localhost podman[285668]: 2026-02-01 09:42:16.962143311 +0000 UTC m=+0.165949152 container start 779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_euclid, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1764794109, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, GIT_BRANCH=main)
Feb 1 04:42:16 localhost podman[285668]: 2026-02-01 09:42:16.962463101 +0000 UTC m=+0.166268982 container attach 779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_euclid, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=7, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, release=1764794109, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 04:42:16 localhost nice_euclid[285683]: 167 167
Feb 1 04:42:16 localhost systemd[1]: libpod-779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698.scope: Deactivated successfully.
Feb 1 04:42:16 localhost podman[285668]: 2026-02-01 09:42:16.965648748 +0000 UTC m=+0.169454579 container died 779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_euclid, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux )
Feb 1 04:42:17 localhost podman[285688]: 2026-02-01 09:42:17.063042878 +0000 UTC m=+0.086879320 container remove 779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nice_euclid, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main)
Feb 1 04:42:17 localhost systemd[1]: libpod-conmon-779dc9dfeb33cabcf6dc6f291d1865f493cc82fd23fded2ae2e98630f0799698.scope: Deactivated successfully.
Feb 1 04:42:17 localhost ceph-mon[280099]: mon.np0005604212@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:42:17 localhost ceph-mon[280099]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:42:17 localhost ceph-mon[280099]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:42:17 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:17 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:17 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:42:17 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:17 localhost podman[285756]:
Feb 1 04:42:17 localhost podman[285756]: 2026-02-01 09:42:17.781152167 +0000 UTC m=+0.055577296 container create 5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_haslett, ceph=True, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:42:17 localhost systemd[1]: Started libpod-conmon-5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a.scope.
Feb 1 04:42:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:42:17 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:17 localhost podman[285756]: 2026-02-01 09:42:17.848023797 +0000 UTC m=+0.122448966 container init 5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_haslett, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, GIT_BRANCH=main, ceph=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container)
Feb 1 04:42:17 localhost podman[285756]: 2026-02-01 09:42:17.75767041 +0000 UTC m=+0.032095499 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:17 localhost podman[285756]: 2026-02-01 09:42:17.860711083 +0000 UTC m=+0.135136152 container start 5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_haslett, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , name=rhceph, architecture=x86_64, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Feb 1 04:42:17 localhost podman[285756]: 2026-02-01 09:42:17.860912149 +0000 UTC m=+0.135337308 container attach 5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_haslett, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, distribution-scope=public, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Feb 1 04:42:17 localhost lucid_haslett[285771]: 167 167
Feb 1 04:42:17 localhost systemd[1]: libpod-5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a.scope: Deactivated successfully.
Feb 1 04:42:17 localhost podman[285756]: 2026-02-01 09:42:17.863812437 +0000 UTC m=+0.138237536 container died 5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_haslett, RELEASE=main, ceph=True, architecture=x86_64, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1764794109, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, distribution-scope=public, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7)
Feb 1 04:42:17 localhost systemd[1]: tmp-crun.X5uuEd.mount: Deactivated successfully.
Feb 1 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-c0480a641433265167db94a12581082d9d87138c4a56d631d8dfe4d7eecf3572-merged.mount: Deactivated successfully.
Feb 1 04:42:17 localhost podman[285773]: 2026-02-01 09:42:17.920518917 +0000 UTC m=+0.083835308 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:42:17 localhost podman[285773]: 2026-02-01 09:42:17.958535536 +0000 UTC m=+0.121851976 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:42:17 localhost systemd[1]: var-lib-containers-storage-overlay-cb1b08ff35ecfa72100e49afb9cbfaeff988025ec537c68bf97483a8ee7132bf-merged.mount: Deactivated successfully.
Feb 1 04:42:17 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:42:18 localhost podman[285787]: 2026-02-01 09:42:18.026516349 +0000 UTC m=+0.151871782 container remove 5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_haslett, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, version=7, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:42:18 localhost systemd[1]: libpod-conmon-5b0686c2ed70ea2ca679864a6eafd06f826a4a6a0515e77076602396c3be879a.scope: Deactivated successfully.
Feb 1 04:42:18 localhost ceph-mon[280099]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:42:18 localhost ceph-mon[280099]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:42:18 localhost ceph-mon[280099]: Saving service mon spec with placement label:mon
Feb 1 04:42:18 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:18 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:18 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:42:18 localhost podman[285871]:
Feb 1 04:42:18 localhost podman[285871]: 2026-02-01 09:42:18.874639084 +0000 UTC m=+0.070969565 container create bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_shtern, version=7, io.buildah.version=1.41.4, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, vcs-type=git, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, release=1764794109, name=rhceph, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 04:42:18 localhost systemd[1]: Started libpod-conmon-bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679.scope.
Feb 1 04:42:18 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:18 localhost podman[285871]: 2026-02-01 09:42:18.843099702 +0000 UTC m=+0.039430223 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:18 localhost podman[285871]: 2026-02-01 09:42:18.94960368 +0000 UTC m=+0.145934161 container init bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_shtern, vcs-type=git, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.component=rhceph-container, release=1764794109, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_BRANCH=main)
Feb 1 04:42:18 localhost podman[285871]: 2026-02-01 09:42:18.957926334 +0000 UTC m=+0.154256815 container start bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_shtern, CEPH_POINT_RELEASE=, release=1764794109, RELEASE=main, vcs-type=git, name=rhceph, version=7, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux )
Feb 1 04:42:18 localhost podman[285871]: 2026-02-01 09:42:18.958181422 +0000 UTC m=+0.154511943 container attach bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_shtern, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 1 04:42:18 localhost intelligent_shtern[285888]: 167 167
Feb 1 04:42:18 localhost systemd[1]: libpod-bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679.scope: Deactivated successfully.
Feb 1 04:42:18 localhost podman[285871]: 2026-02-01 09:42:18.963228226 +0000 UTC m=+0.159558737 container died bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_shtern, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , ceph=True, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main)
Feb 1 04:42:19 localhost podman[285893]: 2026-02-01 09:42:19.06667351 +0000 UTC m=+0.089074687 container remove bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_shtern, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:42:19 localhost systemd[1]: libpod-conmon-bc78d5ddfcae1a02f8a5db7f56a123afa5d57bc789d8ad917f872880163fe679.scope: Deactivated successfully.
Feb 1 04:42:19 localhost ceph-mon[280099]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:42:19 localhost ceph-mon[280099]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:42:19 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:19 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:19 localhost ceph-mon[280099]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:19 localhost ceph-mon[280099]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:19 localhost systemd[1]: var-lib-containers-storage-overlay-855719e372703bb4b10e3ac6feb5bd6f82ccd99b3be504a5e5fa2ea8c370e427-merged.mount: Deactivated successfully.
Feb 1 04:42:19 localhost podman[285970]:
Feb 1 04:42:19 localhost podman[285970]: 2026-02-01 09:42:19.911416429 +0000 UTC m=+0.071686942 container create 96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wright, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, build-date=2025-12-08T17:28:53Z, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:42:19 localhost systemd[1]: Started libpod-conmon-96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0.scope.
Feb 1 04:42:19 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:19 localhost podman[285970]: 2026-02-01 09:42:19.975857398 +0000 UTC m=+0.136127911 container init 96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wright, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1764794109, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git)
Feb 1 04:42:19 localhost podman[285970]: 2026-02-01 09:42:19.884325028 +0000 UTC m=+0.044595581 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:19 localhost podman[285970]: 2026-02-01 09:42:19.992656064 +0000 UTC m=+0.152926587 container start 96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wright, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7)
Feb 1 04:42:19 localhost podman[285970]: 2026-02-01 09:42:19.993012595 +0000 UTC m=+0.153283108 container attach 96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wright, ceph=True, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, vendor=Red Hat, Inc., GIT_CLEAN=True)
Feb 1 04:42:19 localhost crazy_wright[285985]: 167 167
Feb 1 04:42:19 localhost systemd[1]: libpod-96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0.scope: Deactivated successfully.
Feb 1 04:42:19 localhost podman[285970]: 2026-02-01 09:42:19.996211934 +0000 UTC m=+0.156482487 container died 96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wright, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, release=1764794109, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 04:42:20 localhost podman[285990]: 2026-02-01 09:42:20.102092755 +0000 UTC m=+0.092466451 container remove 96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_wright, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, maintainer=Guillaume Abrioux , release=1764794109)
Feb 1 04:42:20 localhost systemd[1]: libpod-conmon-96dbe2cf4bb5229617329f2dec0b4a3bcc87a9630748aa2cada001ecb0331fe0.scope: Deactivated successfully.
Feb 1 04:42:20 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571df2fe000 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Feb 1 04:42:20 localhost ceph-mon[280099]: mon.np0005604212@4(peon) e8 removed from monmap, suicide.
Feb 1 04:42:20 localhost podman[286014]: 2026-02-01 09:42:20.317582724 +0000 UTC m=+0.061228882 container died 8e57e0bb971232fdfcc63e8a0793c2c5695b574e6782300f0926374dc6f8460f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, GIT_BRANCH=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph)
Feb 1 04:42:20 localhost podman[286014]: 2026-02-01 09:42:20.352329212 +0000 UTC m=+0.095975340 container remove 8e57e0bb971232fdfcc63e8a0793c2c5695b574e6782300f0926374dc6f8460f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, description=Red Hat Ceph Storage 7, release=1764794109, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, version=7, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main)
Feb 1 04:42:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:42:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:42:20 localhost systemd[1]: tmp-crun.JlWhAH.mount: Deactivated successfully.
Feb 1 04:42:20 localhost systemd[1]: var-lib-containers-storage-overlay-541c2157ec9b086c448c4437e411de7e20d590726a78db66c8874be5e03bd5ea-merged.mount: Deactivated successfully.
Feb 1 04:42:20 localhost systemd[1]: var-lib-containers-storage-overlay-184609069e024ddd5f317f7b50559e6dcd8146dbca3c28ede35f3bafe047c425-merged.mount: Deactivated successfully.
Feb 1 04:42:20 localhost systemd[1]: tmp-crun.8w6KjW.mount: Deactivated successfully.
Feb 1 04:42:21 localhost podman[286125]: 2026-02-01 09:42:21.001211121 +0000 UTC m=+0.097241877 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:42:21 localhost podman[286124]: 2026-02-01 09:42:20.972085596 +0000 UTC m=+0.074824239 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:42:21 localhost podman[286124]: 2026-02-01 09:42:21.069244241 +0000 UTC m=+0.171982934 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:42:21 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:42:21 localhost podman[286125]: 2026-02-01 09:42:21.091261046 +0000 UTC m=+0.187291842 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 1 04:42:21 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:42:21 localhost systemd[1]: ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e@mon.np0005604212.service: Deactivated successfully.
Feb 1 04:42:21 localhost systemd[1]: Stopped Ceph mon.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 1 04:42:21 localhost systemd[1]: ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e@mon.np0005604212.service: Consumed 4.059s CPU time.
Feb 1 04:42:21 localhost systemd[1]: Reloading.
Feb 1 04:42:21 localhost systemd-rc-local-generator[286227]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 1 04:42:21 localhost systemd-sysv-generator[286232]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 1 04:42:21 localhost nova_compute[274651]: 2026-02-01 09:42:21.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:23 localhost podman[236886]: time="2026-02-01T09:42:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:42:23 localhost podman[236886]: @ - - [01/Feb/2026:09:42:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154214 "" "Go-http-client/1.1"
Feb 1 04:42:24 localhost podman[236886]: @ - - [01/Feb/2026:09:42:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18296 "" "Go-http-client/1.1"
Feb 1 04:42:25 localhost podman[286291]:
Feb 1 04:42:25 localhost podman[286291]: 2026-02-01 09:42:25.981561563 +0000 UTC m=+0.083527006 container create 714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_brahmagupta, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, release=1764794109, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4)
Feb 1 04:42:26 localhost systemd[1]: Started libpod-conmon-714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a.scope.
Feb 1 04:42:26 localhost podman[286291]: 2026-02-01 09:42:25.944895357 +0000 UTC m=+0.046860820 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:26 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:26 localhost podman[286291]: 2026-02-01 09:42:26.065171701 +0000 UTC m=+0.167137154 container init 714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_brahmagupta, release=1764794109, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7)
Feb 1 04:42:26 localhost podman[286291]: 2026-02-01 09:42:26.07520145 +0000 UTC m=+0.177166893 container start 714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_brahmagupta, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, release=1764794109, name=rhceph, GIT_CLEAN=True, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:42:26 localhost podman[286291]: 2026-02-01 09:42:26.078070548 +0000 UTC m=+0.180036021 container attach 714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_brahmagupta, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1764794109, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, RELEASE=main, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7)
Feb 1 04:42:26 localhost cool_brahmagupta[286306]: 167 167
Feb 1 04:42:26 localhost systemd[1]: libpod-714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a.scope: Deactivated successfully.
Feb 1 04:42:26 localhost podman[286291]: 2026-02-01 09:42:26.081167623 +0000 UTC m=+0.183133096 container died 714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_brahmagupta, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-type=git, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, ceph=True, io.buildah.version=1.41.4, distribution-scope=public, GIT_CLEAN=True, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:42:26 localhost podman[286311]: 2026-02-01 09:42:26.196255738 +0000 UTC m=+0.102148159 container remove 714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_brahmagupta, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1764794109, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:42:26 localhost systemd[1]: libpod-conmon-714df0825916eb66adb7283aff2c1a2b5f4601e0f41754f4f99e2eb4fb32ef7a.scope: Deactivated successfully.
Feb 1 04:42:26 localhost nova_compute[274651]: 2026-02-01 09:42:26.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:26 localhost nova_compute[274651]: 2026-02-01 09:42:26.928 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:26 localhost systemd[1]: var-lib-containers-storage-overlay-b29f3b661a022e24eabf6aef6aa623ef6cc7f0165ff131f77dcc240f010dc50d-merged.mount: Deactivated successfully.
Feb 1 04:42:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:42:28 localhost podman[286328]: 2026-02-01 09:42:28.73821701 +0000 UTC m=+0.096454363 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, container_name=openstack_network_exporter, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container)
Feb 1 04:42:28 localhost podman[286328]: 2026-02-01 09:42:28.753401147 +0000 UTC m=+0.111638530 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Feb 1 04:42:28 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:42:31 localhost openstack_network_exporter[239441]: ERROR 09:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:42:31 localhost openstack_network_exporter[239441]:
Feb 1 04:42:31 localhost openstack_network_exporter[239441]: ERROR 09:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:42:31 localhost openstack_network_exporter[239441]:
Feb 1 04:42:31 localhost nova_compute[274651]: 2026-02-01 09:42:31.925 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:31 localhost nova_compute[274651]: 2026-02-01 09:42:31.930 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:42:33 localhost podman[286363]: 2026-02-01 09:42:33.71841953 +0000 UTC m=+0.078181042 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:42:34 localhost podman[286447]:
Feb 1 04:42:34 localhost podman[286447]: 2026-02-01 09:42:34.25239833 +0000 UTC m=+0.053789133 container create 742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_ptolemy, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, name=rhceph, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, RELEASE=main, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Guillaume Abrioux )
Feb 1 04:42:34 localhost systemd[1]: Started libpod-conmon-742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef.scope.
Feb 1 04:42:34 localhost systemd[1]: Started libcrun container.
Feb 1 04:42:34 localhost podman[286447]: 2026-02-01 09:42:34.318946854 +0000 UTC m=+0.120337667 container init 742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_ptolemy, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , release=1764794109, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, RELEASE=main)
Feb 1 04:42:34 localhost podman[286447]: 2026-02-01 09:42:34.226848565 +0000 UTC m=+0.028239408 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:42:34 localhost podman[286447]: 2026-02-01 09:42:34.329317462 +0000 UTC m=+0.130708275 container start 742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_ptolemy, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, build-date=2025-12-08T17:28:53Z, release=1764794109, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, distribution-scope=public)
Feb 1 04:42:34 localhost podman[286447]: 2026-02-01 09:42:34.329543919 +0000 UTC m=+0.130934742 container attach 742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_ptolemy, io.buildah.version=1.41.4, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64)
Feb 1 04:42:34 localhost affectionate_ptolemy[286462]: 167 167
Feb 1 04:42:34 localhost systemd[1]: libpod-742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef.scope: Deactivated successfully.
Feb 1 04:42:34 localhost podman[286447]: 2026-02-01 09:42:34.333541102 +0000 UTC m=+0.134931955 container died 742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_ptolemy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Feb 1 04:42:34 localhost podman[286467]: 2026-02-01 09:42:34.439033872 +0000 UTC m=+0.091856632 container remove 742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_ptolemy, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, release=1764794109, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:42:34 localhost systemd[1]: libpod-conmon-742c35288de317f6f7ad64952722e9acf1a80443d5a163d316cf05085bc876ef.scope: Deactivated successfully. 
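Editor's note: the podman records above trace one complete lifecycle of a throwaway cephadm helper container: create, init, start, attach, died and remove all land within the same second, and the container's only output is "167 167", the ceph uid:gid pair. A minimal sketch for recovering those lifecycle events from a journal capture shaped exactly like the lines above; the filename is hypothetical and the regex is keyed to this log's format, not to any podman API:

    import re
    from collections import defaultdict

    # Matches records such as:
    #   podman[286447]: 2026-02-01 09:42:34.25239833 +0000 UTC m=+0.053789133
    #   container create 742c35288de3... (image=..., name=affectionate_ptolemy, ...)
    EVENT = re.compile(
        r"podman\[\d+\]: (?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) \S+ UTC "
        r"m=\+[\d.]+ (?:container|image) (?P<event>\w+) ?(?P<cid>[0-9a-f]{64})?"
    )

    def lifecycles(path):
        """Map container id -> [(timestamp, event), ...] from a syslog dump."""
        events = defaultdict(list)
        with open(path) as fh:
            for line in fh:
                for m in EVENT.finditer(line):  # several records share one line here
                    if m.group("cid"):
                        events[m.group("cid")].append((m.group("ts"), m.group("event")))
        return events

    for cid, evs in lifecycles("messages.txt").items():  # hypothetical capture file
        print(cid[:12], "->", " ".join(e for _, e in evs))

Run over this section it would print, for the container above, "742c35288de3 -> create init start attach died remove"; the "image pull" records carry a registry reference rather than a 64-hex id, so they are skipped.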
Feb 1 04:42:34 localhost podman[286481]: Feb 1 04:42:34 localhost podman[286481]: 2026-02-01 09:42:34.535350611 +0000 UTC m=+0.057756215 container create 9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_banach, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, build-date=2025-12-08T17:28:53Z, release=1764794109, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:42:34 localhost systemd[1]: Started libpod-conmon-9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c.scope. Feb 1 04:42:34 localhost systemd[1]: Started libcrun container. Feb 1 04:42:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc7c0794fda7fc8eddd9a83d3f82ae845cfd207acd7dd3cc70606d83ecd9e546/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc7c0794fda7fc8eddd9a83d3f82ae845cfd207acd7dd3cc70606d83ecd9e546/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc7c0794fda7fc8eddd9a83d3f82ae845cfd207acd7dd3cc70606d83ecd9e546/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc7c0794fda7fc8eddd9a83d3f82ae845cfd207acd7dd3cc70606d83ecd9e546/merged/var/lib/ceph/mon/ceph-np0005604212 supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:34 localhost podman[286481]: 2026-02-01 09:42:34.595455516 +0000 UTC m=+0.117861120 container init 9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_banach, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vendor=Red Hat, Inc., release=1764794109, 
CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph) Feb 1 04:42:34 localhost podman[286481]: 2026-02-01 09:42:34.61316468 +0000 UTC m=+0.135570304 container start 9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_banach, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=) Feb 1 04:42:34 localhost podman[286481]: 2026-02-01 09:42:34.613460619 +0000 UTC m=+0.135866253 container attach 9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_banach, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-12-08T17:28:53Z, release=1764794109, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 04:42:34 localhost podman[286481]: 2026-02-01 09:42:34.51383376 +0000 UTC m=+0.036239384 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:34 localhost systemd[1]: libpod-9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c.scope: Deactivated successfully. 
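Editor's note: the kernel's "supports timestamps until 2038 (0x7fffffff)" messages mean the xfs filesystem backing these overlay bind mounts was created without the bigtime feature, so inode timestamps are capped at the classic signed 32-bit limit; on a live system "xfs_info <mountpoint>" should report bigtime=0 or 1. The quoted cap is simply 0x7fffffff seconds after the Unix epoch, which a two-line check confirms:

    from datetime import datetime, timezone

    # 0x7fffffff is the cap named in the kernel messages above for
    # non-bigtime xfs filesystems: the largest signed 32-bit second count.
    limit = datetime.fromtimestamp(0x7FFFFFFF, tz=timezone.utc)
    print(hex(2**31 - 1))     # 0x7fffffff, matching the log
    print(limit.isoformat())  # 2038-01-19T03:14:07+00:00

These messages are informational; nothing in the remount itself failed.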
Feb 1 04:42:34 localhost podman[286481]: 2026-02-01 09:42:34.701696979 +0000 UTC m=+0.224102623 container died 9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_banach, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , release=1764794109, vendor=Red Hat, Inc., RELEASE=main) Feb 1 04:42:34 localhost systemd[1]: var-lib-containers-storage-overlay-6ddff8101f8f228a5f0079f0a210ab0ccaf3f196f0ac9c126fa99d8dc671591b-merged.mount: Deactivated successfully. Feb 1 04:42:34 localhost systemd[1]: var-lib-containers-storage-overlay-dc7c0794fda7fc8eddd9a83d3f82ae845cfd207acd7dd3cc70606d83ecd9e546-merged.mount: Deactivated successfully. Feb 1 04:42:34 localhost podman[286523]: 2026-02-01 09:42:34.802695441 +0000 UTC m=+0.074732496 container remove 9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_banach, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.41.4, architecture=x86_64, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1764794109, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, vcs-type=git, version=7, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Feb 1 04:42:34 localhost systemd[1]: libpod-conmon-9a0223a5a383c116472f808f2522c235a0eaa6dea093335ab1ce038270c9a46c.scope: Deactivated successfully. Feb 1 04:42:34 localhost systemd[1]: Reloading. Feb 1 04:42:34 localhost systemd-rc-local-generator[286561]: /etc/rc.d/rc.local is not marked executable, skipping. 
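Editor's note: "/etc/rc.d/rc.local is not marked executable, skipping" comes from systemd-rc-local-generator, which only synthesizes rc-local.service when that file carries an execute bit; on hosts that do not use rc.local this is expected noise on every daemon reload. A sketch of the same check, assuming the stock path:

    import os
    import stat

    RC = "/etc/rc.d/rc.local"
    try:
        mode = os.stat(RC).st_mode
    except FileNotFoundError:
        print(f"{RC} does not exist")
    else:
        # The generator only emits rc-local.service when the execute bit is
        # set; "not marked executable, skipping" above reports the opposite.
        print(f"{RC} executable: {bool(mode & stat.S_IXUSR)}")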
Feb 1 04:42:34 localhost systemd-sysv-generator[286564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:42:34 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:34 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:34 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:34 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: Reloading. Feb 1 04:42:35 localhost systemd-sysv-generator[286608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 1 04:42:35 localhost systemd-rc-local-generator[286601]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 1 04:42:35 localhost systemd[1]: Starting Ceph mon.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e... 
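Editor's note: the two back-to-back reloads above repeat identical warnings because each reload re-runs all generators and re-parses every unit. "Failed to parse service type, ignoring: notify-reload" is version skew rather than breakage: Type=notify-reload was added upstream in systemd 253, so the older systemd here drops the directive from the libvirt modular daemon units (leaving them with the default service type), and the insights-client unit separately still uses the deprecated MemoryLimit= where MemoryMax= is expected. A rough lint for both patterns; the path matches the warnings, but the heuristics are mine, not any tool's:

    import pathlib

    UNIT_DIR = pathlib.Path("/usr/lib/systemd/system")  # tree cited in the warnings
    SUSPECT = {
        "Type=notify-reload": "needs systemd >= 253; older systemd ignores it",
        "MemoryLimit=": "deprecated; use MemoryMax= instead",
    }

    for unit in sorted(UNIT_DIR.glob("*.service")):
        try:
            lines = unit.read_text().splitlines()
        except OSError:
            continue
        for lineno, line in enumerate(lines, start=1):
            for needle, why in SUSPECT.items():
                if line.strip().startswith(needle):
                    print(f"{unit}:{lineno}: {line.strip()}  ({why})")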
Feb 1 04:42:35 localhost podman[286671]: Feb 1 04:42:35 localhost podman[286671]: 2026-02-01 09:42:35.912141466 +0000 UTC m=+0.075165560 container create 4e9c4a1fd3d50d973f9750a0052a648e66e4b933f5e9d7b123235f53c13b8de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, release=1764794109, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Feb 1 04:42:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe2ca532be3241024d45e64d7df7bee638aaa4130be97cbec688f793f2fe18/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe2ca532be3241024d45e64d7df7bee638aaa4130be97cbec688f793f2fe18/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe2ca532be3241024d45e64d7df7bee638aaa4130be97cbec688f793f2fe18/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe2ca532be3241024d45e64d7df7bee638aaa4130be97cbec688f793f2fe18/merged/var/lib/ceph/mon/ceph-np0005604212 supports timestamps until 2038 (0x7fffffff) Feb 1 04:42:35 localhost podman[286671]: 2026-02-01 09:42:35.964735382 +0000 UTC m=+0.127759476 container init 4e9c4a1fd3d50d973f9750a0052a648e66e4b933f5e9d7b123235f53c13b8de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, release=1764794109, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, 
description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 1 04:42:35 localhost podman[286671]: 2026-02-01 09:42:35.977138132 +0000 UTC m=+0.140162226 container start 4e9c4a1fd3d50d973f9750a0052a648e66e4b933f5e9d7b123235f53c13b8de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, maintainer=Guillaume Abrioux , release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 1 04:42:35 localhost bash[286671]: 4e9c4a1fd3d50d973f9750a0052a648e66e4b933f5e9d7b123235f53c13b8de8 Feb 1 04:42:35 localhost podman[286671]: 2026-02-01 09:42:35.886372005 +0000 UTC m=+0.049396209 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:35 localhost systemd[1]: Started Ceph mon.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e. 
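Editor's note: unlike the anonymous helpers above (affectionate_ptolemy, eloquent_banach), the long-lived daemon container gets a deterministic name, and the systemd unit that wraps it ("Ceph mon.np0005604212 for 33fac0b9-80c7-560f-918a-c92d3021ca1e") carries the same fsid and daemon name; the bash line just echoes the new container id back to the unit. A small parser for that naming convention, inferred only from the names visible in this log:

    import re

    # ceph-<fsid>-<daemon_type>-<daemon_id>, as in the "container create"
    # record above for ceph-33fac0b9-...-mon-np0005604212.
    NAME = re.compile(
        r"^ceph-(?P<fsid>[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})"
        r"-(?P<daemon_type>[a-z]+)-(?P<daemon_id>.+)$"
    )

    def parse(name):
        m = NAME.match(name)
        return m.groupdict() if m else None

    print(parse("ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604212"))
    # {'fsid': '33fac0b9-80c7-560f-918a-c92d3021ca1e',
    #  'daemon_type': 'mon', 'daemon_id': 'np0005604212'}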
Feb 1 04:42:36 localhost ceph-mon[286721]: set uid:gid to 167:167 (ceph:ceph) Feb 1 04:42:36 localhost ceph-mon[286721]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2 Feb 1 04:42:36 localhost ceph-mon[286721]: pidfile_write: ignore empty --pid-file Feb 1 04:42:36 localhost ceph-mon[286721]: load: jerasure load: lrc Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: RocksDB version: 7.9.2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Git sha 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Compile date 2025-09-23 00:00:00 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: DB SUMMARY Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: DB Session ID: 0OACS8BUSD4GZ2BGBVU8 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: CURRENT file: CURRENT Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: IDENTITY file: IDENTITY Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005604212/store.db dir, Total Num: 0, files: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005604212/store.db: 000004.log size: 886 ; Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.error_if_exists: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.create_if_missing: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.paranoid_checks: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.env: 0x558a9057e9e0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.fs: PosixFileSystem Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.info_log: 0x558a91facd20 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_file_opening_threads: 16 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.statistics: (nil) Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.use_fsync: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_log_file_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.log_file_time_to_roll: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.keep_log_file_num: 1000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.recycle_log_file_num: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.allow_fallocate: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.allow_mmap_reads: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.allow_mmap_writes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.use_direct_reads: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.create_missing_column_families: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.db_log_dir: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.wal_dir: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.table_cache_numshardbits: 6 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 1 
04:42:36 localhost ceph-mon[286721]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.advise_random_on_open: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.db_write_buffer_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.write_buffer_manager: 0x558a91fbd540 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.use_adaptive_mutex: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.rate_limiter: (nil) Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.wal_recovery_mode: 2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.enable_thread_tracking: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.enable_pipelined_write: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.unordered_write: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.row_cache: None Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.wal_filter: None Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.allow_ingest_behind: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.two_write_queues: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.manual_wal_flush: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.wal_compression: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.atomic_flush: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.persist_stats_to_disk: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.log_readahead_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.best_efforts_recovery: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.allow_data_in_errors: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.db_host_id: __hostname__ Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.enforce_single_del_contracts: true Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_background_jobs: 2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: 
Options.max_background_compactions: -1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_subcompactions: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.delayed_write_rate : 16777216 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_total_wal_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.stats_dump_period_sec: 600 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.stats_persist_period_sec: 600 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_open_files: -1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bytes_per_sync: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_readahead_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_background_flushes: -1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Compression algorithms supported: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kZSTD supported: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kXpressCompression supported: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kBZip2Compression supported: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kLZ4Compression supported: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kZlibCompression supported: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: #011kSnappyCompression supported: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: DMutex implementation: pthread_mutex_t Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005604212/store.db/MANIFEST-000005 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.merge_operator: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_filter: None Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_filter_factory: None Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.sst_partitioner_factory: None Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.memtable_factory: SkipListFactory Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.table_factory: BlockBasedTable Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x558a91fac980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 
pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x558a91fa9350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.write_buffer_size: 33554432 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_write_buffer_number: 2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression: NoCompression Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression: Disabled Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.prefix_extractor: nullptr Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.num_levels: 7 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.window_bits: -14 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.level: 32767 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.strategy: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 1 04:42:36 
localhost ceph-mon[286721]: rocksdb: Options.compression_opts.enabled: false Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.target_file_size_base: 67108864 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.target_file_size_multiplier: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_base: 268435456 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.arena_block_size: 1048576 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.disable_auto_compactions: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: 
Options.table_properties_collectors: Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.inplace_update_support: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.memtable_huge_page_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.bloom_locality: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.max_successive_merges: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.paranoid_file_checks: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.force_consistency_checks: 1 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.report_bg_io_stats: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.ttl: 2592000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.enable_blob_files: false Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.min_blob_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.blob_file_size: 268435456 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.blob_compression_type: NoCompression Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.enable_blob_garbage_collection: false Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.blob_file_starting_level: 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005604212/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 45378c7f-5201-4192-8849-dfb55e3150db Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938956037963, "job": 1, "event": "recovery_started", "wal_files": [4]} Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938956040074, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938956040185, "job": 1, "event": "recovery_finished"} Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x558a91fd0e00 Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: DB pointer 0x558a920c6000 Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212 does not exist in monmap, will attempt to join an existing cluster Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:42:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) 
Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558a91fa9350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 4.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,1.08 KB,0.000205636%)#012#012** File Read Latency Histogram By Level [default] ** Feb 1 04:42:36 localhost ceph-mon[286721]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] Feb 1 04:42:36 localhost ceph-mon[286721]: starting mon.np0005604212 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005604212 fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(???) 
e0 preinit fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing) e8 sync_obtain_latest_monmap Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8 Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).mds e16 new map Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).mds e16 print_map#012e16#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-01T07:59:04.480309+0000#012modified#0112026-02-01T09:39:55.510678+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01179#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26329}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26329 members: 26329#012[mds.mds.np0005604212.tkdkxt{0:26329} state up:active seq 12 addr [v2:172.18.0.106:6808/1133321306,v1:172.18.0.106:6809/1133321306] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005604215.rwvxvg{-1:16872} state up:standby seq 1 addr [v2:172.18.0.108:6808/2262553558,v1:172.18.0.108:6809/2262553558] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005604213.jdbvyh{-1:16878} state up:standby seq 1 addr [v2:172.18.0.107:6808/3323601884,v1:172.18.0.107:6809/3323601884] compat {c=[1],r=[1],i=[17ff]}] Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).osd e82 crush map has features 3314933000852226048, adjusting msgr requires Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).osd e82 crush map has features 288514051259236352, adjusting msgr requires Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).osd e82 crush map has features 288514051259236352, adjusting msgr requires Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).osd e82 crush map has features 288514051259236352, adjusting msgr requires Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604210 (monmap changed)... 
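Editor's note: everything from "set uid:gid to 167:167" down to the monmap messages above is a single ceph-mon cold start: it opens its RocksDB store, prints every effective option, recovers write-ahead log #4 into SST file 8 and deletes the old log (the EVENT_LOG_v1 records are plain JSON after the prefix), then notices "mon.np0005604212 does not exist in monmap" and enters the synchronizing state to join the existing quorum. The option dump is tedious to read inline but trivially machine-readable; a throwaway parser keyed to the "rocksdb: Options.<name>: <value>" shape above:

    import re

    # Single-token values cover the numeric options shown above. Multi-word
    # values ("Supported on x86") get truncated, and records with an empty
    # value (e.g. Options.merge_operator) will wrongly grab the next token
    # in this run-together log; acceptable for a quick look.
    OPT = re.compile(r"rocksdb: (Options\.[\w.\[\]]+)\s*:\s*(\S+)")

    def rocksdb_options(path):
        opts = {}
        with open(path) as fh:
            for line in fh:
                for key, val in OPT.findall(line):
                    opts[key] = val
        return opts

    opts = rocksdb_options("messages.txt")         # hypothetical capture file
    print(opts.get("Options.write_buffer_size"))   # 33554432 per the dump above
    print(opts.get("Options.max_open_files"))      # -1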
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604210 (monmap changed)... Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)... Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain Feb 1 04:42:36 localhost ceph-mon[286721]: from='client.? 172.18.0.103:0/3887042624' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: Activating manager daemon np0005604211.cuflqz Feb 1 04:42:36 localhost ceph-mon[286721]: from='client.? 172.18.0.103:0/3887042624' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 1 04:42:36 localhost ceph-mon[286721]: Manager daemon np0005604211.cuflqz is now available Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:36 localhost ceph-mon[286721]: [01/Feb/2026:09:41:32] ENGINE Bus STARTING Feb 1 04:42:36 localhost ceph-mon[286721]: [01/Feb/2026:09:41:32] ENGINE Serving on https://172.18.0.105:7150 Feb 1 04:42:36 localhost ceph-mon[286721]: [01/Feb/2026:09:41:32] ENGINE Client ('172.18.0.105', 36928) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 1 04:42:36 localhost ceph-mon[286721]: [01/Feb/2026:09:41:32] ENGINE Serving on http://172.18.0.105:8765 
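Editor's note: the burst of from='mgr.14120 ...' and from='mgr.14190 ...' records above is the joining monitor logging manager activity it learns about while catching up: a "mgr fail" issued by client.admin, standby np0005604211.cuflqz taking over, the dashboard's CherryPy engine binding its ports, and a stream of config rm commands. Each cmd={...} payload is well-formed JSON, so it can be decoded directly; a sketch matching the exact quoting used here, with a hypothetical capture filename:

    import json
    import re

    # Matches records such as:
    #   entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm",
    #   "who": "osd.2", "name": "osd_memory_target"} : dispatch
    CMD = re.compile(r"entity='(?P<entity>[^']+)' cmd=(?P<cmd>\{.*?\}) : dispatch")

    def dispatched(path):
        with open(path) as fh:
            for line in fh:
                for m in CMD.finditer(line):
                    yield m.group("entity"), json.loads(m.group("cmd"))

    for entity, cmd in dispatched("messages.txt"):
        if cmd.get("prefix") == "config rm":
            print(entity, cmd.get("who"), cmd.get("name"))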
Feb 1 04:42:36 localhost ceph-mon[286721]: [01/Feb/2026:09:41:32] ENGINE Bus STARTED
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604209", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604209", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:42:36 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 1 04:42:36 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:42:36 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604209.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604209.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
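The "Adjusting osd_memory_target ... to 836.6M" / "Unable to set ... below minimum 939524096" pairs above are cephadm's memory autotuning at work: with osd_memory_target_autotune enabled it budgets roughly host RAM multiplied by mgr/cephadm/autotune_memory_target_ratio (0.7 by default), subtracts an allowance for the non-OSD daemons colocated on the host, and splits the remainder across the OSDs; the mon then refuses any result under the hard osd_memory_target floor of 939524096 bytes (896 MiB). A back-of-the-envelope sketch of that arithmetic; the overhead figure is an illustrative guess, not cephadm's exact accounting:

OSD_MEMORY_TARGET_MIN = 939_524_096  # 896 MiB, the floor the mon enforces

def autotune_per_osd(total_mem_bytes, num_osds,
                     ratio=0.7, non_osd_overhead_bytes=10 * 2**30):
    # Budget a share of host RAM for OSDs, then divide it evenly among them.
    budget = total_mem_bytes * ratio - non_osd_overhead_bytes
    per_osd = int(budget // num_osds)
    if per_osd < OSD_MEMORY_TARGET_MIN:
        # Mirrors the logged refusal, e.g. "Value '877246668' is below minimum 939524096".
        raise ValueError(f"Value '{per_osd}' is below minimum {OSD_MEMORY_TARGET_MIN}")
    return per_osd

On these ~16 GiB hosts the computed per-OSD share lands near 877246668 bytes (the 836.6M in the log), under the floor, so the setting is rejected and the existing config is removed instead.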
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Remove daemons mon.np0005604209
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Safe to remove mon.np0005604209: new quorum should be ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213', 'np0005604212'] (from ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213', 'np0005604212'])
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing monitor np0005604209 from monmap...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon rm", "name": "np0005604209"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing daemon mon.np0005604209 from np0005604209.localdomain -- ports []
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604210 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3)
Feb 1 04:42:36 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604210 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4)
Feb 1 04:42:36 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Removed label mon from host np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)...
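Before the "mon rm" above, cephadm logged "Safe to remove mon.np0005604209: new quorum should be [...]": it only shrinks the monmap when the survivors can still form a majority, and the elections that follow show exactly that quorum re-forming. A simplified version of such a pre-flight check, reading membership from the real `ceph quorum_status` JSON; the majority test is a deliberate simplification of cephadm's logic (recent releases also expose `ceph mon ok-to-rm`, which is the built-in way to ask):

import json, subprocess

def safe_to_remove(mon_id):
    out = subprocess.run(["ceph", "quorum_status", "--format=json"],
                         check=True, capture_output=True, text=True).stdout
    qs = json.loads(out)
    monmap = [m["name"] for m in qs["monmap"]["mons"]]
    in_quorum = qs["quorum_names"]
    new_monmap = [m for m in monmap if m != mon_id]
    new_quorum = [m for m in in_quorum if m != mon_id]
    # The mons still in quorum must be a majority of the shrunken monmap.
    return len(new_quorum) > len(new_monmap) // 2

if safe_to_remove("np0005604209"):
    subprocess.run(["ceph", "mon", "rm", "np0005604209"], check=True)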
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Removed label mgr from host np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Removed label _admin from host np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing np0005604209.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Added label _no_schedule to host np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:42:36 localhost ceph-mon[286721]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing daemon crash.np0005604209 from np0005604209.localdomain -- ports []
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"}]': finished
Feb 1 04:42:36 localhost ceph-mon[286721]: Removed host np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing key for client.crash.np0005604209.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"}]': finished
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604210 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: Saving service mon spec with placement label:mon
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Remove daemons mon.np0005604212
Feb 1 04:42:36 localhost ceph-mon[286721]: Safe to remove mon.np0005604212: new quorum should be ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213'] (from ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213'])
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing monitor np0005604212 from monmap...
Feb 1 04:42:36 localhost ceph-mon[286721]: Removing daemon mon.np0005604212 from np0005604212.localdomain -- ports []
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604210 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: Health check failed: 1/4 mons down, quorum np0005604211,np0005604215,np0005604213 (MON_DOWN)
Feb 1 04:42:36 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3)
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005604211,np0005604215,np0005604213)
Feb 1 04:42:36 localhost ceph-mon[286721]: Cluster is now healthy
Feb 1 04:42:36 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)...
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Deploying daemon mon.np0005604212 on np0005604212.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
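Each "Reconfiguring ..." pass above is preceded by an auth get or auth get-or-create so the daemon is redeployed with the right keyring, and the caps in the log are the standard per-daemon profiles (for an MDS: mon 'profile mds', osd 'allow rw tag cephfs *=*', mds 'allow'). A sketch of the CLI equivalent of the logged MDS request, run from any host with an admin keyring:

import subprocess

# Mirrors cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", ...}:
# prints the (possibly newly minted) keyring for that MDS daemon.
keyring = subprocess.run(
    ["ceph", "auth", "get-or-create", "mds.mds.np0005604215.rwvxvg",
     "mon", "profile mds",
     "osd", "allow rw tag cephfs *=*",
     "mds", "allow"],
    check=True, capture_output=True, text=True).stdout
print(keyring)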
Feb 1 04:42:36 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:42:36 localhost ceph-mon[286721]: mon.np0005604212@-1(synchronizing).paxosservice(auth 1..36) refresh upgraded, format 0 -> 3
Feb 1 04:42:36 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571df2fe160 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Feb 1 04:42:36 localhost podman[286833]: 2026-02-01 09:42:36.911142749 +0000 UTC m=+0.106894845 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, ceph=True, release=1764794109, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 04:42:36 localhost nova_compute[274651]: 2026-02-01 09:42:36.929 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:37 localhost podman[286833]: 2026-02-01 09:42:37.026588385 +0000 UTC m=+0.222340491 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109)
Feb 1 04:42:38 localhost ceph-mon[286721]: mon.np0005604212@-1(probing) e9 my rank is now 4 (was -1)
Feb 1 04:42:38 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election
Feb 1 04:42:38 localhost ceph-mon[286721]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1
Feb 1 04:42:38 localhost ceph-mon[286721]: mon.np0005604212@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:42:39 localhost nova_compute[274651]: 2026-02-01 09:42:39.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:42:39 localhost nova_compute[274651]: 2026-02-01 09:42:39.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.296 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:42:40 localhost nova_compute[274651]: 2026-02-01 09:42:40.296 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:42:40 localhost ceph-mon[286721]: mon.np0005604212@4(electing) e9 handle_auth_request failed to assign global_id
Feb 1 04:42:40 localhost ceph-mon[286721]: mon.np0005604212@4(electing) e9 handle_auth_request failed to assign global_id
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(electing) e9 handle_auth_request failed to assign global_id
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(electing) e9 handle_auth_request failed to assign global_id
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:42:41 localhost ceph-mon[286721]: mgrc update_daemon_metadata mon.np0005604212 metadata {addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005604212.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005604212.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604210 calling monitor election
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4)
Feb 1 04:42:41 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:42:41 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:41 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_auth_request failed to assign global_id
Feb 1 04:42:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:42:41.704 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:42:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:42:41.705 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:42:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:42:41.706 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:42:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_auth_request failed to assign global_id
Feb 1 04:42:41 localhost nova_compute[274651]: 2026-02-01 09:42:41.935 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.214 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack
--conf /etc/ceph/ceph.conf" returned: 0 in 1.918s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:42:42 localhost systemd[1]: tmp-crun.Qz31YG.mount: Deactivated successfully. Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.454 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.455 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:42:42 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:42:42 localhost podman[287065]: 2026-02-01 09:42:42.496872605 +0000 UTC m=+0.138575368 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:42:42 localhost podman[287065]: 2026-02-01 09:42:42.508455821 +0000 UTC m=+0.150158564 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:42:42 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
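[Annotation] The resource-tracker audit above shells out to `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` and derives the free-disk figure reported a few entries later. A minimal sketch of that probe, assuming the usual `stats.total_bytes` / `stats.total_avail_bytes` keys in the `ceph df` JSON (key names are typical of recent releases, not confirmed by this log):

```python
# Sketch: reproduce the "ceph df" capacity probe seen in the log.
# Assumes the ceph CLI is on PATH and that its JSON output carries a
# top-level "stats" object with total_bytes/total_avail_bytes
# (hedged: key names are not confirmed by this log).
import json
import subprocess

def ceph_capacity_gib(conf="/etc/ceph/ceph.conf", user="openstack"):
    out = subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", user, "--conf", conf],
        timeout=30,
    )
    stats = json.loads(out)["stats"]
    gib = 1024 ** 3
    return stats["total_bytes"] / gib, stats["total_avail_bytes"] / gib

if __name__ == "__main__":
    total, avail = ceph_capacity_gib()
    print(f"total={total:.1f}GiB avail={avail:.1f}GiB")
```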
Feb 1 04:42:42 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_auth_request failed to assign global_id Feb 1 04:42:42 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_auth_request failed to assign global_id Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.661 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.662 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11565MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.662 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.662 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.814 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.814 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.814 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:42:42 localhost nova_compute[274651]: 2026-02-01 09:42:42.853 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:42:42 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_auth_request failed to assign global_id Feb 1 04:42:43 localhost nova_compute[274651]: 2026-02-01 09:42:43.263 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:42:43 localhost nova_compute[274651]: 2026-02-01 09:42:43.271 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:42:43 localhost nova_compute[274651]: 2026-02-01 09:42:43.300 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:42:43 localhost nova_compute[274651]: 2026-02-01 09:42:43.303 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:42:43 localhost nova_compute[274651]: 2026-02-01 09:42:43.304 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:42:43 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:43 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:43 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:43 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:43 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.307 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.308 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.308 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:42:44 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:44 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:44 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:44 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:44 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 
localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:42:44 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.948 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.949 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.950 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:42:44 localhost nova_compute[274651]: 2026-02-01 09:42:44.950 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:42:45 localhost ceph-mon[286721]: Reconfiguring crash.np0005604210 (monmap changed)... 
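[Annotation] The `Acquiring lock "refresh_cache-<uuid>"` / `Acquired` / `Releasing` triplets above are oslo.concurrency's standard lock instrumentation. A minimal sketch of the same pattern (the lock name is taken from the log; the guarded function body is hypothetical):

```python
# Sketch: the oslo.concurrency lock pattern behind the
# "Acquiring/Acquired/Releasing lock" debug lines in this log.
from oslo_concurrency import lockutils

@lockutils.synchronized("refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02")
def refresh_instance_cache():
    # Hypothetical critical section: only one thread may rebuild the
    # network info cache for this instance at a time. Entry and exit
    # produce exactly the Acquiring/Acquired/Releasing debug lines
    # visible above.
    pass

refresh_instance_cache()
```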
Feb 1 04:42:45 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain Feb 1 04:42:45 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:45 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:45 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:45 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:45 localhost nova_compute[274651]: 2026-02-01 09:42:45.992 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.186 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.187 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.188 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.188 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.188 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:42:46 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)... Feb 1 04:42:46 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain Feb 1 04:42:46 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:46 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:46 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:46 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:46 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.934 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:42:46 localhost nova_compute[274651]: 2026-02-01 09:42:46.939 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:42:47 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)... Feb 1 04:42:47 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain Feb 1 04:42:47 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:47 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:47 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:42:47 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:42:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
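[Annotation] The mgr's `auth get-or-create` dispatches above create per-host crash-collector keys with `mon`/`mgr` "profile crash" caps. A sketch of the equivalent CLI invocation via subprocess (the hostname and caps are copied from the log; this mirrors the dispatched mon command, it is not cephadm's actual code path):

```python
# Sketch: CLI equivalent of the "auth get-or-create" mon command
# dispatched by the mgr in the log entries above.
import subprocess

def ensure_crash_key(host="np0005604211.localdomain"):
    # Creates the keyring entry for the per-host crash collector if
    # absent, with the caps shown in the log, and returns it.
    return subprocess.check_output([
        "ceph", "auth", "get-or-create",
        f"client.crash.{host}",
        "mon", "profile crash",
        "mgr", "profile crash",
    ]).decode()

print(ensure_crash_key())
```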
Feb 1 04:42:48 localhost podman[287481]: 2026-02-01 09:42:48.384651335 +0000 UTC m=+0.063107278 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Feb 1 04:42:48 localhost podman[287481]: 2026-02-01 09:42:48.38837101 +0000 UTC m=+0.066826923 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 1 04:42:48 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:42:48 localhost podman[287489]: Feb 1 04:42:48 localhost podman[287489]: 2026-02-01 09:42:48.447259849 +0000 UTC m=+0.107641918 container create ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_mayer, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux , release=1764794109, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, name=rhceph, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4) Feb 1 04:42:48 localhost podman[287489]: 2026-02-01 09:42:48.373350118 +0000 UTC m=+0.033732207 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:48 localhost systemd[1]: Started libpod-conmon-ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819.scope. Feb 1 04:42:48 localhost systemd[1]: Started libcrun container. 
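[Annotation] Each `Started /usr/bin/podman healthcheck run <id>` unit above is a transient systemd service firing the container's configured healthcheck; the following `health_status=healthy` container event records the result. A sketch of polling that state by hand (container name from the log; `{{.State.Health.Status}}` is podman's documented inspect field):

```python
# Sketch: manually exercise the same healthcheck that the transient
# systemd units trigger via "podman healthcheck run" in the log above.
import subprocess

def health_status(container="ovn_metadata_agent"):
    # Run the container's configured healthcheck (exit 0 == healthy)...
    run = subprocess.run(["podman", "healthcheck", "run", container])
    # ...then read back the recorded health state.
    status = subprocess.check_output(
        ["podman", "inspect", "--format",
         "{{.State.Health.Status}}", container]
    ).decode().strip()
    return run.returncode, status

print(health_status())
```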
Feb 1 04:42:48 localhost podman[287489]: 2026-02-01 09:42:48.532702043 +0000 UTC m=+0.193084142 container init ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_mayer, io.openshift.tags=rhceph ceph, name=rhceph, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, ceph=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=) Feb 1 04:42:48 localhost podman[287489]: 2026-02-01 09:42:48.551532731 +0000 UTC m=+0.211914810 container start ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_mayer, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:42:48 localhost podman[287489]: 2026-02-01 09:42:48.551766958 +0000 UTC m=+0.212149097 container attach ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_mayer, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, release=1764794109, com.redhat.component=rhceph-container, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, distribution-scope=public) Feb 1 04:42:48 localhost systemd[1]: libpod-ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819.scope: Deactivated successfully. Feb 1 04:42:48 localhost recursing_mayer[287517]: 167 167 Feb 1 04:42:48 localhost podman[287489]: 2026-02-01 09:42:48.564865351 +0000 UTC m=+0.225247600 container died ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_mayer, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main) Feb 1 04:42:48 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)... 
Feb 1 04:42:48 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain Feb 1 04:42:48 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:48 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:48 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:42:48 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:42:48 localhost podman[287522]: 2026-02-01 09:42:48.649581623 +0000 UTC m=+0.077555804 container remove ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_mayer, architecture=x86_64, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:42:48 localhost systemd[1]: libpod-conmon-ee6e814ed8827f58e39193c9c21736ed696b23fb7cde02a9fadbb08ca261e819.scope: Deactivated successfully. 
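[Annotation] The short-lived `rhceph-7-rhel9` container above (`recursing_mayer`: create, start, a single `167 167` line, died, remove, all within one second) is consistent with cephadm probing the ceph uid/gid inside the image before reconfiguring daemons. A hypothetical reproduction of such a probe; the `stat` invocation and probed path are assumptions, only the image reference and the `167 167` output come from the log:

```python
# Sketch: a uid/gid probe matching the throwaway rhceph containers in
# the log, each of which prints "167 167" and exits immediately.
# The stat command and /var/lib/ceph path are assumptions.
import subprocess

IMAGE = "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"

def probe_ceph_ids(image=IMAGE):
    out = subprocess.check_output(
        ["podman", "run", "--rm", "--entrypoint", "stat",
         image, "-c", "%u %g", "/var/lib/ceph"]
    )
    uid, gid = out.split()
    return int(uid), int(gid)

print(probe_ceph_ids())  # expected: (167, 167) for RHCS 7 images
```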
Feb 1 04:42:49 localhost podman[287592]: Feb 1 04:42:49 localhost podman[287592]: 2026-02-01 09:42:49.358868227 +0000 UTC m=+0.054677421 container create f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_dewdney, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, ceph=True, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True) Feb 1 04:42:49 localhost systemd[1]: tmp-crun.VeGlMi.mount: Deactivated successfully. Feb 1 04:42:49 localhost systemd[1]: var-lib-containers-storage-overlay-35bf6459a63ac4fb053f6f2427897cd0b98e1bf7350ad4ffa86ce74b91a965f1-merged.mount: Deactivated successfully. Feb 1 04:42:49 localhost systemd[1]: Started libpod-conmon-f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881.scope. Feb 1 04:42:49 localhost systemd[1]: Started libcrun container. 
Feb 1 04:42:49 localhost podman[287592]: 2026-02-01 09:42:49.431066225 +0000 UTC m=+0.126875409 container init f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_dewdney, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:42:49 localhost podman[287592]: 2026-02-01 09:42:49.334371064 +0000 UTC m=+0.030180278 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:49 localhost podman[287592]: 2026-02-01 09:42:49.438473562 +0000 UTC m=+0.134282766 container start f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_dewdney, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=7, GIT_BRANCH=main, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True) Feb 1 04:42:49 localhost podman[287592]: 2026-02-01 09:42:49.43873166 +0000 UTC m=+0.134540854 container attach f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_dewdney, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, 
io.openshift.expose-services=, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:42:49 localhost laughing_dewdney[287607]: 167 167 Feb 1 04:42:49 localhost systemd[1]: libpod-f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881.scope: Deactivated successfully. Feb 1 04:42:49 localhost podman[287592]: 2026-02-01 09:42:49.444310381 +0000 UTC m=+0.140119615 container died f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_dewdney, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, release=1764794109, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vcs-type=git, RELEASE=main) Feb 1 04:42:49 localhost podman[287612]: 2026-02-01 09:42:49.521583024 +0000 UTC m=+0.064514592 container remove f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_dewdney, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.buildah.version=1.41.4, release=1764794109, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, name=rhceph, build-date=2025-12-08T17:28:53Z, distribution-scope=public, description=Red Hat Ceph Storage 7) Feb 1 04:42:49 localhost systemd[1]: libpod-conmon-f48b689eecd0a181116b0e6fc5b8b2e2eb4f7cad4d201664947915b5db936881.scope: Deactivated successfully. Feb 1 04:42:49 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)... Feb 1 04:42:49 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:49 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost podman[287687]: Feb 1 04:42:50 localhost podman[287687]: 2026-02-01 09:42:50.303876421 +0000 UTC m=+0.071934520 container create cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_murdock, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, release=1764794109, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public) Feb 1 04:42:50 
localhost systemd[1]: Started libpod-conmon-cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be.scope. Feb 1 04:42:50 localhost systemd[1]: Started libcrun container. Feb 1 04:42:50 localhost podman[287687]: 2026-02-01 09:42:50.35853278 +0000 UTC m=+0.126590879 container init cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_murdock, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=1764794109, build-date=2025-12-08T17:28:53Z, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 1 04:42:50 localhost podman[287687]: 2026-02-01 09:42:50.367459294 +0000 UTC m=+0.135517363 container start cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_murdock, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vcs-type=git, version=7) Feb 1 04:42:50 localhost podman[287687]: 2026-02-01 09:42:50.36764089 +0000 UTC m=+0.135698999 container attach cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_murdock, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, release=1764794109, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:42:50 localhost bold_murdock[287702]: 167 167 Feb 1 04:42:50 localhost systemd[1]: libpod-cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be.scope: Deactivated successfully. Feb 1 04:42:50 localhost podman[287687]: 2026-02-01 09:42:50.370317772 +0000 UTC m=+0.138375881 container died cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_murdock, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=1764794109, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, version=7, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container) Feb 1 04:42:50 localhost podman[287687]: 2026-02-01 09:42:50.272612391 +0000 UTC m=+0.040670510 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:50 localhost systemd[1]: tmp-crun.O3SdOf.mount: Deactivated successfully. Feb 1 04:42:50 localhost systemd[1]: var-lib-containers-storage-overlay-9324bae8a23420c18a029a003e665d2dd6f163f98834eade0c37b761a05be63e-merged.mount: Deactivated successfully. Feb 1 04:42:50 localhost systemd[1]: var-lib-containers-storage-overlay-b7b1359e100a194335c16de478616cd46b2b09a919fd5d1cfa6a7bbd73c16d09-merged.mount: Deactivated successfully. 
Feb 1 04:42:50 localhost podman[287707]: 2026-02-01 09:42:50.462025649 +0000 UTC m=+0.083231848 container remove cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_murdock, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 04:42:50 localhost systemd[1]: libpod-conmon-cf1ea5819411572510d318fc3e11ed8b73135326305c2bf44bde7f7c635461be.scope: Deactivated successfully. Feb 1 04:42:50 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)... Feb 1 04:42:50 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain Feb 1 04:42:50 localhost ceph-mon[286721]: Reconfig service osd.default_drive_group Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:50 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e82 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Feb 1 04:42:50 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e82 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Feb 1 04:42:50 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e83 e83: 6 total, 6 up, 6 in Feb 1 04:42:51 localhost systemd-logind[759]: Session 64 logged out. Waiting for processes to exit. Feb 1 04:42:51 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e83 _set_new_cache_sizes cache_size:1019650627 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:42:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
Feb 1 04:42:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:42:51 localhost sshd[287833]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:42:51 localhost podman[287782]: 2026-02-01 09:42:51.359943796 +0000 UTC m=+0.095593147 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:42:51 localhost podman[287804]: Feb 1 04:42:51 localhost podman[287804]: 2026-02-01 09:42:51.369906173 +0000 UTC m=+0.075382737 container create 5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_herschel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, release=1764794109, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, distribution-scope=public) Feb 1 04:42:51 localhost podman[287783]: 2026-02-01 09:42:51.323421355 +0000 UTC m=+0.059418966 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:42:51 localhost podman[287782]: 2026-02-01 09:42:51.393290891 +0000 UTC m=+0.128940202 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:42:51 localhost systemd[1]: Started libpod-conmon-5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384.scope. Feb 1 04:42:51 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:42:51 localhost podman[287783]: 2026-02-01 09:42:51.414315957 +0000 UTC m=+0.150313598 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:42:51 localhost systemd[1]: tmp-crun.LiwaQZ.mount: Deactivated successfully. Feb 1 04:42:51 localhost systemd[1]: Started libcrun container. Feb 1 04:42:51 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:42:51 localhost podman[287804]: 2026-02-01 09:42:51.438566601 +0000 UTC m=+0.144043165 container init 5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_herschel, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, version=7, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:42:51 localhost podman[287804]: 2026-02-01 09:42:51.446881066 +0000 UTC m=+0.152357670 container start 5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_herschel, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Feb 1 04:42:51 localhost podman[287804]: 2026-02-01 09:42:51.449039023 +0000 UTC m=+0.154515627 container attach 5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_herschel, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, 
io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1764794109, GIT_BRANCH=main, distribution-scope=public, RELEASE=main, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:42:51 localhost goofy_herschel[287850]: 167 167 Feb 1 04:42:51 localhost systemd[1]: libpod-5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384.scope: Deactivated successfully. Feb 1 04:42:51 localhost podman[287804]: 2026-02-01 09:42:51.349764354 +0000 UTC m=+0.055240948 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:51 localhost podman[287804]: 2026-02-01 09:42:51.451031764 +0000 UTC m=+0.156508338 container died 5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_herschel, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, name=rhceph, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, release=1764794109, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , ceph=True, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=) Feb 1 04:42:51 localhost systemd-logind[759]: New session 67 of user ceph-admin. Feb 1 04:42:51 localhost systemd[1]: Started Session 67 of User ceph-admin. 
Feb 1 04:42:51 localhost podman[287856]: 2026-02-01 09:42:51.517036241 +0000 UTC m=+0.059510018 container remove 5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_herschel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:42:51 localhost systemd[1]: libpod-conmon-5c5377ffc5c8aea55107b662ee84dc0cded0e4dacfb2555e76d92da68c7b2384.scope: Deactivated successfully. Feb 1 04:42:51 localhost systemd[1]: session-64.scope: Deactivated successfully. Feb 1 04:42:51 localhost systemd[1]: session-64.scope: Consumed 26.155s CPU time. Feb 1 04:42:51 localhost systemd-logind[759]: Removed session 64. Feb 1 04:42:51 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)... Feb 1 04:42:51 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/1066355409' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: Activating manager daemon np0005604213.caiaeh Feb 1 04:42:51 localhost ceph-mon[286721]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 1 04:42:51 localhost ceph-mon[286721]: Manager daemon np0005604213.caiaeh is now available Feb 1 04:42:51 localhost ceph-mon[286721]: removing stray HostCache host record np0005604209.localdomain.devices.0 Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"}]': finished Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"}]': finished Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch Feb 1 04:42:51 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch Feb 1 04:42:51 localhost nova_compute[274651]: 2026-02-01 09:42:51.936 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:42:52 localhost systemd[1]: var-lib-containers-storage-overlay-5d79eccc6c6293f21e4fe358dcd944820dcf78ffbe58ab0e3150df8f3d7713a9-merged.mount: Deactivated successfully. Feb 1 04:42:52 localhost systemd[1]: tmp-crun.lMVxbo.mount: Deactivated successfully. 
Feb 1 04:42:52 localhost podman[287983]: 2026-02-01 09:42:52.521251594 +0000 UTC m=+0.112392063 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:42:52 localhost podman[287983]: 2026-02-01 09:42:52.632379368 +0000 UTC m=+0.223519807 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-12-08T17:28:53Z, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, version=7, ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Feb 1 04:42:52 localhost ceph-mon[286721]: [01/Feb/2026:09:42:52] ENGINE Bus STARTING Feb 1 04:42:52 localhost ceph-mon[286721]: [01/Feb/2026:09:42:52] ENGINE Serving on http://172.18.0.107:8765 Feb 1 04:42:52 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:53 localhost podman[236886]: time="2026-02-01T09:42:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:42:53 localhost podman[236886]: @ - - [01/Feb/2026:09:42:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 
04:42:54 localhost podman[236886]: @ - - [01/Feb/2026:09:42:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1" Feb 1 04:42:54 localhost ceph-mon[286721]: [01/Feb/2026:09:42:52] ENGINE Serving on https://172.18.0.107:7150 Feb 1 04:42:54 localhost ceph-mon[286721]: [01/Feb/2026:09:42:52] ENGINE Bus STARTED Feb 1 04:42:54 localhost ceph-mon[286721]: [01/Feb/2026:09:42:52] ENGINE Client ('172.18.0.107', 41850) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:54 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e83 _set_new_cache_sizes cache_size:1020046009 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' 
entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M Feb 1 04:42:56 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M Feb 1 04:42:56 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M Feb 1 04:42:56 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:42:56 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 1 04:42:56 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:42:56 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:56 localhost 
ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:56 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:42:56 localhost nova_compute[274651]: 2026-02-01 09:42:56.938 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:42:57 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:42:57 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:42:57 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:42:57 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:42:57 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:42:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:42:58 localhost systemd[1]: tmp-crun.513IQ1.mount: Deactivated successfully. Feb 1 04:42:58 localhost podman[288920]: 2026-02-01 09:42:58.919543906 +0000 UTC m=+0.092453870 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, config_id=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:42:58 localhost podman[288920]: 2026-02-01 09:42:58.936346873 +0000 UTC m=+0.109256877 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:42:58 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:42:59 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:42:59 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:42:59 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:42:59 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:42:59 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:42:59 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:42:59 localhost podman[288954]: Feb 1 04:42:59 localhost podman[288954]: 2026-02-01 09:42:59.29639287 +0000 UTC m=+0.077606004 container create 2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_solomon, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7) Feb 1 04:42:59 localhost systemd[1]: Started libpod-conmon-2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d.scope. Feb 1 04:42:59 localhost systemd[1]: Started libcrun container. Feb 1 04:42:59 localhost podman[288954]: 2026-02-01 09:42:59.264538892 +0000 UTC m=+0.045752066 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:42:59 localhost podman[288954]: 2026-02-01 09:42:59.366912977 +0000 UTC m=+0.148126111 container init 2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_solomon, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container) Feb 1 04:42:59 localhost podman[288954]: 2026-02-01 09:42:59.377739019 +0000 UTC m=+0.158952153 container start 2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_solomon, version=7, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:42:59 localhost podman[288954]: 2026-02-01 09:42:59.378098471 +0000 UTC m=+0.159311605 container attach 
2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_solomon, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, name=rhceph, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 1 04:42:59 localhost stupefied_solomon[288970]: 167 167 Feb 1 04:42:59 localhost systemd[1]: libpod-2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d.scope: Deactivated successfully. Feb 1 04:42:59 localhost podman[288954]: 2026-02-01 09:42:59.380965989 +0000 UTC m=+0.162179153 container died 2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_solomon, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Feb 1 04:42:59 localhost podman[288975]: 2026-02-01 09:42:59.479105263 +0000 UTC m=+0.091240474 container remove 2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_solomon, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1764794109, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, 
ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc.) Feb 1 04:42:59 localhost systemd[1]: libpod-conmon-2dd218b73c1fd235cc3bb16b8666995355780029ef8e9d977f342cf0a84f6e9d.scope: Deactivated successfully. Feb 1 04:42:59 localhost systemd[1]: var-lib-containers-storage-overlay-e2b6a8ffb833a9e6ddbecd45364b194307f88578d31d88af2af857ce6d365fa9-merged.mount: Deactivated successfully. Feb 1 04:43:00 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)... Feb 1 04:43:00 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain Feb 1 04:43:00 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 1 04:43:00 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 1 04:43:00 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:00 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:00 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:43:00 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:43:00 localhost podman[289046]: Feb 1 04:43:00 localhost podman[289046]: 2026-02-01 09:43:00.215730847 +0000 UTC m=+0.074425507 container create 0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mendeleev, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, maintainer=Guillaume Abrioux , 
description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:43:00 localhost systemd[1]: Started libpod-conmon-0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90.scope. Feb 1 04:43:00 localhost podman[289046]: 2026-02-01 09:43:00.185038554 +0000 UTC m=+0.043733254 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:43:00 localhost systemd[1]: Started libcrun container. Feb 1 04:43:00 localhost podman[289046]: 2026-02-01 09:43:00.297500838 +0000 UTC m=+0.156195498 container init 0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mendeleev, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:43:00 localhost podman[289046]: 2026-02-01 09:43:00.310877528 +0000 UTC m=+0.169572198 container start 0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mendeleev, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , RELEASE=main, description=Red Hat Ceph Storage 7, release=1764794109, io.openshift.tags=rhceph ceph, ceph=True, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Feb 1 04:43:00 localhost podman[289046]: 2026-02-01 09:43:00.31123113 +0000 UTC m=+0.169925840 container attach 0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mendeleev, 
GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1764794109, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.) Feb 1 04:43:00 localhost optimistic_mendeleev[289061]: 167 167 Feb 1 04:43:00 localhost systemd[1]: libpod-0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90.scope: Deactivated successfully. Feb 1 04:43:00 localhost podman[289046]: 2026-02-01 09:43:00.314438138 +0000 UTC m=+0.173132848 container died 0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mendeleev, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, version=7, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, GIT_CLEAN=True, release=1764794109) Feb 1 04:43:00 localhost podman[289067]: 2026-02-01 09:43:00.400579744 +0000 UTC m=+0.078289805 container remove 0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_mendeleev, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, 
release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git) Feb 1 04:43:00 localhost systemd[1]: libpod-conmon-0d2543999186080548dac7fc7578bdc05cbb2ec66bc2c7a5f8c403573dee5a90.scope: Deactivated successfully. Feb 1 04:43:00 localhost systemd[1]: tmp-crun.P1qRhE.mount: Deactivated successfully. Feb 1 04:43:00 localhost systemd[1]: var-lib-containers-storage-overlay-a06d3664020a921bf270fb1ff2b0e19bdfd5a5ee53c43306f3694a11f3cbcc02-merged.mount: Deactivated successfully. Feb 1 04:43:01 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054535 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:01 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)... Feb 1 04:43:01 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain Feb 1 04:43:01 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:01 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:01 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:43:01 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:43:01 localhost openstack_network_exporter[239441]: ERROR 09:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:43:01 localhost openstack_network_exporter[239441]: Feb 1 04:43:01 localhost openstack_network_exporter[239441]: ERROR 09:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:43:01 localhost openstack_network_exporter[239441]: Feb 1 04:43:01 localhost nova_compute[274651]: 2026-02-01 09:43:01.940 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:02 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)... Feb 1 04:43:02 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain Feb 1 04:43:02 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:02 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:02 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:02 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 1 04:43:03 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)... 
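[Editorial aside] The mgr-to-mon traffic logged above is all structured monitor commands: each cmd={"prefix": "auth get-or-create", ...} entry is a JSON command dispatched to the mon. As a minimal sketch only (this is not how cephadm itself issues the call, and it assumes a reachable cluster with admin credentials at the default /etc/ceph paths), the same command can be sent through the rados Python bindings; the entity and caps below are copied verbatim from the log:

    import json
    import rados

    # Assumption: admin keyring/conf at the default locations on this host.
    cluster = rados.Rados(conffile='/etc/ceph/ceph.conf')
    cluster.connect()
    try:
        # Same JSON mon command the log shows mgr.np0005604213.caiaeh dispatching.
        cmd = json.dumps({
            "prefix": "auth get-or-create",
            "entity": "mgr.np0005604212.oynhpm",
            "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"],
        })
        ret, outbuf, outs = cluster.mon_command(cmd, b'')
        print(ret, outbuf.decode(), outs)  # ret == 0 on success; outbuf carries the keyring
    finally:
        cluster.shutdown()

get-or-create is safe to re-dispatch on every reconfigure pass, as cephadm does here: if the entity already exists it returns the existing key, and it errors only if the requested caps conflict with the stored ones.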
Feb 1 04:43:03 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:43:03 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:03 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:03 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:03 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:03 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.527 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.528 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.560 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.561 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '69e02e2d-58ef-45ee-9118-82540439546a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.528386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6765bf7a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'ed28e066463ad846a29d7da94d17f58093b983e86eee77759d10e2a27b1f9b81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.528386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6765d596-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'e7119d4b3032ade3381cee48b76be757887df8832fccfb015455e69146595689'}]}, 'timestamp': '2026-02-01 09:43:03.561595', '_unique_id': '1d9ca046ae7740a59528726b040b760f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.565 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1d25d899-b5e7-4e01-ace0-ece1285d8c30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.564776', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6766686c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': '2dd97b253d0d0c4b12d0cf92bf8051368755e91611947a8133670b00d998c520'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.564776', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '67667cb2-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'df041ab5be773fac8564e7da2a37e3d47403d4ae0a4dfd45ca1884c2d60962c2'}]}, 'timestamp': '2026-02-01 09:43:03.565883', '_unique_id': '1473edf2e154432486d8706cb6ef5206'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.566 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:43:03.566 12 ERROR oslo_messaging.notify.messaging [... kombu and oslo.messaging frames identical to the 09:43:03.563 traceback above ...] Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '61e56dbe-053f-41ce-9d00-5bcfb0957576', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.568298', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6768dd40-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.787757503, 'message_signature': 'ca5be447a4247bb3c081476f7dbda6170387a6426a4d541afec84901dfe990b5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.568298', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6768f6a4-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.787757503, 'message_signature': 'ed5f6c2a82bb5f21e6f03f541418abd4608bf387526941e0b0f2ba42a0f79819'}]}, 'timestamp': '2026-02-01 09:43:03.582163', '_unique_id': '1e5503e785ba4e8b8719ee4108ca9756'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.583 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.583 12 
ERROR oslo_messaging.notify.messaging [... kombu and oslo.messaging frames identical to the 09:43:03.563 traceback above ...] Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.588 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '14671e7a-8720-4fb3-b98c-fcc502abeb81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.584908', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '676a134a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': '2f56543ccd839198886928c8fa6c08dc3604b5a70dd711875687a0a892dcbeee'}]}, 'timestamp': '2026-02-01 09:43:03.589484', '_unique_id': 'deb421c8973247fba3bdc64f55301ed4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.591 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.608 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 11770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
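The inner traceback bottoms out in amqp.transport._connect calling self.sock.connect(sa), so errno 111 is the operating system refusing the TCP connection to the broker: nothing is listening (or something actively rejects) on the AMQP port. A minimal probe of that condition, with the kombu/oslo.messaging layers stripped away; host and port below are assumptions (the agent's actual transport_url is not visible in this excerpt, 5672 is only the AMQP default):

import socket

# Hypothetical endpoint: the real transport_url lives in the agent's
# configuration, not in this log.
BROKER_HOST, BROKER_PORT = "localhost", 5672

try:
    # Same syscall the traceback ends in (sock.connect), nothing else.
    with socket.create_connection((BROKER_HOST, BROKER_PORT), timeout=5):
        print("AMQP port is accepting connections")
except ConnectionRefusedError as exc:
    # Matches the log: [Errno 111] Connection refused
    print(f"refused: {exc}")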
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e405a99-fabb-4366-bad9-55762ab1c543', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11770000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:43:03.591946', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '676d251c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.828157225, 'message_signature': 'acdd1d1f7d87891872c6e63c0bc8597d13c470d43f66dee424a84a1b19e929de'}]}, 'timestamp': '2026-02-01 09:43:03.609557', '_unique_id': '6072db85459c4168abe4e53c58ed4d95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
  [identical OperationalError traceback repeated at 09:43:03.610; omitted, see the full 09:43:03.590 traceback above]
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.612 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.612 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c2588db-113f-4159-9eaf-20df110fa8a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.612165', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '676d9f38-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': 'f4e5730a41aa7e7fc057f60efa0304553559aeb57a1be35944e6c90caef687c5'}]}, 'timestamp': '2026-02-01 09:43:03.612661', '_unique_id': '67278be0f276487cbb17d712f9aa8694'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
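Each failed notification above is a telemetry.polling envelope whose payload carries a samples list; between the repeated errors only counter_name, counter_type, counter_unit, counter_volume, resource_id, the message ids and the signatures vary, while the instance metadata stays constant. A short sketch of pulling the varying fields out of one such envelope (the dict literal is a trimmed, hypothetical subset of the payloads printed in this log, not a complete one):

# Trimmed, hypothetical subset of a telemetry.polling envelope.
envelope = {
    "event_type": "telemetry.polling",
    "priority": "SAMPLE",
    "payload": {
        "samples": [
            {
                "counter_name": "network.incoming.packets.drop",
                "counter_type": "cumulative",
                "counter_unit": "packet",
                "counter_volume": 0,
                "resource_id": "instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46",
            }
        ]
    },
}

for sample in envelope["payload"]["samples"]:
    # One line per measurement, convenient for grepping a dumped log.
    print(f'{sample["resource_id"]} {sample["counter_name"]}='
          f'{sample["counter_volume"]} {sample["counter_unit"]} ({sample["counter_type"]})')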
  [identical OperationalError traceback repeated at 09:43:03.613; omitted, see the full 09:43:03.590 traceback above]
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
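The rabbit driver reaches kombu through Connection.ensure_connection, which is the call that retries via retry_over_time and then re-raises the socket failure as kombu.exceptions.OperationalError (frames at kombu/connection.py lines 381, 433 and 450 in the traceback above). A sketch of that path in isolation; the broker URL here is a placeholder, not this deployment's credentials:

from kombu import Connection
from kombu.exceptions import OperationalError

# Placeholder URL: host, credentials and vhost are illustrative only.
conn = Connection("amqp://guest:guest@localhost:5672//")

try:
    # max_retries=1 keeps the example short; oslo.messaging wraps this
    # same call in its own retry/backoff policy.
    conn.ensure_connection(max_retries=1)
except OperationalError as exc:
    # With no broker listening this surfaces as the same
    # "[Errno 111] Connection refused" seen throughout the log.
    print(f"broker unreachable: {exc}")
finally:
    conn.release()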
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bca62038-aaef-40ba-8d32-3d7875b1ed9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.614827', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '676e091e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': 'b29cc85201b4539bd93ed3c60ebe647d293740369636ebef3c9e121514307672'}]}, 'timestamp': '2026-02-01 09:43:03.615347', '_unique_id': 'a4dd96a6194e45378e0468e65db14038'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
  [identical OperationalError traceback repeated at 09:43:03.616; omitted, see the full 09:43:03.590 traceback above]
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
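The SAMPLE priority and the 'notifications' target in every failure line correspond to oslo.messaging's notifier API, the layer that emits each "Could not send notification to notifications." message here. A minimal sketch of that emit path; the transport URL and publisher wiring below are assumptions for illustration, since ceilometer's actual configuration is not part of this excerpt:

from oslo_config import cfg
import oslo_messaging

# Assumed URL; the real transport_url lives in the agent's config.
transport = oslo_messaging.get_notification_transport(
    cfg.CONF, url="rabbit://guest:guest@localhost:5672/")

notifier = oslo_messaging.Notifier(
    transport,
    publisher_id="ceilometer.polling",  # as seen in the payloads above
    driver="messagingv2",
    topics=["notifications"])           # the target the errors name

# sample() is the SAMPLE-priority emit; with the broker down, this is
# the call that ends in kombu.exceptions.OperationalError.
notifier.sample({}, "telemetry.polling", {"samples": []})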
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '354a88ea-9eca-4da5-bb39-dde7ac6a5bed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.617710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '676e77a0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'ea3590025d0003afe02ab1e206583128a7ddf2f43b8a048773c383a935a61264'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.617710', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '676e88e4-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'e94f6f76bd7e2bbac95f13e14926c9eb4488026adac74eeb5dd637bac95c364a'}]}, 'timestamp': '2026-02-01 09:43:03.618586', '_unique_id': 'a08640fbfea340d6bcf0b4efb1db7a3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
  [identical OperationalError traceback repeated at 09:43:03.619; omitted, see the full 09:43:03.590 traceback above]
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
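The "The above exception was the direct cause of the following exception:" line in each traceback is Python's explicit exception chaining: kombu's _reraise_as_library_errors does raise ConnectionError(str(exc)) from exc, so the original ConnectionRefusedError survives as the new error's __cause__. The same shape in miniature (builtin ConnectionError stands in for kombu's library error):

def reraise_demo():
    try:
        # Stand-in for sock.connect(sa) failing against a dead broker.
        raise ConnectionRefusedError(111, "Connection refused")
    except ConnectionRefusedError as exc:
        # Same re-raise shape as kombu's _reraise_as_library_errors.
        raise ConnectionError(str(exc)) from exc

try:
    reraise_demo()
except ConnectionError as err:
    print(type(err).__name__, err)            # the library-level error
    print("caused by:", repr(err.__cause__))  # the original socket error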
Payload={'message_id': 'cce2433c-a023-4349-ad5c-6be9a85b046c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:43:03.620961', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '676ef7c0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.828157225, 'message_signature': '0e61549c786d3b82dd5d36b77bf53f9681fb0611980b5eccc2ad953a712446a5'}]}, 'timestamp': '2026-02-01 09:43:03.621438', '_unique_id': 'a5fe31c1d43a4683a64e928dbec7c873'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:43:03.622 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.622 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.624 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.624 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
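Each of these error reports contains two stacked tracebacks because of Python's exception chaining: the frames above end in raise ConnectionError(str(exc)) from exc, which sets __cause__ on the new exception (surfacing here as kombu.exceptions.OperationalError) and makes the log print both tracebacks joined by "The above exception was the direct cause of the following exception". A self-contained sketch of the same mechanism:

    # `raise ... from exc` sets __cause__, producing the chained
    # two-traceback output that repeats throughout this log.
    try:
        try:
            raise ConnectionRefusedError(111, 'Connection refused')
        except ConnectionRefusedError as exc:
            raise RuntimeError(str(exc)) from exc
    except RuntimeError as err:
        assert isinstance(err.__cause__, ConnectionRefusedError)
        print('cause:', err.__cause__)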
Payload={'message_id': '82ef2bba-3aea-4127-9224-9b3e4b754e24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.624050', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '676f6f84-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.787757503, 'message_signature': '619fdbbd70a455e12955399f569967239bbf87db3618a7dd4647eab03d61ade3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.624050', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '676f7f56-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.787757503, 'message_signature': '94dc480202a4c5e23f62247527073c41d6377444f8d2f28b615a90fd2f4647a6'}]}, 'timestamp': '2026-02-01 09:43:03.624894', '_unique_id': '520c57a56c93436c98acf79b234b7586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.625 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.627 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.627 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
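The disk.device.read.requests samples just polled are cumulative counters (1272 reads for vda, 124 for vdb), so a consumer derives a rate by differencing successive polls against the sample's monotonic_time. A minimal sketch of that arithmetic with a hypothetical earlier reading (this is the usual consumer-side treatment of cumulative counters, not ceilometer internals):

    def rate_per_second(prev, curr):
        # prev/curr: (monotonic_time in seconds, counter_volume) pairs.
        dt = curr[0] - prev[0]
        dv = curr[1] - prev[1]
        return dv / dt if dt > 0 else 0.0

    # Hypothetical poll 60 s earlier that had seen 1100 reads on vda:
    print(rate_per_second((11017.7, 1100), (11077.7, 1272)))  # ~2.87 req/s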
Payload={'message_id': '71c37e54-765c-478f-a872-6c0671b84968', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.627070', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '676fe554-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'b58b2862c0a62529e5fe540751d6a4e2fd3499bb0e0cf8291ac64a951f5ec55b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.627070', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '676ff800-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': '3c9e13bd387b792ada83e9d1e07ac144800e58c38ec6079433b96ffd9298f9fc'}]}, 'timestamp': '2026-02-01 09:43:03.627982', '_unique_id': 'd4c451602e214d36a44f750dcaf2ad57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.628 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
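Every failed send here passes through kombu's retry_over_time (the kombu/utils/functional.py frame) before the error is re-raised. A simplified sketch of that retry-with-linear-backoff shape; the parameter names mirror kombu's documented ensure_connection options, but the body is illustrative rather than kombu's actual implementation:

    import time

    def retry_over_time(fun, catch, max_retries=None,
                        interval_start=2, interval_step=2, interval_max=30):
        # Call fun(); on a caught exception, sleep and retry with a
        # linearly growing interval capped at interval_max. Re-raise
        # once max_retries is exhausted.
        interval = interval_start
        retries = 0
        while True:
            try:
                return fun()
            except catch:
                retries += 1
                if max_retries is not None and retries > max_retries:
                    raise
                time.sleep(interval)
                interval = min(interval + interval_step, interval_max)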
Payload={'message_id': 'b2a6d5d6-5027-414f-835d-1e4c27036824', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.630119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '67705c3c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.787757503, 'message_signature': '964286f205a5afc40338fed6aa9e5ea98759a3e27526b842d43a3419c3e338d0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.630119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '67706c0e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.787757503, 'message_signature': '67e67932bde324963d9db3cbaf61b374cfa3067a587b7b50552666e436119740'}]}, 'timestamp': '2026-02-01 09:43:03.631075', '_unique_id': 'f96e0a8b72114c7a9568a79ded7791b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.631 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.633 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
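The disk.device.read.latency values just sampled are cumulative nanoseconds of read time; divided by the cumulative disk.device.read.requests reported moments earlier, they give the mean per-read latency over the counters' lifetime. The arithmetic, using the values from this log:

    NS_PER_MS = 1_000_000

    def mean_read_latency_ms(total_latency_ns, total_reads):
        return total_latency_ns / total_reads / NS_PER_MS

    print(mean_read_latency_ms(1484399740, 1272))  # vda: ~1.17 ms per read
    print(mean_read_latency_ms(80474442, 124))     # vdb: ~0.65 ms per read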
Payload={'message_id': '9ab2aabf-d967-4930-804a-fa92c534316c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.633184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6770d64e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'ce8265551bd139639f1b8e55a5b5afffbb5da11600f1dadd63b756c69caad610'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.633184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6770f340-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': '6b5da9cffda9c4a17fb7f5290785c1f158f5673dbf8cf4e54cd8c56346439871'}]}, 'timestamp': '2026-02-01 09:43:03.634433', '_unique_id': '2a34ab734d484eac8cffd0f60ff91314'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 ERROR oslo_messaging.notify.messaging
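The chained traceback above is the whole diagnosis: the inner ConnectionRefusedError comes from a plain TCP connect() to the AMQP broker inside amqp/transport.py, and kombu re-raises it as kombu.exceptions.OperationalError, which oslo.messaging then logs once per unsent notification. A minimal sketch for confirming the same failure from the affected host; the broker host, port, and credentials are placeholders, since the transport URL does not appear in this log:

import socket

from kombu import Connection
from kombu.exceptions import OperationalError

BROKER_HOST = "localhost"  # assumption: the actual RabbitMQ host is not shown in this log
BROKER_PORT = 5672         # assumption: default AMQP port

# Transport-level probe: reproduces the inner ConnectionRefusedError
# ([Errno 111]) when nothing is listening on the broker port.
try:
    with socket.create_connection((BROKER_HOST, BROKER_PORT), timeout=3):
        print("TCP connect OK: something is listening on the broker port")
except ConnectionRefusedError as exc:
    print(f"TCP connect failed: {exc}")  # [Errno 111] Connection refused

# Library-level probe: kombu wraps that socket error in
# kombu.exceptions.OperationalError, matching the traceback above.
conn = Connection(f"amqp://guest:guest@{BROKER_HOST}:{BROKER_PORT}//")  # placeholder credentials
try:
    conn.ensure_connection(max_retries=1)
    print("AMQP connection established")
except OperationalError as exc:
    print(f"kombu OperationalError: {exc}")
finally:
    conn.release()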
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb9a7cb2-5eb1-442d-bcbc-000fa3d371c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.636045', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '67714354-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': 'd2888f0aaf543481965a66a343f349e5b37240315665ffd01303a4d91dc314f9'}]}, 'timestamp': '2026-02-01 09:43:03.636464', '_unique_id': '262a1d8a0496480bbc119295af2e0c7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
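Each repetition prints two tracebacks joined by "The above exception was the direct cause of the following exception:", which is Python's rendering of `raise ... from exc` in kombu's _reraise_as_library_errors (the `ConnectionError` name in that frame is bound to kombu.exceptions.OperationalError, which is why that type appears on the final line). A self-contained sketch of that chaining; the OperationalError class here is a stand-in, not kombu's:

class OperationalError(ConnectionError):
    """Stand-in for kombu.exceptions.OperationalError (assumption)."""

def establish_connection():
    # Mimics amqp's transport failing its socket connect().
    raise ConnectionRefusedError(111, "Connection refused")

try:
    try:
        establish_connection()
    except OSError as exc:
        # `from exc` sets __cause__, so Python prints both tracebacks
        # joined by the "direct cause" sentence seen in this log.
        raise OperationalError(str(exc)) from exc
except OperationalError as err:
    assert isinstance(err.__cause__, ConnectionRefusedError)
    print(f"outer: {err!r}; direct cause: {err.__cause__!r}")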
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.637 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '179db00f-1b79-40ed-81f7-19fc9de37a0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.637934', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '67718e22-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': '6204c58c8aecd81e352314ad10a4eb8105708072aa61af4ff2f457c437183a5d'}]}, 'timestamp': '2026-02-01 09:43:03.638363', '_unique_id': '93ae6267c2e54ee9b840ffb91207a20c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
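Since every polling cycle repeats the identical traceback, the useful signal in a flood like this is which samples are being dropped. A hypothetical triage helper, not part of ceilometer, that tallies the failing counter_name values from a saved copy of this journal (the log path is an assumption):

import re
from collections import Counter

COUNTER_RE = re.compile(r"'counter_name': '([^']+)'")
FAIL_MARK = "Could not send notification"

def tally_failed_counters(path):
    """Count counter_name occurrences on failed-notification records."""
    tally = Counter()
    with open(path, errors="replace") as fh:
        for line in fh:
            if FAIL_MARK in line:
                tally.update(COUNTER_RE.findall(line))
    return tally

if __name__ == "__main__":
    for name, count in tally_failed_counters("/var/log/messages").most_common():
        print(f"{count:4d}  {name}")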
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd578d7f4-e80a-4c81-b7ba-c19fe0471a1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.639812', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '6771d6c0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': 'e73e77074f12b37d8eaaea0f0fd54d78a190e48c68249563193b91f3823717ff'}]}, 'timestamp': '2026-02-01 09:43:03.640229', '_unique_id': 'a91bff1d6ece47e2b141a48fd98c7f91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
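The payloads mix counter_type 'cumulative' (monotonic totals such as network.incoming.bytes) with 'delta' variants carrying the change since the previous poll; the volume 0 on network.incoming.bytes.delta above is consistent with an unchanged cumulative reading between polls. A small sketch of that relationship, as a hypothetical helper rather than ceilometer's implementation:

class DeltaFromCumulative:
    """Derive a delta sample from successive cumulative readings."""

    def __init__(self):
        self._last = None

    def update(self, cumulative):
        # No baseline on the first reading, so no delta can be emitted yet.
        delta = None if self._last is None else cumulative - self._last
        self._last = cumulative
        return delta

d = DeltaFromCumulative()
for total in (6874, 6874, 7100):  # e.g. successive network.incoming.bytes polls
    print(d.update(total))        # None, 0, 226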
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48fc1c47-65e7-4341-b815-db74ee74db97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.641704', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '677220da-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': '04e6ae901e4b6623f82c7b6212b52b1a8f64ca67ccc60f93da3b2a4a8b28f883'}]}, 'timestamp': '2026-02-01 09:43:03.642136', '_unique_id': 'd0d3e0d347684900b6844e0fed3d1c1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Payload={'message_id': 'b8054c02-abf9-423b-a5c2-2617e40acf67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:43:03.643613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6772687e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'f0f4ca539e637220a5ac9987b3a40e6c6b465b80eefe08d4f2c5b1b977529f3c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:43:03.643613', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6772740e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.747839317, 'message_signature': 'cf0da7bef03d3b68b3fec0aa3b74fa0cb00617ca5e39239b55ce7c7b444281f5'}]}, 'timestamp': '2026-02-01 09:43:03.644201', '_unique_id': '4266051dad4648a49e127c584fda9d78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '891cabd2-c002-491d-8f7a-f3dd4f5ec5b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.645645', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '6772b810-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': 'a15fdb19fba6c86fa13cd45dfbf3c62a0d58e1f426c6698a8160ddda73f507fe'}]}, 'timestamp': '2026-02-01 09:43:03.645956', '_unique_id': '936ede00484948a481b520baced20032'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.647 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.647 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.647 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.647 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5a0d5e65-bb4a-4798-954d-cd7432ff9506', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.647560', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '6773028e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': 'abef8d3729eb686c40cfe1d44ba895c6ad9f8ccd21d98d4d41bc88741fc121b8'}]}, 'timestamp': '2026-02-01 09:43:03.647864', '_unique_id': 'f4274d268b87473f872c8e04c061cc8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.648 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.649 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '25d28dc0-34c6-468b-bfd2-7209def88b0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:43:03.649264', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '67734528-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11077.804415195, 'message_signature': '6d6c1e119652c36b86060bd6c25cf5169a6bcc8aa3e69f942356f654e9f2f36d'}]}, 'timestamp': '2026-02-01 09:43:03.649569', '_unique_id': 'd1ec4e1d1ce54f8ea149aaa5bd617003'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:43:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:43:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 ERROR oslo_messaging.notify.messaging Feb 1 04:43:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:43:03.650 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:43:04 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)... Feb 1 04:43:04 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:43:04 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:04 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:04 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:04 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:04 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:43:04 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:43:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
Feb 1 04:43:04 localhost podman[289084]: 2026-02-01 09:43:04.731116918 +0000 UTC m=+0.085637531 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:43:04 localhost podman[289084]: 2026-02-01 09:43:04.770377603 +0000 UTC m=+0.124898247 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:43:04 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:43:05 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... Feb 1 04:43:05 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:43:05 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:05 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:05 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... Feb 1 04:43:05 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:43:05 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:43:05 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:43:06 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054727 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:06 localhost ceph-mon[286721]: Saving service mon spec with placement label:mon Feb 1 04:43:06 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:06 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:06 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:06 localhost ceph-mon[286721]: Reconfiguring mon.np0005604213 (monmap changed)... Feb 1 04:43:06 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:43:06 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain Feb 1 04:43:06 localhost nova_compute[274651]: 2026-02-01 09:43:06.944 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:07 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)... 
Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:43:07 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:07 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)... Feb 1 04:43:07 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 1 04:43:07 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain Feb 1 04:43:07 localhost sshd[289103]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:43:09 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:09 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:09 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:09 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:09 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 1 04:43:10 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)... Feb 1 04:43:10 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:43:10 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:11 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:11 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)... 
Feb 1 04:43:11 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain Feb 1 04:43:11 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:11 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)... Feb 1 04:43:11 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:43:11 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:43:11 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:43:11 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:11 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:11 localhost ceph-mon[286721]: Reconfiguring mon.np0005604215 (monmap changed)... Feb 1 04:43:11 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:43:11 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain Feb 1 04:43:11 localhost nova_compute[274651]: 2026-02-01 09:43:11.947 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:43:13 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:13 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:13 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:43:13 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:13 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:13 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:43:13 localhost systemd[1]: tmp-crun.Z8EhnZ.mount: Deactivated successfully. 
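Each cmd={...} : dispatch line above is a mon command the mgr submits as a JSON document. The same payload can be replayed through the librados Python binding; a sketch assuming a reachable cluster, the default conffile path, and client.admin privileges (the entity and caps below are copied verbatim from the log):

    import json
    import rados

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")  # assumed path
    cluster.connect()
    cmd = {
        "prefix": "auth get-or-create",
        "entity": "mgr.np0005604215.uhhqtv",
        # caps are flat key/value pairs, exactly as dispatched above
        "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"],
    }
    ret, outbuf, outs = cluster.mon_command(json.dumps(cmd), b"")
    print(ret, outbuf.decode())  # 0 plus the entity's keyring on success
    cluster.shutdown()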
Feb 1 04:43:13 localhost podman[289123]: 2026-02-01 09:43:13.125624171 +0000 UTC m=+0.069241887 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:43:13 localhost podman[289123]: 2026-02-01 09:43:13.137321181 +0000 UTC m=+0.080938947 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:43:13 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:43:14 localhost ceph-mon[286721]: Reconfiguring mon.np0005604210 (monmap changed)... 
Feb 1 04:43:14 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain Feb 1 04:43:14 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:14 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:14 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:43:14 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:14 localhost podman[289199]: Feb 1 04:43:14 localhost podman[289199]: 2026-02-01 09:43:14.838183691 +0000 UTC m=+0.073442517 container create ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_clarke, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, release=1764794109, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:43:14 localhost systemd[1]: Started libpod-conmon-ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef.scope. Feb 1 04:43:14 localhost systemd[1]: Started libcrun container. 
Feb 1 04:43:14 localhost podman[289199]: 2026-02-01 09:43:14.912203904 +0000 UTC m=+0.147462740 container init ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_clarke, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Feb 1 04:43:14 localhost podman[289199]: 2026-02-01 09:43:14.81506203 +0000 UTC m=+0.050320846 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:43:14 localhost podman[289199]: 2026-02-01 09:43:14.920759857 +0000 UTC m=+0.156018683 container start ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_clarke, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main) Feb 1 04:43:14 localhost podman[289199]: 2026-02-01 09:43:14.921150719 +0000 UTC m=+0.156409575 container attach ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_clarke, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, ceph=True, 
vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7) Feb 1 04:43:14 localhost nostalgic_clarke[289214]: 167 167 Feb 1 04:43:14 localhost systemd[1]: libpod-ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef.scope: Deactivated successfully. Feb 1 04:43:14 localhost podman[289199]: 2026-02-01 09:43:14.925130101 +0000 UTC m=+0.160388957 container died ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_clarke, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, version=7) Feb 1 04:43:15 localhost podman[289219]: 2026-02-01 09:43:15.014960469 +0000 UTC m=+0.077890502 container remove ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_clarke, name=rhceph, version=7, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1764794109, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
distribution-scope=public, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:43:15 localhost systemd[1]: libpod-conmon-ef749a4c3461387385299e939087a1344f23df2ed692f2f723f5fd32b4d20eef.scope: Deactivated successfully. Feb 1 04:43:15 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 e84: 6 total, 6 up, 6 in Feb 1 04:43:15 localhost ceph-mon[286721]: Reconfiguring mon.np0005604211 (monmap changed)... Feb 1 04:43:15 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain Feb 1 04:43:15 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:15 localhost ceph-mon[286721]: Reconfiguring mon.np0005604212 (monmap changed)... Feb 1 04:43:15 localhost ceph-mon[286721]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:43:15 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain Feb 1 04:43:15 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/474945783' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:43:15 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:43:15 localhost ceph-mon[286721]: Activating manager daemon np0005604209.isqrps Feb 1 04:43:15 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 1 04:43:15 localhost ceph-mon[286721]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' Feb 1 04:43:15 localhost systemd[1]: session-67.scope: Deactivated successfully. Feb 1 04:43:15 localhost systemd[1]: session-67.scope: Consumed 7.951s CPU time. Feb 1 04:43:15 localhost systemd-logind[759]: Session 67 logged out. Waiting for processes to exit. Feb 1 04:43:15 localhost systemd-logind[759]: Removed session 67. Feb 1 04:43:15 localhost sshd[289235]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:43:15 localhost systemd-logind[759]: New session 68 of user ceph-admin. Feb 1 04:43:15 localhost systemd[1]: Started Session 68 of User ceph-admin. Feb 1 04:43:15 localhost systemd[1]: session-65.scope: Deactivated successfully. Feb 1 04:43:15 localhost systemd[1]: session-65.scope: Consumed 1.597s CPU time. Feb 1 04:43:15 localhost systemd-logind[759]: Session 65 logged out. Waiting for processes to exit. Feb 1 04:43:15 localhost systemd-logind[759]: Removed session 65. Feb 1 04:43:15 localhost systemd[1]: var-lib-containers-storage-overlay-3a8fdf56e70500d7dd746f8aa39de6c7253528c490701ece5ac9cd44f9513d71-merged.mount: Deactivated successfully. 
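The short-lived container above (create, init, start, attach, output "167 167", died, remove, under a random name) is cephadm probing the rhceph image with a throwaway run. The "167 167" output is consistent with a uid/gid probe of the ceph user inside the image; a hand-run equivalent, hedged as an assumption about what the probe executes:

    import subprocess

    # Assumption: the probe stats /var/lib/ceph inside the image to learn
    # the ceph uid/gid; 167 is the ceph user and group id in rhceph images.
    out = subprocess.run(
        ["podman", "run", "--rm",
         "registry.redhat.io/rhceph/rhceph-7-rhel9:latest",
         "stat", "-c", "%u %g", "/var/lib/ceph"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out.strip())  # "167 167", matching the container output in the log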
Feb 1 04:43:16 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:16 localhost ceph-mon[286721]: Manager daemon np0005604209.isqrps is now available Feb 1 04:43:16 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch Feb 1 04:43:16 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch Feb 1 04:43:16 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch Feb 1 04:43:16 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.297820) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996298014, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11734, "num_deletes": 258, "total_data_size": 22773127, "memory_usage": 23826000, "flush_reason": "Manual Compaction"} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996423098, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 16971154, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11739, "table_properties": {"data_size": 16908727, "index_size": 35505, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25861, "raw_key_size": 272779, "raw_average_key_size": 26, "raw_value_size": 16729028, "raw_average_value_size": 1619, "num_data_blocks": 1377, "num_entries": 10329, "num_filter_entries": 10329, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 1769938956, "file_creation_time": 1769938996, 
"slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 125338 microseconds, and 43839 cpu microseconds. Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.423176) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 16971154 bytes OK Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.423206) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.428456) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.428485) EVENT_LOG_v1 {"time_micros": 1769938996428476, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.428509) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 22695615, prev total WAL file size 22695615, number of live WAL files 2. Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.432354) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. 
'7061786F73003130323932' seq:0, type:0; will stop at (end) Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(16MB) 8(2012B)] Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996432442, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 16973166, "oldest_snapshot_seqno": -1} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10077 keys, 16967743 bytes, temperature: kUnknown Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996517724, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 16967743, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16906021, "index_size": 35445, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25221, "raw_key_size": 267896, "raw_average_key_size": 26, "raw_value_size": 16729683, "raw_average_value_size": 1660, "num_data_blocks": 1376, "num_entries": 10077, "num_filter_entries": 10077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769938996, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.518041) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 16967743 bytes Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.519638) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.7 rd, 198.7 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(16.2, 0.0 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10334, records dropped: 257 output_compression: NoCompression Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.519658) EVENT_LOG_v1 {"time_micros": 1769938996519649, "job": 4, "event": "compaction_finished", "compaction_time_micros": 85401, "compaction_time_cpu_micros": 26116, "output_level": 6, "num_output_files": 1, "total_output_size": 16967743, "num_input_records": 10334, "num_output_records": 10077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996521338, "job": 4, "event": "table_file_deletion", "file_number": 14} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996521375, "job": 4, "event": "table_file_deletion", "file_number": 8} Feb 1 04:43:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:16.432241) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:43:16 localhost systemd[1]: tmp-crun.dAzZsi.mount: Deactivated successfully. 
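The rocksdb lines above interleave human-readable messages with machine-readable events: everything after the EVENT_LOG_v1 marker is plain JSON (time_micros, job, event, and per-event fields). A minimal parser sketch; the sample line is an abbreviated copy of the compaction_finished event above:

    import json

    def parse_rocksdb_event(line: str):
        # Everything after the EVENT_LOG_v1 marker is a JSON document.
        marker = "EVENT_LOG_v1"
        if marker not in line:
            return None
        return json.loads(line.split(marker, 1)[1])

    sample = ('rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996519649, '
              '"job": 4, "event": "compaction_finished", "output_level": 6}')
    event = parse_rocksdb_event(sample)
    print(event["event"], "job", event["job"])  # compaction_finished job 4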
Feb 1 04:43:16 localhost podman[289347]: 2026-02-01 09:43:16.674451106 +0000 UTC m=+0.091790940 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, RELEASE=main, name=rhceph, version=7, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:43:16 localhost podman[289347]: 2026-02-01 09:43:16.795263367 +0000 UTC m=+0.212603191 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:43:16 localhost nova_compute[274651]: 2026-02-01 09:43:16.948 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:16 localhost nova_compute[274651]: 2026-02-01 09:43:16.953 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:17 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:17 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: [01/Feb/2026:09:43:17] 
ENGINE Bus STARTING Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:43:18 localhost systemd[1]: tmp-crun.asFWHW.mount: Deactivated successfully. Feb 1 04:43:18 localhost podman[289570]: 2026-02-01 09:43:18.581087535 +0000 UTC m=+0.115003413 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:43:18 localhost podman[289570]: 2026-02-01 09:43:18.588794882 +0000 UTC m=+0.122710790 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS) Feb 1 04:43:18 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M Feb 1 04:43:19 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M Feb 1 04:43:19 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M Feb 1 04:43:19 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:43:19 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:43:19 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf Feb 1 04:43:19 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf Feb 1 04:43:19 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:43:19 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:43:19 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf Feb 1 04:43:20 localhost 
ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:43:20 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:43:20 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:43:20 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:43:20 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:43:21 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:43:21 localhost podman[290157]: 2026-02-01 09:43:21.519807382 +0000 UTC m=+0.085442775 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
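The "Unable to set osd_memory_target" errors above are plain bounds failures: the autotuner requested 877246668 bytes (the "836.6M" it logs), but the error shows osd_memory_target enforcing a floor of 939524096 bytes (896 MiB). The arithmetic as a worked check:

    # Worked check of the rejection logged above.
    target = 877246668           # bytes requested by the autotuner
    minimum = 939524096          # osd_memory_target minimum from the error
    print(round(target / 2**20, 1))   # 836.6 -> the "836.6M" in the log
    print(minimum / 2**20)            # 896.0 MiB
    assert target < minimum           # hence "below minimum 939524096"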
Feb 1 04:43:21 localhost podman[290157]: 2026-02-01 09:43:21.554725005 +0000 UTC m=+0.120360378 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:43:21 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:43:21 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:43:21 localhost podman[290195]: 2026-02-01 09:43:21.612916722 +0000 UTC m=+0.079731510 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS) Feb 1 04:43:21 localhost podman[290195]: 2026-02-01 09:43:21.681899471 +0000 UTC m=+0.148714339 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Feb 1 04:43:21 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
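Each "Started /usr/bin/podman healthcheck run <ID>" / exec_died / "Deactivated successfully" triple above is a transient systemd unit firing one container health probe. The same probe can be run by hand; a sketch using a container name taken from the log (exit status 0 corresponds to the health_status=healthy events; a failing check exits non-zero):

    import subprocess

    # Run one health probe by hand, as the transient systemd unit does.
    result = subprocess.run(
        ["podman", "healthcheck", "run", "ovn_controller"],
        capture_output=True, text=True,
    )
    # Exit 0 -> healthy (matching health_status=healthy in the log);
    # non-zero -> unhealthy, with details on stdout/stderr.
    print(result.returncode, result.stdout.strip(), result.stderr.strip())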
Feb 1 04:43:21 localhost nova_compute[274651]: 2026-02-01 09:43:21.951 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:21 localhost nova_compute[274651]: 2026-02-01 09:43:21.954 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.157143) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002157198, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 653, "num_deletes": 256, "total_data_size": 2376734, "memory_usage": 2410840, "flush_reason": "Manual Compaction"} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002171092, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1502961, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11744, "largest_seqno": 12392, "table_properties": {"data_size": 1499505, "index_size": 1311, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8247, "raw_average_key_size": 19, "raw_value_size": 1492156, "raw_average_value_size": 3470, "num_data_blocks": 51, "num_entries": 430, "num_filter_entries": 430, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938997, "oldest_key_time": 1769938997, "file_creation_time": 1769939002, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 14016 microseconds, and 5829 cpu microseconds. Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
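[editor's note] The ceph-mon lines above are RocksDB's structured event log: each EVENT_LOG_v1 record embeds a JSON object (flush_started, table_file_creation, ...) in the text stream, so the flush and compaction activity of the mon store can be summarized mechanically. A small sketch that extracts those payloads from a captured log file; the file path is hypothetical:

    import json
    import re

    # Hypothetical path to a captured ceph-mon log containing lines like the
    # "rocksdb: EVENT_LOG_v1 {...}" entries above.
    LOG = "ceph-mon.log"

    EVENT = re.compile(r"EVENT_LOG_v1 (\{.*\})")

    def iter_events(path):
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = EVENT.search(line)
                if m:
                    try:
                        yield json.loads(m.group(1))
                    except json.JSONDecodeError:
                        pass  # record split across lines; skip it

    if __name__ == "__main__":
        for ev in iter_events(LOG):
            if ev.get("event") == "flush_started":
                print(f"job {ev['job']}: {ev['num_entries']} entries, "
                      f"{ev['total_data_size']} bytes in memtable")
            elif ev.get("event") == "table_file_creation":
                print(f"job {ev['job']}: wrote file #{ev['file_number']} "
                      f"({ev['file_size']} bytes)")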
Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.171157) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1502961 bytes OK Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.171188) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.173276) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.173308) EVENT_LOG_v1 {"time_micros": 1769939002173299, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.173335) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2372903, prev total WAL file size 2372903, number of live WAL files 2. Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.174841) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303034' seq:72057594037927935, type:22 .. '6B760031323631' seq:0, type:0; will stop at (end) Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1467KB)], [15(16MB)] Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002174897, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 18470704, "oldest_snapshot_seqno": -1} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 9969 keys, 17441759 bytes, temperature: kUnknown Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002345554, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 17441759, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17381659, "index_size": 34079, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 267347, "raw_average_key_size": 26, "raw_value_size": 17207915, "raw_average_value_size": 1726, "num_data_blocks": 1299, "num_entries": 9969, "num_filter_entries": 9969, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", 
"merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939002, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.345899) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 17441759 bytes Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.349954) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 108.2 rd, 102.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 16.2 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(23.9) write-amplify(11.6) OK, records in: 10507, records dropped: 538 output_compression: NoCompression Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.350011) EVENT_LOG_v1 {"time_micros": 1769939002349971, "job": 6, "event": "compaction_finished", "compaction_time_micros": 170766, "compaction_time_cpu_micros": 57349, "output_level": 6, "num_output_files": 1, "total_output_size": 17441759, "num_input_records": 10507, "num_output_records": 9969, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002350443, "job": 6, "event": "table_file_deletion", "file_number": 17} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002352869, "job": 6, "event": "table_file_deletion", "file_number": 15} Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.174722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.353021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.353025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.353026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction 
starting Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.353027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:43:22 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:43:22.353029) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:43:22 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:43:22 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:22 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:43:23 localhost podman[290365]: Feb 1 04:43:23 localhost podman[290365]: 2026-02-01 09:43:23.069702556 +0000 UTC m=+0.078406640 container create eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_dijkstra, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, release=1764794109, io.buildah.version=1.41.4, GIT_BRANCH=main, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=) Feb 1 04:43:23 localhost systemd[1]: Started libpod-conmon-eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf.scope. Feb 1 04:43:23 localhost systemd[1]: Started libcrun container. 
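[editor's note] The JOB 6 compaction summary logged above reports read-write-amplify(23.9) and write-amplify(11.6) for merging one L0 file into L6. Those factors follow directly from the byte counts in the compaction_started and table_file_creation events: amplification is measured against the newly flushed L0 data. A quick check of the arithmetic using the logged values:

    # Byte counts from the rocksdb JOB 6 events logged above.
    l0_in = 1502961        # new data flushed to L0 (file #17)
    total_in = 18470704    # input_data_size: L0 file plus existing L6 file (#15)
    l6_in = total_in - l0_in
    out = 17441759         # compacted L6 output (file #18)

    # Write amplification: bytes written per byte of new data.
    write_amp = out / l0_in
    # Read-write amplification: all bytes read plus written, per byte of new data.
    rw_amp = (total_in + out) / l0_in

    print(f"input: {l0_in} B new + {l6_in} B existing L6")
    print(f"write-amplify ~ {write_amp:.1f}")    # ~11.6, matching the log
    print(f"read-write-amplify ~ {rw_amp:.1f}")  # ~23.9, matching the log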
Feb 1 04:43:23 localhost podman[290365]: 2026-02-01 09:43:23.038058383 +0000 UTC m=+0.046762487 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:43:23 localhost podman[290365]: 2026-02-01 09:43:23.151011493 +0000 UTC m=+0.159715577 container init eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_dijkstra, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) Feb 1 04:43:23 localhost podman[290365]: 2026-02-01 09:43:23.164362822 +0000 UTC m=+0.173066906 container start eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_dijkstra, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, release=1764794109, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph) Feb 1 04:43:23 localhost podman[290365]: 2026-02-01 09:43:23.164663551 +0000 UTC m=+0.173367685 container attach eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_dijkstra, GIT_BRANCH=main, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, 
vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git) Feb 1 04:43:23 localhost hardcore_dijkstra[290380]: 167 167 Feb 1 04:43:23 localhost systemd[1]: libpod-eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf.scope: Deactivated successfully. Feb 1 04:43:23 localhost podman[290365]: 2026-02-01 09:43:23.169445468 +0000 UTC m=+0.178149602 container died eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_dijkstra, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , release=1764794109, distribution-scope=public, build-date=2025-12-08T17:28:53Z, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Feb 1 04:43:23 localhost podman[290385]: 2026-02-01 09:43:23.262843368 +0000 UTC m=+0.081109043 container remove eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hardcore_dijkstra, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , release=1764794109, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:43:23 localhost systemd[1]: libpod-conmon-eb50800ae10ed0fc8fd954e6ca7344cb387e9d50c6c075bc220f27df4d61ddcf.scope: Deactivated successfully. Feb 1 04:43:23 localhost ceph-mon[286721]: Reconfiguring mon.np0005604212 (monmap changed)... Feb 1 04:43:23 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain Feb 1 04:43:23 localhost ceph-mon[286721]: [01/Feb/2026:09:43:22] ENGINE Error in 'start' listener >#012Traceback (most recent call last):#012 File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish#012 output.append(listener(*args, **kwargs))#012 File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start#012 super(Server, self).start()#012 File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start#012 self.wait()#012 File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait#012 portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)#012 File "/lib/python3.9/site-packages/portend.py", line 162, in occupied#012 raise Timeout("Port {port} not bound on {host}.".format(**locals()))#012portend.Timeout: Port 8765 not bound on 172.18.0.103. Feb 1 04:43:23 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:23 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:23 localhost ceph-mon[286721]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:43:23 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:23 localhost podman[236886]: time="2026-02-01T09:43:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:43:23 localhost podman[236886]: @ - - [01/Feb/2026:09:43:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:43:24 localhost podman[236886]: @ - - [01/Feb/2026:09:43:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18782 "" "Go-http-client/1.1" Feb 1 04:43:24 localhost systemd[1]: var-lib-containers-storage-overlay-7d90fad39c33282f41d7d53dfa5593c951001e8127110d8e3eb8e2cc821ee91f-merged.mount: Deactivated successfully. Feb 1 04:43:25 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 1 04:43:25 localhost systemd[285139]: Activating special unit Exit the Session... Feb 1 04:43:25 localhost systemd[285139]: Stopped target Main User Target. Feb 1 04:43:25 localhost systemd[285139]: Stopped target Basic System. Feb 1 04:43:25 localhost systemd[285139]: Stopped target Paths. Feb 1 04:43:25 localhost systemd[285139]: Stopped target Sockets. Feb 1 04:43:25 localhost systemd[285139]: Stopped target Timers. Feb 1 04:43:25 localhost systemd[285139]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 1 04:43:25 localhost systemd[285139]: Stopped Daily Cleanup of User's Temporary Directories. Feb 1 04:43:25 localhost systemd[285139]: Closed D-Bus User Message Bus Socket. 
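[editor's note] The hardcore_dijkstra entries above trace one short-lived cephadm helper container through its whole lifecycle (create, init, start, attach, a single line of output "167 167", died, remove) in well under a second. Such sequences can be reconstructed after the fact from podman's event stream; a sketch assuming the podman CLI and that events from the relevant window are still retained:

    import json
    import subprocess

    def container_events(name: str, since: str = "1h"):
        """Replay recent podman events for one container as JSON records."""
        proc = subprocess.run(
            ["podman", "events", "--stream=false", "--since", since,
             "--filter", f"container={name}", "--format", "json"],
            capture_output=True, text=True, check=True,
        )
        for line in proc.stdout.splitlines():
            if line.strip():
                yield json.loads(line)

    if __name__ == "__main__":
        # Auto-generated container name from the log above; JSON field names
        # (Time/Status/Name) can differ slightly between podman versions.
        for ev in container_events("hardcore_dijkstra"):
            print(ev.get("Time"), ev.get("Status"), ev.get("Name"))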
Feb 1 04:43:25 localhost systemd[285139]: Stopped Create User's Volatile Files and Directories. Feb 1 04:43:25 localhost systemd[285139]: Removed slice User Application Slice. Feb 1 04:43:25 localhost systemd[285139]: Reached target Shutdown. Feb 1 04:43:25 localhost systemd[285139]: Finished Exit the Session. Feb 1 04:43:25 localhost systemd[285139]: Reached target Exit the Session. Feb 1 04:43:25 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 1 04:43:25 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 1 04:43:25 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 1 04:43:25 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 1 04:43:25 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 1 04:43:25 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 1 04:43:25 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 1 04:43:25 localhost systemd[1]: user-1003.slice: Consumed 2.227s CPU time. Feb 1 04:43:26 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:26 localhost ceph-mon[286721]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' Feb 1 04:43:26 localhost nova_compute[274651]: 2026-02-01 09:43:26.956 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE Error in 'start' listener >#012Traceback (most recent call last):#012 File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish#012 output.append(listener(*args, **kwargs))#012 File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start#012 super(Server, self).start()#012 File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start#012 self.wait()#012 File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait#012 portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)#012 File "/lib/python3.9/site-packages/portend.py", line 162, in occupied#012 raise Timeout("Port {port} not bound on {host}.".format(**locals()))#012portend.Timeout: Port 7150 not bound on 172.18.0.103. 
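[editor's note] The ENGINE tracebacks above are cherrypy, via portend, timing out while waiting for its listeners: the mgr's cephadm module tried to bring up HTTP servers on 172.18.0.103:8765 and :7150 and the ports never became bound, which typically means the address is not (or no longer) configured on the active host or the bind failed. A minimal reproduction of the probe portend keeps retrying, using the address and ports from the log:

    import socket

    def is_bound(host: str, port: int, timeout: float = 1.0) -> bool:
        """True if something is accepting connections on host:port."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        # Address and ports from the cephadm/cherrypy tracebacks above.
        for port in (8765, 7150):
            print(f"172.18.0.103:{port} bound:", is_bound("172.18.0.103", port))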
Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE Shutting down due to error in start listener:#012Traceback (most recent call last):#012 File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 268, in start#012 self.publish('start')#012 File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 248, in publish#012 raise exc#012cherrypy.process.wspbus.ChannelFailures: Timeout('Port 8765 not bound on 172.18.0.103.')#012Timeout('Port 7150 not bound on 172.18.0.103.') Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE Bus STOPPING Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 8765)) already shut down Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 7150)) already shut down Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE Bus STOPPED Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE Bus EXITING Feb 1 04:43:28 localhost ceph-mon[286721]: [01/Feb/2026:09:43:27] ENGINE Bus EXITED Feb 1 04:43:28 localhost ceph-mon[286721]: Failed to run cephadm http server: Timeout('Port 8765 not bound on 172.18.0.103.')#012Timeout('Port 7150 not bound on 172.18.0.103.') Feb 1 04:43:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:43:29 localhost podman[290421]: 2026-02-01 09:43:29.729123487 +0000 UTC m=+0.086574030 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 1 04:43:29 localhost podman[290421]: 2026-02-01 09:43:29.741649171 +0000 UTC m=+0.099099714 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal 
is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, architecture=x86_64, config_id=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:43:29 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:43:31 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:31 localhost openstack_network_exporter[239441]: ERROR 09:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:43:31 localhost openstack_network_exporter[239441]: Feb 1 04:43:31 localhost openstack_network_exporter[239441]: ERROR 09:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:43:31 localhost openstack_network_exporter[239441]: Feb 1 04:43:31 localhost nova_compute[274651]: 2026-02-01 09:43:31.957 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:31 localhost nova_compute[274651]: 2026-02-01 09:43:31.961 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
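[editor's note] The openstack_network_exporter ERROR lines above come from ovs-appctl calls for PMD statistics (dpif-netdev/pmd-perf-show, dpif-netdev/pmd-rxq-show); "please specify an existing datapath" indicates no userspace (netdev/DPDK) datapath exists on this host, which is expected when OVS runs with the kernel datapath only. A hedged sketch of probing for that condition before attempting PMD collection, assuming ovs-appctl is available on the host:

    import subprocess

    def pmd_stats_available() -> bool:
        """Return True if OVS has a userspace datapath exposing PMD stats."""
        proc = subprocess.run(
            ["ovs-appctl", "dpif-netdev/pmd-perf-show"],
            capture_output=True, text=True,
        )
        # ovs-appctl exits non-zero with the "existing datapath" error seen
        # in the log when only the kernel datapath is configured.
        return proc.returncode == 0

    if __name__ == "__main__":
        if not pmd_stats_available():
            print("no netdev datapath; skipping PMD collectors")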
Feb 1 04:43:35 localhost podman[290442]: 2026-02-01 09:43:35.731253621 +0000 UTC m=+0.080096861 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:43:35 localhost podman[290442]: 2026-02-01 09:43:35.743458486 +0000 UTC m=+0.092301696 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:43:35 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:43:36 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:36 localhost nova_compute[274651]: 2026-02-01 09:43:36.961 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:43:40 localhost nova_compute[274651]: 2026-02-01 09:43:40.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:43:40 localhost nova_compute[274651]: 2026-02-01 09:43:40.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:43:40 localhost nova_compute[274651]: 2026-02-01 09:43:40.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:43:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.295 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.296 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:43:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:43:41.706 158365 DEBUG oslo_concurrency.lockutils 
[-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:43:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:43:41.707 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:43:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:43:41.707 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:43:41 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:43:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1821239121' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.761 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.954 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.954 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:43:41 localhost nova_compute[274651]: 2026-02-01 09:43:41.963 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.203 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.205 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11578MB free_disk=0.0GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.205 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.206 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.301 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.301 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.302 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=0GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.338 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:43:42 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:43:42 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2047471593' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.804 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.812 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.893 274655 ERROR nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [req-85e9033d-4dee-45b4-8dd4-d6dcd92b90c5] Failed to update inventory to [{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}}] for resource provider with UUID a04bda90-8ccd-4104-8518-038544ff1327. Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n update conflict: Inventory for 'DISK_GB' on resource provider 'a04bda90-8ccd-4104-8518-038544ff1327' in use. 
", "code": "placement.inventory.inuse", "request_id": "req-85e9033d-4dee-45b4-8dd4-d6dcd92b90c5"}]}#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.894 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.688s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Error updating PCI resources for node np0005604212.localdomain.: nova.exception.PlacementPciException: Failed to gather or report PCI resources to Placement: There was a conflict when trying to complete your request. Feb 1 04:43:42 localhost nova_compute[274651]: Feb 1 04:43:42 localhost nova_compute[274651]: update conflict: Inventory for 'DISK_GB' on resource provider 'a04bda90-8ccd-4104-8518-038544ff1327' in use. Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager Traceback (most recent call last): Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1288, in _update_to_placement Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager self.reportclient.update_from_provider_tree( Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 1484, in update_from_provider_tree Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager self.set_inventory_for_provider( Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager File "/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py", line 987, in set_inventory_for_provider Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager raise exception.InventoryInUse(err['detail']) Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager nova.exception.InventoryInUse: There was a conflict when trying to complete your request. Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager update conflict: Inventory for 'DISK_GB' on resource provider 'a04bda90-8ccd-4104-8518-038544ff1327' in use. 
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager During handling of the above exception, another exception occurred:
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager Traceback (most recent call last):
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 10513, in _update_available_resource_for_node
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     self.rt.update_available_resource(context, nodename,
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 889, in update_available_resource
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     self._update_available_resource(context, resources, startup=startup)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     return f(*args, **kwargs)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 994, in _update_available_resource
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     self._update(context, cn, startup=startup)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1303, in _update
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     self._update_to_placement(context, compute_node, startup)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 49, in wrapped_f
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     return Retrying(*dargs, **dkw).call(f, *args, **kw)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 206, in call
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     return attempt.get(self._wrap_exception)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 247, in get
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     six.reraise(self.value[0], self.value[1], self.value[2])
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     raise value
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/retrying.py", line 200, in call
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager   File "/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py", line 1298, in _update_to_placement
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager     raise exception.PlacementPciException(error=str(e))
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager nova.exception.PlacementPciException: Failed to gather or report PCI resources to Placement: There was a conflict when trying to complete your request.
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager update conflict: Inventory for 'DISK_GB' on resource provider 'a04bda90-8ccd-4104-8518-038544ff1327' in use.
Feb 1 04:43:42 localhost nova_compute[274651]: 2026-02-01 09:43:42.895 274655 ERROR nova.compute.manager
Feb 1 04:43:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:43:43 localhost podman[290506]: 2026-02-01 09:43:43.714253826 +0000 UTC m=+0.073935042 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:43:43 localhost podman[290506]: 2026-02-01 09:43:43.752512731 +0000 UTC m=+0.112193957 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:43:43 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:43:44 localhost nova_compute[274651]: 2026-02-01 09:43:44.898 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:43:44 localhost nova_compute[274651]: 2026-02-01 09:43:44.925 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:43:44 localhost nova_compute[274651]: 2026-02-01 09:43:44.926 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:43:44 localhost nova_compute[274651]: 2026-02-01 09:43:44.927 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.010 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.010 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.011 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.011 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:43:46 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.794 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.808 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.808 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.809 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.809 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:43:46 localhost nova_compute[274651]: 2026-02-01 09:43:46.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:43:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:43:48 localhost podman[290530]: 2026-02-01 09:43:48.707420102 +0000 UTC m=+0.068685470 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 1 04:43:48 localhost podman[290530]: 2026-02-01 09:43:48.7167808 +0000 UTC m=+0.078046158 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 1 04:43:48 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:43:51 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:43:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:43:51 localhost podman[290548]: 2026-02-01 09:43:51.703931073 +0000 UTC m=+0.069583538 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:43:51 localhost podman[290548]: 2026-02-01 09:43:51.712025452 +0000 UTC m=+0.077677957 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:43:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:43:51 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:43:51 localhost podman[290570]: 2026-02-01 09:43:51.818132551 +0000 UTC m=+0.072538959 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 1 04:43:51 localhost podman[290570]: 2026-02-01 09:43:51.87343713 +0000 UTC m=+0.127843558 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 1 04:43:51 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:43:51 localhost nova_compute[274651]: 2026-02-01 09:43:51.968 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:43:53 localhost podman[236886]: time="2026-02-01T09:43:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:43:53 localhost podman[236886]: @ - - [01/Feb/2026:09:43:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:43:54 localhost podman[236886]: @ - - [01/Feb/2026:09:43:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18780 "" "Go-http-client/1.1"
Feb 1 04:43:56 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:43:56 localhost nova_compute[274651]: 2026-02-01 09:43:56.971 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:44:00 localhost podman[290595]: 2026-02-01 09:44:00.701623271 +0000 UTC m=+0.060912472 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, version=9.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1769056855, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc.)
Feb 1 04:44:00 localhost podman[290595]: 2026-02-01 09:44:00.741383212 +0000 UTC m=+0.100672413 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, distribution-scope=public)
Feb 1 04:44:00 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:44:01 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:44:01 localhost openstack_network_exporter[239441]: ERROR 09:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:44:01 localhost openstack_network_exporter[239441]:
Feb 1 04:44:01 localhost openstack_network_exporter[239441]: ERROR 09:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:44:01 localhost openstack_network_exporter[239441]:
Feb 1 04:44:01 localhost nova_compute[274651]: 2026-02-01 09:44:01.973 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:44:04 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e85 e85: 6 total, 6 up, 6 in
Feb 1 04:44:04 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/534665898' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:44:04 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:44:04 localhost ceph-mon[286721]: Activating manager daemon np0005604210.rirrtk
Feb 1 04:44:04 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 1 04:44:04 localhost systemd[1]: session-68.scope: Deactivated successfully.
Feb 1 04:44:04 localhost systemd[1]: session-68.scope: Consumed 6.452s CPU time.
Feb 1 04:44:04 localhost systemd-logind[759]: Session 68 logged out. Waiting for processes to exit.
Feb 1 04:44:04 localhost systemd-logind[759]: Removed session 68.
Feb 1 04:44:04 localhost sshd[290616]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:44:04 localhost systemd-logind[759]: New session 69 of user ceph-admin.
Feb 1 04:44:04 localhost systemd[1]: Started Session 69 of User ceph-admin.
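[Annotation] The openstack_network_exporter errors above ("please specify an existing datapath") come from ovs-appctl's dpif-netdev/pmd-* commands, which only answer when a userspace (netdev/DPDK) datapath exists; on a host like this one, whose ports sit on the kernel ("system") datapath, the calls have no target and the exporter logs the failure on every scrape. A sketch of checking which datapaths the local ovs-vswitchd actually manages, assuming ovs-appctl is installed and the daemon is reachable:

    import subprocess

    # dpctl/dump-dps lists the datapaths ovs-vswitchd knows about; a
    # kernel-datapath host typically prints "system@ovs-system" and no
    # "netdev@..." entry, which is why the PMD queries above fail here.
    out = subprocess.run(["ovs-appctl", "dpctl/dump-dps"],
                         capture_output=True, text=True, check=True)
    print(out.stdout, end="")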
Feb 1 04:44:05 localhost ceph-mon[286721]: Manager daemon np0005604210.rirrtk is now available
Feb 1 04:44:05 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604210.rirrtk/mirror_snapshot_schedule"} : dispatch
Feb 1 04:44:05 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604210.rirrtk/trash_purge_schedule"} : dispatch
Feb 1 04:44:05 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:05 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:05 localhost podman[290729]: 2026-02-01 09:44:05.747586579 +0000 UTC m=+0.090574913 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, name=rhceph, build-date=2025-12-08T17:28:53Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux )
Feb 1 04:44:05 localhost podman[290729]: 2026-02-01 09:44:05.821857451 +0000 UTC m=+0.164845815 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, version=7, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, ceph=True, distribution-scope=public, vcs-type=git)
Feb 1 04:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:44:05 localhost podman[290758]: 2026-02-01 09:44:05.954889666 +0000 UTC m=+0.075520021 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 1 04:44:05 localhost podman[290758]: 2026-02-01 09:44:05.96510549 +0000 UTC m=+0.085735855 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 1 04:44:05 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:44:06 localhost ceph-mon[286721]: mon.np0005604212@4(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:44:06 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:06 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:06 localhost nova_compute[274651]: 2026-02-01 09:44:06.977 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:07 localhost ceph-mon[286721]: [01/Feb/2026:09:44:05] ENGINE Bus STARTING
Feb 1 04:44:07 localhost ceph-mon[286721]: [01/Feb/2026:09:44:05] ENGINE Serving on https://172.18.0.104:7150
Feb 1 04:44:07 localhost ceph-mon[286721]: [01/Feb/2026:09:44:05] ENGINE Client ('172.18.0.104', 39488) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:44:07 localhost ceph-mon[286721]: [01/Feb/2026:09:44:05] ENGINE Serving on http://172.18.0.104:8765
Feb 1 04:44:07 localhost ceph-mon[286721]: [01/Feb/2026:09:44:05] ENGINE Bus STARTED
Feb 1 04:44:07 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 1 04:44:07 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 1 04:44:07 localhost ceph-mon[286721]: Cluster is now healthy
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:07 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:44:08 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:44:08 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 1 04:44:08 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:44:08 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:44:08 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:08 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:08 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:08 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:08 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:09 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:09 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:09 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:09 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:09 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:09 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571d5747600 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Feb 1 04:44:09 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Feb 1 04:44:09 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Feb 1 04:44:09 localhost ceph-mon[286721]: mon.np0005604212@4(peon) e10 my rank is now 3 (was 4)
Feb 1 04:44:09 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 1 04:44:09 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 1 04:44:09 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x5571ded70000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 1 04:44:09 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election
Feb 1 04:44:09 localhost ceph-mon[286721]: paxos.3).electionLogic(44) init, last seen epoch 44
Feb 1 04:44:09 localhost ceph-mon[286721]: mon.np0005604212@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:44:11 localhost nova_compute[274651]: 2026-02-01 09:44:11.981 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:12 localhost ceph-mon[286721]: mon.np0005604212@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:44:12 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Remove daemons mon.np0005604210
Feb 1 04:44:13 localhost ceph-mon[286721]: Safe to remove mon.np0005604210: new quorum should be ['np0005604211', 'np0005604215', 'np0005604213', 'np0005604212'] (from ['np0005604211', 'np0005604215', 'np0005604213', 'np0005604212'])
Feb 1 04:44:13 localhost ceph-mon[286721]: Removing monitor np0005604210 from monmap...
Feb 1 04:44:13 localhost ceph-mon[286721]: Removing daemon mon.np0005604210 from np0005604210.localdomain -- ports []
Feb 1 04:44:13 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:44:13 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election
Feb 1 04:44:13 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:44:13 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:44:13 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3)
Feb 1 04:44:13 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:44:13 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:13 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:44:14 localhost systemd[1]: tmp-crun.GPxF3y.mount: Deactivated successfully.
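[Annotation] The failed autotuning above is plain arithmetic: cephadm divides the memory it may use among the OSDs on each host and arrived at roughly 836.6 MiB per OSD, but osd_memory_target enforces a hard minimum of 939524096 bytes (896 MiB), so the set is rejected and, together with the preceding "config rm" calls, the OSDs fall back to the default target. Checking the numbers from the log:

    MiB = 1024 * 1024

    computed = 877246668   # autotuned value attempted for np0005604212
    minimum = 939524096    # floor quoted in the error message

    print(f"computed target: {computed / MiB:.1f} MiB")  # 836.6 MiB, matching the log
    print(f"enforced floor:  {minimum / MiB:.0f} MiB")   # 896 MiB
    print("rejected:", computed < minimum)               # True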
Feb 1 04:44:14 localhost podman[291633]: 2026-02-01 09:44:14.750045704 +0000 UTC m=+0.108280627 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:44:14 localhost podman[291633]: 2026-02-01 09:44:14.757619226 +0000 UTC m=+0.115854179 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:44:14 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
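[Annotation] Each health_status/exec_died pair like the one above is one healthcheck cycle: a systemd timer starts a transient unit that runs `podman healthcheck run <container>`, podman executes the check configured for that container (here /openstack/healthcheck podman_exporter), records the result, and the unit deactivates. The same check can be driven by hand; a sketch using the container name from the log:

    import subprocess

    # `podman healthcheck run` exits 0 when the configured check passes and
    # non-zero when it fails (or when the container defines no healthcheck).
    result = subprocess.run(["podman", "healthcheck", "run", "podman_exporter"])
    print("healthy" if result.returncode == 0
          else f"unhealthy (rc={result.returncode})")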
Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:44:14 localhost ceph-mon[286721]: Deploying daemon mon.np0005604210 on np0005604210.localdomain Feb 1 04:44:14 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:14 localhost ceph-mon[286721]: Removed label mon from host np0005604210.localdomain Feb 1 04:44:15 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 1 04:44:15 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 1 04:44:15 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:15 localhost ceph-mon[286721]: Removed label mgr from host np0005604210.localdomain Feb 1 04:44:16 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:44:16 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:44:16 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:44:16 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:16 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:16 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:16 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:16 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:16 localhost nova_compute[274651]: 2026-02-01 09:44:16.984 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:44:17 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:44:17 localhost ceph-mon[286721]: Removed label _admin from host 
np0005604210.localdomain Feb 1 04:44:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:44:18 localhost podman[291710]: 2026-02-01 09:44:18.874427565 +0000 UTC m=+0.096501895 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:44:18 localhost podman[291710]: 2026-02-01 09:44:18.884327559 +0000 UTC m=+0.106401879 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:44:18 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:44:18 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:18 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:18 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:44:18 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:18 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:19 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:44:19 localhost ceph-mon[286721]: Removing np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Removing np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:44:19 localhost ceph-mon[286721]: Removing np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:19 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:44:19 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:19 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:20 localhost ceph-mon[286721]: Safe to remove mon.np0005604210: not in monmap (['np0005604211', 'np0005604215', 'np0005604213', 'np0005604212'])
Feb 1 04:44:20 localhost ceph-mon[286721]: Removing monitor np0005604210 from monmap...
Feb 1 04:44:20 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon rm", "name": "np0005604210"} : dispatch
Feb 1 04:44:20 localhost ceph-mon[286721]: Removing daemon mon.np0005604210 from np0005604210.localdomain -- ports []
Feb 1 04:44:21 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:44:21 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:44:21 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:44:21 localhost nova_compute[274651]: 2026-02-01 09:44:21.985 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:21 localhost nova_compute[274651]: 2026-02-01 09:44:21.988 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:44:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
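The records above capture cephadm draining monitor host np0005604210: its mon and mgr labels are removed, the CEPHADM_STRAY_DAEMON/CEPHADM_STRAY_HOST health checks fire while the daemon is briefly unmanaged, ceph.conf and the admin keyring are withdrawn from the departing host and refreshed on the remaining ones, and mon.np0005604210 is removed only after the orchestrator confirms it is no longer in the monmap. A minimal sketch of that same safety check, assuming the standard ceph CLI with an admin keyring (remove_mon and its argument are illustrative names, not cephadm's actual code):

    #!/usr/bin/env python3
    # Sketch of the "Safe to remove mon.X: not in monmap" guard seen above.
    # Assumes the `ceph` CLI is on PATH with admin credentials.
    import json
    import subprocess
    import sys

    def mon_in_monmap(name: str) -> bool:
        # `ceph mon dump --format json` returns {"mons": [{"name": ...}, ...]}
        out = subprocess.run(["ceph", "mon", "dump", "--format", "json"],
                             check=True, capture_output=True, text=True).stdout
        return any(m["name"] == name for m in json.loads(out)["mons"])

    def remove_mon(name: str) -> None:
        if mon_in_monmap(name):
            sys.exit(f"mon.{name} is still in the monmap; drain it first")
        # Equivalent to the audited {"prefix": "mon rm", "name": ...} command.
        subprocess.run(["ceph", "mon", "rm", name], check=True)

    if __name__ == "__main__":
        remove_mon(sys.argv[1])  # e.g. np0005604210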
Feb 1 04:44:22 localhost podman[292012]: 2026-02-01 09:44:22.720046196 +0000 UTC m=+0.079477442 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:44:22 localhost podman[292012]: 2026-02-01 09:44:22.734330305 +0000 UTC m=+0.093761541 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:44:22 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:44:22 localhost podman[292013]: 2026-02-01 09:44:22.778305525 +0000 UTC m=+0.134372038 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:44:22 localhost podman[292013]: 2026-02-01 09:44:22.869380742 +0000 UTC m=+0.225447245 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3) Feb 1 04:44:22 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
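Each podman healthcheck above follows the same four-record cycle: systemd starts a transient <container-id>.service for /usr/bin/podman healthcheck run, podman logs a container health_status event (health_status=healthy here for ovn_metadata_agent, node_exporter, and ovn_controller), the probe exits (container exec_died), and the unit deactivates. A small sketch that tallies these results per container, assuming journal lines in exactly the format shown:

    import re
    from collections import Counter

    # Matches podman events like:
    #   "... container health_status <id> (image=..., name=ovn_controller, health_status=healthy, ...)"
    EVENT = re.compile(r"container health_status \w+ \(.*?name=([\w.-]+),.*?health_status=(\w+)")

    def tally(lines):
        counts = Counter()
        for line in lines:
            m = EVENT.search(line)
            if m:
                counts[m.group(1), m.group(2)] += 1
        return counts

    # e.g. tally(open("/var/log/messages")) ->
    #   Counter({("ovn_metadata_agent", "healthy"): 1, ("node_exporter", "healthy"): 1, ...})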
Feb 1 04:44:23 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:23 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:23 localhost podman[236886]: time="2026-02-01T09:44:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:44:23 localhost podman[236886]: @ - - [01/Feb/2026:09:44:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:44:24 localhost podman[236886]: @ - - [01/Feb/2026:09:44:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18774 "" "Go-http-client/1.1"
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:25 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:44:26 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:44:26 localhost ceph-mon[286721]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 1 04:44:26 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 1 04:44:26 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:26 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:26 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:44:26 localhost nova_compute[274651]: 2026-02-01 09:44:26.987 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:26 localhost nova_compute[274651]: 2026-02-01 09:44:26.989 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:27 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 1 04:44:27 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 1 04:44:27 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:27 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:27 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:27 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:27 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:44:28 localhost ceph-mon[286721]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 1 04:44:28 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:28 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:44:29 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:44:29 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:44:29 localhost ceph-mon[286721]: Added label _no_schedule to host np0005604210.localdomain
Feb 1 04:44:29 localhost ceph-mon[286721]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604210.localdomain
Feb 1 04:44:29 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:44:29 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:44:29 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:29 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:29 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:44:29 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:29 localhost podman[292150]:
Feb 1 04:44:29 localhost podman[292150]: 2026-02-01 09:44:29.880232819 +0000 UTC m=+0.081039980 container create 2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_banach, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, ceph=True)
Feb 1 04:44:29 localhost systemd[1]: Started libpod-conmon-2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3.scope.
Feb 1 04:44:29 localhost systemd[1]: Started libcrun container.
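The auth get-or-create dispatches above show the capability profiles cephadm re-requests for each daemon after the monmap change: crash collectors get 'profile crash' on both mon and mgr, while managers get 'profile mgr' on the mon plus allow * on osd and mds. get-or-create is safe to repeat, since it returns the existing key when the caps match. A sketch of the crash-collector variant, assuming an admin keyring on PATH (the host argument is illustrative):

    import subprocess

    def crash_key(host: str) -> str:
        # Same entity and caps as the audited command above:
        # {"prefix": "auth get-or-create", "entity": "client.crash.<host>",
        #  "caps": ["mon", "profile crash", "mgr", "profile crash"]}
        return subprocess.run(
            ["ceph", "auth", "get-or-create", f"client.crash.{host}",
             "mon", "profile crash", "mgr", "profile crash"],
            check=True, capture_output=True, text=True).stdout

    print(crash_key("np0005604212.localdomain"))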
Feb 1 04:44:29 localhost podman[292150]: 2026-02-01 09:44:29.844550932 +0000 UTC m=+0.045358143 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:44:29 localhost podman[292150]: 2026-02-01 09:44:29.954840449 +0000 UTC m=+0.155647610 container init 2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_banach, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:44:29 localhost systemd[1]: tmp-crun.LzbFyp.mount: Deactivated successfully. Feb 1 04:44:29 localhost podman[292150]: 2026-02-01 09:44:29.969684946 +0000 UTC m=+0.170492137 container start 2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_banach, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, release=1764794109, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 1 04:44:29 localhost podman[292150]: 2026-02-01 09:44:29.970162071 +0000 UTC m=+0.170969262 container attach 2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_banach, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , name=rhceph, distribution-scope=public, GIT_CLEAN=True) Feb 1 04:44:29 localhost blissful_banach[292166]: 167 167 Feb 1 04:44:29 localhost systemd[1]: libpod-2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3.scope: Deactivated successfully. Feb 1 04:44:29 localhost podman[292150]: 2026-02-01 09:44:29.97372161 +0000 UTC m=+0.174528771 container died 2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_banach, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, version=7, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-type=git, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph) Feb 1 04:44:30 localhost podman[292171]: 2026-02-01 09:44:30.048025692 +0000 UTC m=+0.067526496 container remove 2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_banach, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, release=1764794109, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:44:30 localhost systemd[1]: libpod-conmon-2f8ceff2680856a1e13ed8ee19446cb8ed4cc56a59c4b392e78d66674f1255c3.scope: Deactivated successfully. Feb 1 04:44:30 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)... Feb 1 04:44:30 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain Feb 1 04:44:30 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:30 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:30 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 1 04:44:30 localhost podman[292240]: Feb 1 04:44:30 localhost podman[292240]: 2026-02-01 09:44:30.73300984 +0000 UTC m=+0.074408347 container create 954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, RELEASE=main, release=1764794109, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=) Feb 1 04:44:30 localhost systemd[1]: Started libpod-conmon-954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe.scope. Feb 1 04:44:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:44:30 localhost systemd[1]: Started libcrun container. 
Feb 1 04:44:30 localhost podman[292240]: 2026-02-01 09:44:30.78349494 +0000 UTC m=+0.124893447 container init 954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_curran, release=1764794109, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Feb 1 04:44:30 localhost podman[292240]: 2026-02-01 09:44:30.794910991 +0000 UTC m=+0.136309538 container start 954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_curran, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, release=1764794109, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:44:30 localhost podman[292240]: 2026-02-01 09:44:30.795726576 +0000 UTC m=+0.137125103 container attach 954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_curran, GIT_BRANCH=main, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main) Feb 1 04:44:30 localhost funny_curran[292255]: 167 167 Feb 1 04:44:30 localhost systemd[1]: libpod-954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe.scope: Deactivated successfully. Feb 1 04:44:30 localhost podman[292240]: 2026-02-01 09:44:30.797965555 +0000 UTC m=+0.139364062 container died 954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_curran, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, distribution-scope=public, name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:44:30 localhost podman[292240]: 2026-02-01 09:44:30.703281186 +0000 UTC m=+0.044679783 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-17abbecd05dbc420f53dadd44369dc90f21d295fa027487429c2365fc42f827d-merged.mount: Deactivated successfully. 
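funny_curran above, like blissful_banach before it, is one of several throwaway rhceph-7-rhel9 containers in this stretch that live for a fraction of a second, print "167 167", and are immediately removed. That output is consistent with cephadm probing the UID and GID of the ceph user inside the image (167:167 in Red Hat Ceph Storage images) before writing config files and keyrings with matching ownership. A sketch of such a probe, assuming podman is available; the stat target path is an assumption, not necessarily what cephadm itself uses:

    import subprocess

    IMAGE = "registry.redhat.io/rhceph/rhceph-7-rhel9:latest"

    def image_uid_gid(image: str = IMAGE) -> tuple[int, int]:
        # Throwaway container that prints the owner uid/gid of a path the
        # ceph user owns in the image (path chosen for illustration).
        out = subprocess.run(
            ["podman", "run", "--rm", "--entrypoint", "stat", image,
             "-c", "%u %g", "/var/lib/ceph"],
            check=True, capture_output=True, text=True).stdout.split()
        return int(out[0]), int(out[1])

    print(image_uid_gid())  # expected (167, 167) for RHCS images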
Feb 1 04:44:30 localhost podman[292257]: 2026-02-01 09:44:30.889434714 +0000 UTC m=+0.118202141 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7) Feb 1 04:44:30 localhost podman[292257]: 2026-02-01 09:44:30.894754968 +0000 UTC m=+0.123522435 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.7, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9) Feb 1 04:44:30 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:44:30 localhost systemd[1]: var-lib-containers-storage-overlay-6ae4fbe8da94ae38b6b8b6f66c6d3f55a3f8170742c5f343fc160f0822cf5c51-merged.mount: Deactivated successfully. 
Feb 1 04:44:30 localhost podman[292272]: 2026-02-01 09:44:30.946481006 +0000 UTC m=+0.136619727 container remove 954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_curran, GIT_CLEAN=True, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.expose-services=, name=rhceph, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=1764794109) Feb 1 04:44:30 localhost systemd[1]: libpod-conmon-954c174a9c901c448c31cff5d22083a03604e79e4522e360c54f7a4f50b73fbe.scope: Deactivated successfully. Feb 1 04:44:31 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:44:31 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)... 
Feb 1 04:44:31 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:44:31 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:31 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain"} : dispatch
Feb 1 04:44:31 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain"}]': finished
Feb 1 04:44:31 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:31 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk'
Feb 1 04:44:31 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:44:31 localhost openstack_network_exporter[239441]: ERROR 09:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:44:31 localhost openstack_network_exporter[239441]:
Feb 1 04:44:31 localhost openstack_network_exporter[239441]: ERROR 09:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:44:31 localhost openstack_network_exporter[239441]:
Feb 1 04:44:31 localhost podman[292357]:
Feb 1 04:44:31 localhost podman[292357]: 2026-02-01 09:44:31.811025749 +0000 UTC m=+0.079945477 container create 02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_solomon, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph)
Feb 1 04:44:31 localhost systemd[1]: Started libpod-conmon-02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246.scope.
Feb 1 04:44:31 localhost systemd[1]: Started libcrun container.
Feb 1 04:44:31 localhost podman[292357]: 2026-02-01 09:44:31.778272983 +0000 UTC m=+0.047192751 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:44:31 localhost podman[292357]: 2026-02-01 09:44:31.883857106 +0000 UTC m=+0.152776814 container init 02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_solomon, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.openshift.tags=rhceph ceph) Feb 1 04:44:31 localhost systemd[1]: tmp-crun.g6QDCR.mount: Deactivated successfully. Feb 1 04:44:31 localhost podman[292357]: 2026-02-01 09:44:31.90090862 +0000 UTC m=+0.169828288 container start 02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_solomon, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Guillaume Abrioux , name=rhceph, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True) Feb 1 04:44:31 localhost podman[292357]: 2026-02-01 09:44:31.901192678 +0000 UTC m=+0.170112406 container attach 02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_solomon, distribution-scope=public, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, 
name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True) Feb 1 04:44:31 localhost cranky_solomon[292372]: 167 167 Feb 1 04:44:31 localhost systemd[1]: libpod-02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246.scope: Deactivated successfully. Feb 1 04:44:31 localhost podman[292357]: 2026-02-01 09:44:31.904624924 +0000 UTC m=+0.173544692 container died 02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_solomon, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:44:31 localhost nova_compute[274651]: 2026-02-01 09:44:31.992 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:44:32 localhost podman[292377]: 2026-02-01 09:44:32.004518382 +0000 UTC m=+0.090156240 container remove 02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_solomon, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume 
Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, ceph=True, CEPH_POINT_RELEASE=, RELEASE=main) Feb 1 04:44:32 localhost systemd[1]: libpod-conmon-02dfb1aa9d8f07e7c97f192669e2f70919d1a41b4bd3d589ee109726ecdfd246.scope: Deactivated successfully. Feb 1 04:44:32 localhost ceph-mon[286721]: Removed host np0005604210.localdomain Feb 1 04:44:32 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)... Feb 1 04:44:32 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain Feb 1 04:44:32 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:32 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' Feb 1 04:44:32 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)... Feb 1 04:44:32 localhost ceph-mon[286721]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:44:32 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain Feb 1 04:44:32 localhost podman[292453]: Feb 1 04:44:32 localhost podman[292453]: 2026-02-01 09:44:32.839626921 +0000 UTC m=+0.076144570 container create ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_sammet, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, release=1764794109, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git) Feb 1 04:44:32 localhost systemd[1]: Started libpod-conmon-ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee.scope. Feb 1 04:44:32 localhost systemd[1]: var-lib-containers-storage-overlay-09423bab7e8e7bb5e93a55ef5b80a7fe50d4196f3da422fbdc16023f172095cf-merged.mount: Deactivated successfully. Feb 1 04:44:32 localhost systemd[1]: Started libcrun container. 
Feb 1 04:44:32 localhost podman[292453]: 2026-02-01 09:44:32.807028289 +0000 UTC m=+0.043545948 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:44:32 localhost podman[292453]: 2026-02-01 09:44:32.909554618 +0000 UTC m=+0.146072257 container init ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_sammet, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:44:32 localhost podman[292453]: 2026-02-01 09:44:32.921261818 +0000 UTC m=+0.157779477 container start ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_sammet, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., release=1764794109, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, version=7, build-date=2025-12-08T17:28:53Z, distribution-scope=public, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 1 04:44:32 localhost podman[292453]: 2026-02-01 09:44:32.921642259 +0000 UTC m=+0.158159938 container attach ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_sammet, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, 
org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=1764794109) Feb 1 04:44:32 localhost zealous_sammet[292468]: 167 167 Feb 1 04:44:32 localhost systemd[1]: libpod-ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee.scope: Deactivated successfully. Feb 1 04:44:32 localhost podman[292453]: 2026-02-01 09:44:32.926237621 +0000 UTC m=+0.162755320 container died ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_sammet, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True) Feb 1 04:44:33 localhost podman[292473]: 2026-02-01 09:44:33.024010784 +0000 UTC m=+0.088917862 container remove ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_sammet, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph 
Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main) Feb 1 04:44:33 localhost systemd[1]: libpod-conmon-ee39513534aceaf7d31359b49b170b4fcc301fb2766da34a6d199ee6e0f877ee.scope: Deactivated successfully. Feb 1 04:44:33 localhost systemd[1]: var-lib-containers-storage-overlay-ce8ed5ed31812464ee822ee771da95d4759b81e7172aa1246db3472ae5ff39a7-merged.mount: Deactivated successfully. Feb 1 04:44:36 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:44:36 localhost podman[292488]: 2026-02-01 09:44:36.7346218 +0000 UTC m=+0.091843671 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:44:36 localhost podman[292488]: 2026-02-01 09:44:36.773615477 +0000 UTC m=+0.130837448 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 
'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:44:36 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:44:36 localhost nova_compute[274651]: 2026-02-01 09:44:36.995 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:44:36 localhost nova_compute[274651]: 2026-02-01 09:44:36.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:44:36 localhost nova_compute[274651]: 2026-02-01 09:44:36.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:44:36 localhost nova_compute[274651]: 2026-02-01 09:44:36.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:44:37 localhost nova_compute[274651]: 2026-02-01 09:44:37.008 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:44:37 localhost nova_compute[274651]: 2026-02-01 09:44:37.009 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:44:40 localhost nova_compute[274651]: 2026-02-01 09:44:40.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:44:40 localhost nova_compute[274651]: 2026-02-01 09:44:40.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 04:44:40 localhost sshd[292508]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:44:41 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:44:41 localhost nova_compute[274651]: 2026-02-01 09:44:41.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:44:41.708 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:44:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:44:41.708 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:44:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:44:41.709 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:44:42 localhost nova_compute[274651]: 2026-02-01 09:44:42.009 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:42 localhost nova_compute[274651]: 2026-02-01 09:44:42.011 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:42 localhost nova_compute[274651]: 2026-02-01 09:44:42.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:42 localhost nova_compute[274651]: 2026-02-01 09:44:42.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:42 localhost nova_compute[274651]: 2026-02-01 09:44:42.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.289 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.290 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.290 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.290 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.291 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:44:43 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:44:43 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3430034347' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.743 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.816 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.817 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:44:43 localhost nova_compute[274651]: 2026-02-01 09:44:43.999 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.000 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11572MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.001 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.001 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.124 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.125 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.125 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.257 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.711 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.718 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.742 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.745 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:44:44 localhost nova_compute[274651]: 2026-02-01 09:44:44.745 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:44:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
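Two numbers in the audit above are derived rather than read directly: free_disk comes from the `ceph df --format=json` call, and schedulable capacity comes from the inventory dict (total, reserved, allocation_ratio). A rough sketch of both calculations (the `ceph df` field name is an assumption based on its usual JSON output; inventory values are copied from the log):

    import json
    import subprocess

    # Free cluster space, as derived from the same `ceph df` call logged above.
    df = json.loads(subprocess.check_output(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"]))
    free_gb = df["stats"]["total_avail_bytes"] / 1024 ** 3  # field name assumed

    # Schedulable capacity per resource class: (total - reserved) * allocation_ratio.
    inv = {"VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 16.0},
           "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
           "DISK_GB": {"total": 41, "reserved": 1, "allocation_ratio": 1.0}}
    for rc, v in inv.items():
        print(rc, (v["total"] - v["reserved"]) * v["allocation_ratio"])
    # VCPU 128.0, MEMORY_MB 15226.0, DISK_GB 40.0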
Feb 1 04:44:45 localhost podman[292554]: 2026-02-01 09:44:45.734792963 +0000 UTC m=+0.083739643 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:44:45 localhost podman[292554]: 2026-02-01 09:44:45.742662645 +0000 UTC m=+0.091609315 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:44:45 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
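The healthcheck above only confirms the exporter process answers; the metrics themselves are served on the host network per the 'ports': ['9882:9882'] entry in its config. A quick scrape sketch (endpoint from that config; the metric-name prefix is an assumption about the exporter's series naming):

    import urllib.request

    with urllib.request.urlopen("http://127.0.0.1:9882/metrics", timeout=5) as resp:
        for line in resp.read().decode().splitlines():
            # prometheus-podman-exporter series are assumed to start with "podman_"
            if line.startswith("podman_container_"):
                print(line)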
Feb 1 04:44:46 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.747 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.747 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.748 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.940 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.940 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.941 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:44:46 localhost nova_compute[274651]: 2026-02-01 09:44:46.941 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.013 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.015 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.015 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.016 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.016 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.019 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.326 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.360 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.361 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:44:47 localhost nova_compute[274651]: 2026-02-01 09:44:47.362 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:48 localhost nova_compute[274651]: 2026-02-01 09:44:48.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:44:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
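The network_info blob logged above is a list of VIFs, each with nested subnets and per-IP floating_ips. A small walk over that structure, pulling out the fixed and floating addresses (the dict literal is abbreviated to the fields used; shape and values are taken from the log entry):

    network_info = [{
        "id": "09cac1be-46e2-4a31-8306-e6f4f0401b19",
        "address": "fa:16:3e:86:11:63",
        "network": {"subnets": [{
            "cidr": "192.168.0.0/24",
            "ips": [{"address": "192.168.0.12", "type": "fixed",
                     "floating_ips": [{"address": "192.168.122.20",
                                       "type": "floating"}]}],
        }]},
        "devname": "tap09cac1be-46",
    }]
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                floats = [f["address"] for f in ip.get("floating_ips", [])]
                print(vif["devname"], ip["address"], "->", floats)
    # tap09cac1be-46 192.168.0.12 -> ['192.168.122.20']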
Feb 1 04:44:49 localhost podman[292578]: 2026-02-01 09:44:49.729408349 +0000 UTC m=+0.092364318 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:44:49 localhost podman[292578]: 2026-02-01 09:44:49.76457826 +0000 UTC m=+0.127534199 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 04:44:49 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:44:51 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:44:52 localhost nova_compute[274651]: 2026-02-01 09:44:52.018 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:44:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:44:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:44:53 localhost systemd[1]: tmp-crun.xqKBTt.mount: Deactivated successfully. Feb 1 04:44:53 localhost podman[292597]: 2026-02-01 09:44:53.732731225 +0000 UTC m=+0.083923939 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true) Feb 1 04:44:53 localhost podman[292596]: 2026-02-01 09:44:53.703680183 +0000 UTC m=+0.058793837 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', 
'--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:44:53 localhost podman[292596]: 2026-02-01 09:44:53.787457546 +0000 UTC m=+0.142571190 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:44:53 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
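The node_exporter flags above restrict its systemd collector with --collector.systemd.unit-include; the same regex can be reused client-side to pick the matching series out of a scrape. A sketch (port 9100 from the 'ports' entry above; the metric name is the systemd collector's usual node_systemd_unit_state, assumed here):

    import re
    import urllib.request

    unit_include = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")
    with urllib.request.urlopen("http://127.0.0.1:9100/metrics", timeout=5) as resp:
        for line in resp.read().decode().splitlines():
            if line.startswith("node_systemd_unit_state") and unit_include.search(line):
                print(line)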
Feb 1 04:44:53 localhost podman[292597]: 2026-02-01 09:44:53.800702542 +0000 UTC m=+0.151895236 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Feb 1 04:44:53 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:44:53 localhost podman[236886]: time="2026-02-01T09:44:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:44:53 localhost podman[236886]: @ - - [01/Feb/2026:09:44:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:44:54 localhost podman[236886]: @ - - [01/Feb/2026:09:44:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18783 "" "Go-http-client/1.1" Feb 1 04:44:56 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:44:57 localhost nova_compute[274651]: 2026-02-01 09:44:57.021 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:45:01 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:45:01 localhost openstack_network_exporter[239441]: ERROR 09:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:45:01 localhost openstack_network_exporter[239441]: Feb 1 04:45:01 localhost openstack_network_exporter[239441]: ERROR 09:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:45:01 localhost openstack_network_exporter[239441]: Feb 1 04:45:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
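The GET /v4.9.3/libpod/containers/json requests logged above come from a client on the podman API socket (the podman_exporter container mounts /run/podman/podman.sock). The same query can be issued directly over the unix socket with only the standard library; a minimal sketch, assuming the socket path shown in the container config:

    import http.client
    import json
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTPConnection that dials a unix socket instead of TCP."""
        def __init__(self, path):
            super().__init__("localhost")
            self._path = path
        def connect(self):
            sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            sock.connect(self._path)
            self.sock = sock

    conn = UnixHTTPConnection("/run/podman/podman.sock")
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
    containers = json.load(conn.getresponse())
    print(len(containers), "containers")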
Feb 1 04:45:01 localhost podman[292645]: 2026-02-01 09:45:01.721566089 +0000 UTC m=+0.082344230 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, version=9.7, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:45:01 localhost podman[292645]: 2026-02-01 09:45:01.737452158 +0000 UTC m=+0.098230299 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1769056855, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 1 04:45:01 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:45:02 localhost nova_compute[274651]: 2026-02-01 09:45:02.024 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:02 localhost nova_compute[274651]: 2026-02-01 09:45:02.025 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:02 localhost nova_compute[274651]: 2026-02-01 09:45:02.025 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:45:02 localhost nova_compute[274651]: 2026-02-01 09:45:02.025 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:02 localhost nova_compute[274651]: 2026-02-01 09:45:02.056 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:45:02 localhost nova_compute[274651]: 2026-02-01 09:45:02.057 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.527 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.528 12 INFO ceilometer.polling.manager [-] Polling pollster 
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.527 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.528 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.532 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b189bad-b7c3-4651-8579-6a2fb3a3bebf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.528520', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aee802d6-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': '68232ec885a2f694aebf7db59eb21dc38bd653786b6b6856c5e7dc1eb55e819d'}]}, 'timestamp': '2026-02-01 09:45:03.533048', '_unique_id': 'e3c687b9d13047f594e3b0aa48d61959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return
fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.534 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.535 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.566 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.567 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
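[Editor's note] The paired tracebacks that repeat from here on show the whole failure chain in one place: ceilometer hands each batch of samples to oslo.messaging's notifier, the rabbit driver pulls a fresh connection from its pool (impl_rabbit -> kombu -> py-amqp), and the socket-level ConnectionRefusedError is re-raised by kombu as OperationalError because nothing is accepting connections on the broker endpoint; the samples for each affected cycle are dropped. A quick reachability check with the same library — the URL below is a placeholder assumption; the transport_url in the agent's configuration is what actually matters:

    from kombu import Connection

    def broker_reachable(url="amqp://guest:guest@127.0.0.1:5672//"):
        # Placeholder URL; substitute the transport_url from the agent config.
        try:
            with Connection(url, connect_timeout=2) as conn:
                conn.ensure_connection(max_retries=1)
            return True
        except Exception as exc:  # surfaces as kombu.exceptions.OperationalError
            print(f"broker unreachable: {exc}")
            return False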
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '079e8b15-d9e8-4704-b6c7-1bce5932e6bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.535705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aeed3a1c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': 'c8c93ed09b7d0c701cdaa19dffa3c2b715df4ecbabe486ed3e9b44da00c03b71'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.535705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aeed4566-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '0a67b8d44e6308bdd89007bc23387efb718a3d1a8722b8527b72a46ac991b4dd'}]}, 'timestamp': '2026-02-01 09:45:03.567301', '_unique_id': '807aabd0de8a4cfcbdd70ab4b43c71f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01
09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.568 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4c7535a9-096a-416d-b46c-a9737f5af96a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.568759', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aeed86fc-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': 'cef9179df9b08988977b9b7c1195c7577f6f669c6b9c4857631c75f2863f8ba8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.568759', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aeed8f8a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '76ec5aaeac0f8f672c2134b3e940d45282569f635241633c91a00f4708ceb2a5'}]}, 'timestamp': '2026-02-01 09:45:03.569189', '_unique_id': 'c28fa4e039694ebea3d7d8746b30934c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.569 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9827eb09-740a-4a52-a3cb-c4da281f6d08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.570231', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aeedc0e0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': 'e1bee1e103d2fb47a85d19e62203bb29a1a1dfa037637dd30dbf6d0df4cc8b7c'}]}, 'timestamp': '2026-02-01 09:45:03.570497', '_unique_id': 'c66d965f652841b2b5633eaa15c229c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.570 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.571 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.571 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.587 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
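[Editor's note] Note the contrast with the meters above: memory.usage is published as a gauge in MB (51.63671875 of the instance's 512 MB), whereas the disk and network meters are cumulative counters that only become meaningful as rates between successive polls; pollsters with nothing to report are skipped outright ("Skip pollster disk.device.iops, no new resources found this cycle"). A small illustrative sketch of that distinction, using only fields present in the payloads logged here:

    def to_metric(prev, curr):
        # 'gauge' samples are usable as-is; 'cumulative' ones need a rate
        # computed from two samples of the same resource_id.
        if curr["counter_type"] == "gauge":
            return curr["counter_volume"]
        dt = curr["monotonic_time"] - prev["monotonic_time"]
        return (curr["counter_volume"] - prev["counter_volume"]) / dt if dt > 0 else 0.0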
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ce9548d-eeaf-4b58-9263-8c48fe4be539', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:45:03.571635', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'aef0530a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.806408582, 'message_signature': '455e264ff47306e94903aad6fe2a02fccc7a944e8ca33503e5a50193df4d3cdf'}]}, 'timestamp': '2026-02-01 09:45:03.587315', '_unique_id': '3de2e71b0d7e477c924944d0c7fa39e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01
09:45:03.588 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.588 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5529f084-22f2-4427-a528-5fa35b12d64e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.588614', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef08e92-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': '6528e69ae6c59a85a6649c37195fb1512d0b4e0b037b315bf63150bd3554bc25'}]}, 'timestamp': '2026-02-01 09:45:03.588836', '_unique_id': '4f9522a9425f4a9ead342b0ec1665264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 
111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2dc49a9-8706-40e8-b3f6-0b0879233657', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.589931', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef0c362-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': 'bf1594de1a8ba006fec6656592d71f959e053a345427f476884e5a683602e607'}]}, 'timestamp': '2026-02-01 09:45:03.590223', '_unique_id': 'ff92206379b94538a6cc39355ea632b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging 
self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.600 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.601 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f1c9f74b-ebdf-4509-8feb-738192f5f6c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.591486', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef27392-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.810891699, 'message_signature': '48948c46de4904dd7d3da6697f58d0c754514af10021381e9478950e404c1837'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.591486', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef28012-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.810891699, 'message_signature': '6f1adff9097aaff7ec6a5b0eba3b05d775fc451ccf64367d865cb230c388807f'}]}, 'timestamp': '2026-02-01 09:45:03.601609', '_unique_id': '578f8ee9d45c46548ac902b3872f38e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fecf5c75-a234-4b5d-a777-e018245715cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.603386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef2d1c0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': 'e3242dfa5ce98994db8280704061218ef6da73ffee69bb14c75042d8b754b8db'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.603386', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef2dd00-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '672caa10016e90ddcd36cd0bfcbfe0531ac74659db755ed96b2ae04f80fa1731'}]}, 'timestamp': '2026-02-01 09:45:03.603981', '_unique_id': 'bc952d0ce7e3409abdbd0a3e64b5ac47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.604 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.605 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
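The innermost frames of the traceback above (amqp/transport.py _connect) fail in a plain socket connect, so a bare TCP probe of the broker port reproduces the same errno 111 when nothing is listening. A minimal sketch; the host and port are assumptions, since the log does not show the agent's transport URL (5672 is only the conventional AMQP port):

    import errno
    import socket

    BROKER_HOST = "localhost"  # assumed; not visible in the log
    BROKER_PORT = 5672         # conventional AMQP port; also an assumption

    try:
        with socket.create_connection((BROKER_HOST, BROKER_PORT), timeout=5):
            print("TCP connect succeeded - broker port is listening")
    except OSError as exc:
        if exc.errno == errno.ECONNREFUSED:
            # Same [Errno 111] the agent hits in amqp/transport.py _connect()
            print("connection refused - no listener on the broker port")
        else:
            raise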
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb4ab011-c2ed-4c05-a356-a5c9d8e5ef09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.605565', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef326de-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': 'e123e4a47220d77620de0afc7344cc3b7bb3831f74a47106900f6c76e0942bfc'}]}, 'timestamp': '2026-02-01 09:45:03.605896', '_unique_id': '7f70e4cadd9249948602881ab0e3c1b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.607 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
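The outer traceback shows oslo.messaging's rabbit driver calling kombu's Connection.ensure_connection(), which re-raises the socket error as kombu.exceptions.OperationalError (the _reraise_as_library_errors frame above). A sketch reproducing that wrapping with kombu directly; the URL is an assumption, not the agent's real transport URL:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Assumed broker URL; the real address comes from the agent's config.
    url = "amqp://guest:guest@localhost:5672//"

    try:
        # The same call impl_rabbit.py makes in the traceback above; with no
        # broker listening, the ConnectionRefusedError is re-raised as
        # OperationalError.
        Connection(url).ensure_connection(max_retries=1)
    except OperationalError as exc:
        print("broker unreachable:", exc)  # [Errno 111] Connection refused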
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13b962f9-70fa-4570-be87-6d33183594bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.607381', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef36da6-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': '366b707aef833d53c9ae53f3a5b3a369cc55657b6c3836ec0308ddab9c5ca7ba'}]}, 'timestamp': '2026-02-01 09:45:03.607705', '_unique_id': '0e9934e1a9a647288026b31c270b242d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
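Each dropped notification is still logged in full after "Payload=", and the dict is a Python literal, so the samples can be salvaged from the journal after the fact. A sketch, assuming the exact "Payload={...}: kombu.exceptions..." layout of the ERROR records above; the regex and function names are ad hoc:

    import ast
    import re

    # Matches the dict literal between "Payload=" and the trailing
    # ": kombu.exceptions..." suffix in the ERROR records above.
    PAYLOAD_RE = re.compile(r"Payload=(\{.*\}): kombu\.exceptions")

    def extract_samples(log_line):
        m = PAYLOAD_RE.search(log_line)
        if not m:
            return []
        notification = ast.literal_eval(m.group(1))  # literals only, no eval
        return notification.get("payload", {}).get("samples", [])

    # for s in extract_samples(line):
    #     print(s["counter_name"], s["counter_type"], s["counter_volume"])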
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39e9eeb5-f669-4b58-b98e-dac206ffdda6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.609251', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef3b798-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': '270ac15a87c03d8b199794ea79f8149e6964343465b6f3d716f349dcc2c90764'}]}, 'timestamp': '2026-02-01 09:45:03.609600', '_unique_id': 'd64cde32b18947c5b4177c7ed2722700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.610 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 12420000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
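The records being dropped are ordinary oslo.messaging notifications: publisher_id 'ceilometer.polling', priority SAMPLE, sent to the "notifications" topic. For reference, a hedged sketch of emitting one the same way; the transport URL, driver choice, and empty payload are placeholders, not values taken from this deployment:

    from oslo_config import cfg
    import oslo_messaging

    # Assumed URL; the agent reads its real transport_url from its config.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@localhost:5672/")

    notifier = oslo_messaging.Notifier(
        transport,
        publisher_id="ceilometer.polling",  # as seen in the payloads above
        driver="messagingv2",
        topics=["notifications"])

    # sample() maps to the SAMPLE priority used for telemetry.polling events.
    notifier.sample({}, "telemetry.polling", {"samples": []})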
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a5a750e-845a-4ead-82a9-d812bf0c39b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12420000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:45:03.611052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'aef3fca8-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.806408582, 'message_signature': 'e5c56222a7287a60130632be710d6e82ce0117cd85e4006e888ae1ca562c680c'}]}, 'timestamp': '2026-02-01 09:45:03.611547', '_unique_id': '1daffcf12594446ba704ee708be13de4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.612 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efba1bec-dbe5-4b78-be3d-199f43be166f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.613061', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef44b40-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': '07d739d48333bc45c0aba2dda120b2403e3357a077056a8ef38de5cd23078fa3'}]}, 'timestamp': '2026-02-01 09:45:03.613378', '_unique_id': '8f80dd5da73647f0bcbaed352b502a79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
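The cpu sample above is cumulative guest CPU time in nanoseconds (counter_volume 12420000000, counter_unit 'ns'), so a utilisation figure needs two polls. A sketch of the arithmetic; the second reading and the 300 s interval are hypothetical values, not from this log:

    # Cumulative cpu time (ns) -> average utilisation between two polls.
    def cpu_util_percent(cpu_ns_prev, cpu_ns_now, wall_seconds, vcpus):
        used_ns = cpu_ns_now - cpu_ns_prev
        return 100.0 * used_ns / (wall_seconds * vcpus * 1e9)

    # First reading from the sample above; the second one is made up:
    # 60 ms of CPU over 300 s on a 1-vCPU instance is 0.02 % utilisation.
    print(cpu_util_percent(12_420_000_000, 12_480_000_000, 300, 1))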
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.615 12 DEBUG ceilometer.compute.pollsters [-]
08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2ade928-59a5-4fd1-8eac-b98f0219091f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.614854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef4949c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.810891699, 'message_signature': '177d094a83c1590edfbc5a99ee5c46b077ffba723c6743640a69c726c1225bf6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.614854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef4a464-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.810891699, 'message_signature': '7281e0231ef0496287fe7f0b09567e7fdcb87b45e1d94be4d058decd49622186'}]}, 'timestamp': '2026-02-01 09:45:03.615685', '_unique_id': '0eeac217cf114cf2be281133eb6dfaff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.616 12 ERROR 
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc348c81-41c2-42c5-9c26-0be06deaa48f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.617423', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef4f5cc-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': 'e9d00cb0b2e0030e8969335fb75cdd695e87388f039352b8be92a558ded7113c'}]}, 'timestamp': '2026-02-01 09:45:03.617746', '_unique_id': '5800df0a6f1d4ec0b265d18cfe37c47d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.618 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost 
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.619 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26f7514b-e659-4907-91d4-1a6a7b5d34fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:45:03.619245', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'aef53d70-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.747955436, 'message_signature': '80dd096c97d54e2c85dfcfabad75baaa4d1fdc9093ecaed0b74433234d537a88'}]}, 'timestamp': '2026-02-01 09:45:03.619578', '_unique_id': '914619ae16ed486793805e82b5f18877'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost 
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a2cc020-c608-4e2b-af9c-740cf9fe86f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.621026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef58244-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.810891699, 'message_signature': '5a4a15b528c35ef887145a3bcb987e5e6d94bf869206bcef25e9e1e208c0b3ec'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.621026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef58dfc-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.810891699, 'message_signature': 'a2987c78b9bbd903b8224afaddef46def0cc9be04463078e1cbb3ddb9b985cf5'}]}, 'timestamp': '2026-02-01 09:45:03.621618', '_unique_id': '3cb58d53af1c45f781b8005bfceeb5b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.622 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.623 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.623 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.623 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f4bcd0c1-f1b8-47f3-b6da-cdf13a3e1c1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.623153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef5d690-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '2e8f29f128943219b5a0164a9fdee0c19867a8caa42d362f608e396aea552305'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.623153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef5e61c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '569dd6ca1313e255995d5050535deaef8e1c3a3b1b5eb4f8510c212d5b0d55e7'}]}, 'timestamp': '2026-02-01 09:45:03.623940', '_unique_id': 'c061a7e71888408e8465254eacb7a220'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.625 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dc784d37-9d99-4e93-a696-253ce344cf51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.626058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef64b34-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '1af53c64a185c82154699b8375ca4f321b54b5d9b3c7565ce0603399d18b1ff0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.626058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef65bec-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': 'ab8d07057488ff15ffb61b3761d5c631601cbf713ec7579d8fa324c6662ed586'}]}, 'timestamp': '2026-02-01 09:45:03.626958', '_unique_id': '9d0cde734ddd4cf5aff05e752687f8c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.628 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.628 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27b8af9e-dcee-48f6-a32f-ee3298e37f7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:45:03.628778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'aef6b164-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '59421e69cd0d739f799e76c1ed3f4e47c68e12031019888ddf5a9eeed3575a23'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:45:03.628778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'aef6bd8a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11197.755145526, 'message_signature': '9d75afbfd891f61e9d29b735296d6876de2e8291f98d033a6e212e91b2c88516'}]}, 'timestamp': '2026-02-01 09:45:03.629433', '_unique_id': '1ed040de1a0b43bbb08c6b7b26ec009f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the 
following exception: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:45:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 04:45:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:45:03.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:45:06 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:45:07 localhost nova_compute[274651]: 2026-02-01 09:45:07.058 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Feb 1 04:45:07 localhost nova_compute[274651]: 2026-02-01 09:45:07.060 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Feb 1 04:45:07 localhost nova_compute[274651]: 2026-02-01 09:45:07.060 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640:
idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 Feb 1 04:45:07 localhost nova_compute[274651]: 2026-02-01 09:45:07.060 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 Feb 1 04:45:07 localhost nova_compute[274651]: 2026-02-01 09:45:07.082 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:45:07 localhost nova_compute[274651]: 2026-02-01 09:45:07.083 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 Feb 1 04:45:07 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e86 e86: 6 total, 6 up, 6 in Feb 1 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr handle_mgr_map Activating! Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr handle_mgr_map I am now activating Feb 1 04:45:07 localhost ceph-mon[286721]: Activating manager daemon np0005604212.oynhpm Feb 1 04:45:07 localhost ceph-mon[286721]: Manager daemon np0005604210.rirrtk is unresponsive, replacing it with standby daemon np0005604212.oynhpm Feb 1 04:45:07 localhost ceph-mgr[278591]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: balancer Feb 1 04:45:07 localhost ceph-mgr[278591]: [balancer INFO root] Starting Feb 1 04:45:07 localhost ceph-mgr[278591]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: [balancer INFO root] Optimize plan auto_2026-02-01_09:45:07 Feb 1 04:45:07 localhost ceph-mgr[278591]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Feb 1 04:45:07 localhost ceph-mgr[278591]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later Feb 1 04:45:07 localhost systemd[1]: tmp-crun.aQaXAB.mount: Deactivated successfully.
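Every "Could not send notification" traceback above bottoms out in the same socket-level failure: amqp's transport calls sock.connect(), the connection is refused with errno 111, and kombu re-raises it as kombu.exceptions.OperationalError inside _reraise_as_library_errors, which oslo.messaging's impl_rabbit then surfaces from ensure_connection. The polling loop keeps running, so each sample batch is built and then dropped. A minimal sketch of that call chain against an unreachable broker; the broker URL below is a placeholder, since the deployment's actual transport_url is not printed in this log:

    # Sketch: reproduce the failure mode from the tracebacks above.
    # The broker URL is a placeholder, not this deployment's transport_url.
    import kombu

    conn = kombu.Connection("amqp://guest:guest@127.0.0.1:5672//")
    try:
        # Same path as the traceback: ensure_connection -> retry_over_time
        # -> establish_connection -> sock.connect -> [Errno 111],
        # re-raised as kombu.exceptions.OperationalError.
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        print("broker unreachable:", exc)

With max_retries exhausted, ensure_connection raises rather than blocking forever, which matches the agent logging a full traceback per notification attempt instead of hanging.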
Feb 1 04:45:07 localhost podman[292665]: 2026-02-01 09:45:07.355262648 +0000 UTC m=+0.083792345 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:45:07 localhost ceph-mgr[278591]: [cephadm WARNING root] removing stray HostCache host record np0005604210.localdomain.devices.0 Feb 1 04:45:07 localhost ceph-mgr[278591]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005604210.localdomain.devices.0 Feb 1 04:45:07 localhost podman[292665]: 2026-02-01 09:45:07.367359369 +0000 UTC m=+0.095888986 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 
'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Feb 1 04:45:07 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: cephadm Feb 1 04:45:07 localhost ceph-mgr[278591]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: crash Feb 1 04:45:07 localhost ceph-mgr[278591]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: devicehealth Feb 1 04:45:07 localhost ceph-mgr[278591]: [devicehealth INFO root] Starting Feb 1 04:45:07 localhost ceph-mgr[278591]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: iostat Feb 1 04:45:07 localhost ceph-mgr[278591]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: nfs Feb 1 04:45:07 localhost ceph-mgr[278591]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: orchestrator Feb 1 04:45:07 localhost ceph-mgr[278591]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: pg_autoscaler Feb 1 04:45:07 localhost ceph-mgr[278591]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: progress Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 1 04:45:07 localhost ceph-mgr[278591]: [progress INFO root] Loading... Feb 1 04:45:07 localhost ceph-mgr[278591]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events Feb 1 04:45:07 localhost ceph-mgr[278591]: [progress INFO root] Loaded OSDMap, ready. 
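The health_status=healthy and exec_died events a few entries above are systemd driving the container's periodic check: a transient unit runs /usr/bin/podman healthcheck run against the ceilometer_agent_compute container, which executes the configured test ('/openstack/healthcheck compute') inside it. The same check can be invoked by hand; a sketch, assuming it runs on this compute host with sufficient privileges to reach the container:

    # Sketch: invoke the same healthcheck that systemd triggers above.
    # Assumes execution on the compute host with access to the container.
    import subprocess

    result = subprocess.run(
        ["podman", "healthcheck", "run", "ceilometer_agent_compute"],
        capture_output=True,
        text=True,
    )
    # Exit status 0 corresponds to the health_status=healthy event in the log.
    print("healthy" if result.returncode == 0 else f"unhealthy: {result.stdout or result.stderr}")

Note the check only exercises the container's own test command; it says nothing about the AMQP connectivity failures above, which is why the agent reports healthy while dropping notifications.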
Feb 1 04:45:07 localhost ceph-mgr[278591]: [pg_autoscaler INFO root] _maybe_adjust
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] recovery thread starting
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] starting setup
Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: rbd_support
Feb 1 04:45:07 localhost ceph-mgr[278591]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: restful
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [restful INFO root] server_addr: :: server_port: 8003
Feb 1 04:45:07 localhost ceph-mgr[278591]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: status
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: telemetry
Feb 1 04:45:07 localhost ceph-mgr[278591]: [restful WARNING root] server not running: no certificate configured
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] PerfHandler: starting
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_task_task: vms, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_task_task: volumes, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_task_task: images, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 1 04:45:07 localhost ceph-mgr[278591]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 1 04:45:07 localhost ceph-mgr[278591]: mgr load Constructed class from module: volumes
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_task_task: backups, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] TaskHandler: starting
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.493+0000 7f916865e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.493+0000 7f916865e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.493+0000 7f916865e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.493+0000 7f916865e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.493+0000 7f916865e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.494+0000 7f9165658640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.494+0000 7f9165658640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.494+0000 7f9165658640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.494+0000 7f9165658640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:07.494+0000 7f9165658640 -1 client.0 error registering admin socket command: (17) File exists
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Feb 1 04:45:07 localhost ceph-mgr[278591]: [rbd_support INFO root] setup complete
Feb 1 04:45:07 localhost sshd[292821]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:45:07 localhost systemd-logind[759]: New session 70 of user ceph-admin.
Feb 1 04:45:07 localhost systemd[1]: Started Session 70 of User ceph-admin.
Feb 1 04:45:08 localhost ceph-mon[286721]: Manager daemon np0005604212.oynhpm is now available
Feb 1 04:45:08 localhost ceph-mon[286721]: removing stray HostCache host record np0005604210.localdomain.devices.0
Feb 1 04:45:08 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"} : dispatch
Feb 1 04:45:08 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"}]': finished
Feb 1 04:45:08 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"} : dispatch
Feb 1 04:45:08 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"}]': finished
Feb 1 04:45:08 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604212.oynhpm/mirror_snapshot_schedule"} : dispatch
Feb 1 04:45:08 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604212.oynhpm/trash_purge_schedule"} : dispatch
Feb 1 04:45:08 localhost ceph-mgr[278591]: log_channel(audit) log [DBG] : from='client.34409 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 1 04:45:08 localhost ceph-mgr[278591]: [cephadm INFO root] Saving service mon spec with placement label:mon
Feb 1 04:45:08 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Feb 1 04:45:08 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.533137) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108533186, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2376, "num_deletes": 257, "total_data_size": 8549790, "memory_usage": 9439264, "flush_reason": "Manual Compaction"}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108557852, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5261598, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12397, "largest_seqno": 14768, "table_properties": {"data_size": 5252225, "index_size": 5558, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 24695, "raw_average_key_size": 22, "raw_value_size": 5231466, "raw_average_value_size": 4760, "num_data_blocks": 238, "num_entries": 1099, "num_filter_entries": 1099, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939002, "oldest_key_time": 1769939002, "file_creation_time": 1769939108, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 24780 microseconds, and 8223 cpu microseconds.
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.557914) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5261598 bytes OK
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.557945) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.559918) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.559947) EVENT_LOG_v1 {"time_micros": 1769939108559940, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.559975) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8538265, prev total WAL file size 8538265, number of live WAL files 2.
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.563040) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5138KB)], [18(16MB)]
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108563104, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 22703357, "oldest_snapshot_seqno": -1}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10517 keys, 19394663 bytes, temperature: kUnknown
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108680334, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 19394663, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19331459, "index_size": 35839, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26309, "raw_key_size": 280607, "raw_average_key_size": 26, "raw_value_size": 19148699, "raw_average_value_size": 1820, "num_data_blocks": 1377, "num_entries": 10517, "num_filter_entries": 10517, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939108, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.680592) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 19394663 bytes
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.682142) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.6 rd, 165.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.0, 16.6 +0.0 blob) out(18.5 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 11068, records dropped: 551 output_compression: NoCompression
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.682163) EVENT_LOG_v1 {"time_micros": 1769939108682154, "job": 8, "event": "compaction_finished", "compaction_time_micros": 117299, "compaction_time_cpu_micros": 54185, "output_level": 6, "num_output_files": 1, "total_output_size": 19394663, "num_input_records": 11068, "num_output_records": 10517, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108682837, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108684940, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.561904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.685354) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.685363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.685367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.685372) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:45:08 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:08.685376) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:45:08 localhost ceph-mgr[278591]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:08] ENGINE Bus STARTING
Feb 1 04:45:08 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:08] ENGINE Bus STARTING
Feb 1 04:45:08 localhost ceph-mgr[278591]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:08] ENGINE Serving on https://172.18.0.106:7150
Feb 1 04:45:08 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:08] ENGINE Serving on https://172.18.0.106:7150
Feb 1 04:45:08 localhost ceph-mgr[278591]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:08] ENGINE Client ('172.18.0.106', 47908) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:45:08 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:08] ENGINE Client ('172.18.0.106', 47908) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:45:09 localhost ceph-mgr[278591]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:09] ENGINE Serving on http://172.18.0.106:8765
Feb 1 04:45:09 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:09] ENGINE Serving on http://172.18.0.106:8765
Feb 1 04:45:09 localhost ceph-mgr[278591]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:09] ENGINE Bus STARTED
Feb 1 04:45:09 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:09] ENGINE Bus STARTED
Feb 1 04:45:09 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: Saving service mon spec with placement label:mon
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:09 localhost ceph-mon[286721]: [01/Feb/2026:09:45:08] ENGINE Bus STARTING
Feb 1 04:45:09 localhost podman[293011]: 2026-02-01 09:45:09.402075672 +0000 UTC m=+0.093405460 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 1 04:45:09 localhost podman[293011]: 2026-02-01 09:45:09.554748461 +0000 UTC m=+0.246078219 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, architecture=x86_64, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True)
Feb 1 04:45:09 localhost ceph-mgr[278591]: [devicehealth INFO root] Check health
Feb 1 04:45:10 localhost ceph-mgr[278591]: log_channel(audit) log [DBG] : from='client.34530 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604213", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 1 04:45:10 localhost ceph-mon[286721]: [01/Feb/2026:09:45:08] ENGINE Serving on https://172.18.0.106:7150
Feb 1 04:45:10 localhost ceph-mon[286721]: [01/Feb/2026:09:45:08] ENGINE Client ('172.18.0.106', 47908) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:45:10 localhost ceph-mon[286721]: [01/Feb/2026:09:45:09] ENGINE Serving on http://172.18.0.106:8765
Feb 1 04:45:10 localhost ceph-mon[286721]: [01/Feb/2026:09:45:09] ENGINE Bus STARTED
Feb 1 04:45:10 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 1 04:45:10 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 1 04:45:10 localhost ceph-mon[286721]: Cluster is now healthy
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:10 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:11 localhost ceph-mon[286721]: mon.np0005604212@3(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:11 localhost ceph-mgr[278591]: [cephadm INFO root] Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:45:11 localhost ceph-mgr[278591]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(audit) log [DBG] : from='client.34445 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005604213"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 1 04:45:11 localhost ceph-mgr[278591]: [cephadm INFO root] Remove daemons mon.np0005604213
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005604213
Feb 1 04:45:11 localhost ceph-mgr[278591]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005604213: new quorum should be ['np0005604211', 'np0005604215', 'np0005604212'] (from ['np0005604211', 'np0005604215', 'np0005604212'])
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005604213: new quorum should be ['np0005604211', 'np0005604215', 'np0005604212'] (from ['np0005604211', 'np0005604215', 'np0005604212'])
Feb 1 04:45:11 localhost ceph-mgr[278591]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005604213 from monmap...
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Removing monitor np0005604213 from monmap...
Feb 1 04:45:11 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005604213 from np0005604213.localdomain -- ports []
Feb 1 04:45:11 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005604213 from np0005604213.localdomain -- ports []
Feb 1 04:45:11 localhost ceph-mon[286721]: mon.np0005604212@3(peon) e11 my rank is now 2 (was 3)
Feb 1 04:45:11 localhost ceph-mgr[278591]: client.44431 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 1 04:45:11 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election
Feb 1 04:45:11 localhost ceph-mon[286721]: paxos.2).electionLogic(46) init, last seen epoch 46
Feb 1 04:45:11 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:45:12 localhost nova_compute[274651]: 2026-02-01 09:45:12.084 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:45:12 localhost nova_compute[274651]: 2026-02-01 09:45:12.086 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:45:12 localhost nova_compute[274651]: 2026-02-01 09:45:12.087 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:45:12 localhost nova_compute[274651]: 2026-02-01 09:45:12.087 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:45:12 localhost nova_compute[274651]: 2026-02-01 09:45:12.088 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:45:12 localhost nova_compute[274651]: 2026-02-01 09:45:12.091 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:45:13 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:15 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 1 04:45:16 localhost ceph-mon[286721]: paxos.2).electionLogic(47) init, last seen epoch 47, mid-election, bumping
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:45:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm INFO root] Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm INFO root] Adjusting osd_memory_target on np0005604213.localdomain to 1348M
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604213.localdomain to 1348M
Feb 1 04:45:16 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:45:16 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:45:16 localhost ceph-mon[286721]: Remove daemons mon.np0005604213
Feb 1 04:45:16 localhost ceph-mon[286721]: Safe to remove mon.np0005604213: new quorum should be ['np0005604211', 'np0005604215', 'np0005604212'] (from ['np0005604211', 'np0005604215', 'np0005604212'])
Feb 1 04:45:16 localhost ceph-mon[286721]: Removing monitor np0005604213 from monmap...
Feb 1 04:45:16 localhost ceph-mon[286721]: Removing daemon mon.np0005604213 from np0005604213.localdomain -- ports []
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604215 in quorum (ranks 0,1)
Feb 1 04:45:16 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:45:16 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604212 in quorum (ranks 0,1,2)
Feb 1 04:45:16 localhost ceph-mon[286721]: overall HEALTH_OK
Feb 1 04:45:16 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:16 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:16 localhost podman[293263]: 2026-02-01 09:45:16.739857989 +0000 UTC m=+0.094134093 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:45:16 localhost podman[293263]: 2026-02-01 09:45:16.777420642 +0000 UTC m=+0.131696686 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:45:16 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:45:17 localhost nova_compute[274651]: 2026-02-01 09:45:17.087 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:45:17 localhost nova_compute[274651]: 2026-02-01 09:45:17.092 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:45:17 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Feb 1 04:45:17 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:17 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:45:17 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 1348M
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:45:17 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:17 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:17 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:17 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:17 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:45:17 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:45:17 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:17 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:18 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:18 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:18 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:18 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 1 04:45:19 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 1 04:45:19 localhost ceph-mgr[278591]: [progress INFO root] update: starting ev e19f5c2d-06a2-4f17-a92f-78db554dc086 (Updating node-proxy deployment (+4 -> 4))
Feb 1 04:45:19 localhost ceph-mgr[278591]: [progress INFO root] complete: finished ev e19f5c2d-06a2-4f17-a92f-78db554dc086 (Updating node-proxy deployment (+4 -> 4))
Feb 1 04:45:19 localhost ceph-mgr[278591]: [progress INFO root] Completed event e19f5c2d-06a2-4f17-a92f-78db554dc086 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:19 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:45:19 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:45:19 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:45:19 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:45:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:45:20 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:45:20 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:45:20 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:45:20 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:45:20 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:45:20 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:45:20 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:45:20 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:45:20 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:45:20 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:20 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:20 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:45:20 localhost podman[293943]: 2026-02-01 09:45:20.735108796 +0000 UTC m=+0.096778794 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 1 04:45:20 localhost podman[293943]: 2026-02-01 09:45:20.771903696 +0000 UTC m=+0.133573714 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 1 04:45:20 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:45:21 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:45:21 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 1 04:45:21 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:45:21 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:45:21 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:45:21 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:45:21 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:45:21 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:45:21 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:21 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:21 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:45:22 localhost nova_compute[274651]: 2026-02-01 09:45:22.093 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:45:22 localhost nova_compute[274651]: 2026-02-01 09:45:22.137 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:45:22 localhost nova_compute[274651]: 2026-02-01 09:45:22.138 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:45:22 localhost nova_compute[274651]: 2026-02-01 09:45:22.138 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:45:22 localhost nova_compute[274651]: 2026-02-01 09:45:22.139 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:45:22 localhost nova_compute[274651]: 2026-02-01 09:45:22.139 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:45:22 localhost podman[294015]:
Feb 1 04:45:22 localhost podman[294015]: 2026-02-01 09:45:22.180769837 +0000 UTC m=+0.122264927 container create de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_hawking, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.41.4, RELEASE=main, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1764794109, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:45:22 localhost systemd[1]: Started libpod-conmon-de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808.scope.
Feb 1 04:45:22 localhost systemd[1]: Started libcrun container.
Feb 1 04:45:22 localhost podman[294015]: 2026-02-01 09:45:22.242287496 +0000 UTC m=+0.183782646 container init de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_hawking, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, distribution-scope=public, name=rhceph, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Feb 1 04:45:22 localhost podman[294015]: 2026-02-01 09:45:22.152654253 +0000 UTC m=+0.094149343 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:45:22 localhost podman[294015]: 2026-02-01 09:45:22.256809062 +0000 UTC m=+0.198304152 container start de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_hawking, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=1764794109, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:45:22 localhost podman[294015]: 2026-02-01 09:45:22.257126472 +0000 UTC m=+0.198621532 container attach de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_hawking, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, CEPH_POINT_RELEASE=)
Feb 1 04:45:22 localhost systemd[1]: libpod-de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808.scope: Deactivated successfully.
Feb 1 04:45:22 localhost determined_hawking[294030]: 167 167
Feb 1 04:45:22 localhost podman[294015]: 2026-02-01 09:45:22.262315281 +0000 UTC m=+0.203810351 container died de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_hawking, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1764794109, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:45:22 localhost podman[294035]: 2026-02-01 09:45:22.339549274 +0000 UTC m=+0.066973548 container remove de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_hawking, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=1764794109, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux , version=7, ceph=True, name=rhceph, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc.)
Feb 1 04:45:22 localhost systemd[1]: libpod-conmon-de0e7e74207dbdd2dc08bef24f4d0a65f4580af86d608f6559fbbe678c654808.scope: Deactivated successfully.
Feb 1 04:45:22 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Feb 1 04:45:22 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Feb 1 04:45:22 localhost ceph-mgr[278591]: [progress INFO root] Writing back 50 completed events
Feb 1 04:45:22 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:45:22 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:45:22 localhost sshd[294088]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:45:22 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:45:22 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:45:22 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:22 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:22 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:45:22 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:23 localhost podman[294108]:
Feb 1 04:45:23 localhost podman[294108]: 2026-02-01 09:45:23.082715539 +0000 UTC m=+0.087378335 container create c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_ardinghelli, ceph=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9,
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph) Feb 1 04:45:23 localhost systemd[1]: Started libpod-conmon-c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd.scope. Feb 1 04:45:23 localhost systemd[1]: Started libcrun container. Feb 1 04:45:23 localhost podman[294108]: 2026-02-01 09:45:23.044265808 +0000 UTC m=+0.048928624 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:45:23 localhost podman[294108]: 2026-02-01 09:45:23.151601375 +0000 UTC m=+0.156264161 container init c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_ardinghelli, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109) Feb 1 04:45:23 localhost podman[294108]: 2026-02-01 09:45:23.160785037 +0000 UTC m=+0.165447833 container start c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_ardinghelli, io.openshift.expose-services=, version=7, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 04:45:23 localhost podman[294108]: 2026-02-01 09:45:23.161139057 +0000 UTC m=+0.165801883 container attach c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_ardinghelli, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=) Feb 1 04:45:23 localhost competent_ardinghelli[294123]: 167 167 Feb 1 04:45:23 localhost systemd[1]: libpod-c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd.scope: Deactivated successfully. Feb 1 04:45:23 localhost podman[294108]: 2026-02-01 09:45:23.164224202 +0000 UTC m=+0.168887028 container died c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_ardinghelli, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.expose-services=, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container) Feb 1 04:45:23 localhost systemd[1]: var-lib-containers-storage-overlay-575c19403ea097721c8c82f55e7081ca3cbb7bbc0a7aea3c5d1c0268892418d0-merged.mount: Deactivated successfully. Feb 1 04:45:23 localhost systemd[1]: var-lib-containers-storage-overlay-ef3e32da85ed2e7b2fb6133eec66e6d0b7751f03e6def71b50107479a1a85bf0-merged.mount: Deactivated successfully. 
Feb 1 04:45:23 localhost podman[294128]: 2026-02-01 09:45:23.271432994 +0000 UTC m=+0.092967076 container remove c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_ardinghelli, io.openshift.expose-services=, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, release=1764794109, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:45:23 localhost systemd[1]: libpod-conmon-c8dbb3b0a6cda437c0e800d90914644287f0a8753f230a0df0b8648f171329cd.scope: Deactivated successfully.
Feb 1 04:45:23 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 1 04:45:23 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Feb 1 04:45:23 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Feb 1 04:45:23 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:45:23 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:45:23 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:45:23 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:45:23 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:23 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:23 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:45:23 localhost podman[236886]: time="2026-02-01T09:45:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:45:23 localhost podman[236886]: @ - - [01/Feb/2026:09:45:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:45:24 localhost podman[236886]: @ - - [01/Feb/2026:09:45:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18794 "" "Go-http-client/1.1"
Feb 1 04:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:45:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:45:24 localhost podman[294204]: 2026-02-01 09:45:24.197026073 +0000 UTC m=+0.096275478 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:45:24 localhost podman[294203]: 2026-02-01 09:45:24.246860144 +0000 UTC m=+0.148190013 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:45:24 localhost podman[294217]:
Feb 1 04:45:24 localhost podman[294217]: 2026-02-01 09:45:24.273374428 +0000 UTC m=+0.143538350 container create 1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_hawking, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, RELEASE=main, GIT_CLEAN=True)
Feb 1 04:45:24 localhost podman[294203]: 2026-02-01 09:45:24.281624481 +0000 UTC m=+0.182954330 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:45:24 localhost podman[294204]: 2026-02-01 09:45:24.298171019 +0000 UTC m=+0.197420444 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 1 04:45:24 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:45:24 localhost systemd[1]: Started libpod-conmon-1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b.scope.
Feb 1 04:45:24 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:45:24 localhost systemd[1]: Started libcrun container.
Feb 1 04:45:24 localhost podman[294217]: 2026-02-01 09:45:24.237129164 +0000 UTC m=+0.107293156 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:45:24 localhost podman[294217]: 2026-02-01 09:45:24.348468304 +0000 UTC m=+0.218632236 container init 1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_hawking, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1764794109)
Feb 1 04:45:24 localhost podman[294217]: 2026-02-01 09:45:24.357034137 +0000 UTC m=+0.227198069 container start 1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_hawking, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1764794109, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph)
Feb 1 04:45:24 localhost podman[294217]: 2026-02-01 09:45:24.357297425 +0000 UTC m=+0.227461367 container attach 1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_hawking, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1764794109, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, ceph=True, architecture=x86_64, name=rhceph, vcs-type=git)
Feb 1 04:45:24 localhost systemd[1]: libpod-1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b.scope: Deactivated successfully.
Feb 1 04:45:24 localhost competent_hawking[294269]: 167 167
Feb 1 04:45:24 localhost podman[294217]: 2026-02-01 09:45:24.36491431 +0000 UTC m=+0.235078212 container died 1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_hawking, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7)
Feb 1 04:45:24 localhost podman[294274]: 2026-02-01 09:45:24.467677415 +0000 UTC m=+0.089638184 container remove 1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_hawking, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:45:24 localhost systemd[1]: libpod-conmon-1d73c410ffefe7b0cf79e53e1c847f61ebe0849d160a5de5016dc1e7f59c434b.scope: Deactivated successfully.
Feb 1 04:45:24 localhost ceph-mgr[278591]: log_channel(audit) log [DBG] : from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005604213.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Feb 1 04:45:24 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005604213 on np0005604213.localdomain
Feb 1 04:45:24 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005604213 on np0005604213.localdomain
Feb 1 04:45:24 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:45:24 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:45:24 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:45:24 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:45:24 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:45:24 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:45:24 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:24 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:45:24 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:24 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:24 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:45:25 localhost systemd[1]: var-lib-containers-storage-overlay-f62d0f64d8b23d1b364e2b9a65e7364054c4b6ea8cad2c73fbde2ed25d7c06a6-merged.mount: Deactivated successfully.
Feb 1 04:45:25 localhost podman[294349]:
Feb 1 04:45:25 localhost podman[294349]: 2026-02-01 09:45:25.352733648 +0000 UTC m=+0.086485757 container create 27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kowalevski, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.41.4, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public)
Feb 1 04:45:25 localhost systemd[1]: Started libpod-conmon-27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89.scope.
Feb 1 04:45:25 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:25 localhost podman[294349]: 2026-02-01 09:45:25.319717105 +0000 UTC m=+0.053469244 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:45:25 localhost systemd[1]: Started libcrun container.
Feb 1 04:45:25 localhost podman[294349]: 2026-02-01 09:45:25.441946228 +0000 UTC m=+0.175698347 container init 27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kowalevski, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, release=1764794109, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux )
Feb 1 04:45:25 localhost podman[294349]: 2026-02-01 09:45:25.453923926 +0000 UTC m=+0.187676045 container start 27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kowalevski, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.41.4, vcs-type=git, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z)
Feb 1 04:45:25 localhost podman[294349]: 2026-02-01 09:45:25.454306048 +0000 UTC m=+0.188058197 container attach 27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kowalevski, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=)
Feb 1 04:45:25 localhost modest_kowalevski[294364]: 167 167
Feb 1 04:45:25 localhost systemd[1]: libpod-27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89.scope: Deactivated successfully.
Feb 1 04:45:25 localhost podman[294349]: 2026-02-01 09:45:25.459147386 +0000 UTC m=+0.192899515 container died 27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kowalevski, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, version=7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, io.openshift.tags=rhceph ceph)
Feb 1 04:45:25 localhost podman[294369]: 2026-02-01 09:45:25.566887566 +0000 UTC m=+0.093333758 container remove 27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_kowalevski, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , name=rhceph, distribution-scope=public, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, GIT_CLEAN=True)
Feb 1 04:45:25 localhost systemd[1]: libpod-conmon-27e90b5184288624f6b1831cd384670f2dd69055c049f200421a0ce2e9b1cb89.scope: Deactivated successfully.
Feb 1 04:45:25 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:45:25 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:45:25 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:45:25 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:45:25 localhost ceph-mon[286721]: Deploying daemon mon.np0005604213 on np0005604213.localdomain
Feb 1 04:45:25 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:45:25 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:45:25 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:25 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:25 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:45:26 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:45:26 localhost systemd[1]: var-lib-containers-storage-overlay-cf9a6c8e9b3f6ec5b602bfbcc1abf2c2196b2da3a519da7a3b3dd8da78a5c6a5-merged.mount: Deactivated successfully.
Feb 1 04:45:26 localhost podman[294438]:
Feb 1 04:45:26 localhost podman[294438]: 2026-02-01 09:45:26.303113948 +0000 UTC m=+0.063871663 container create 8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_swartz, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git)
Feb 1 04:45:26 localhost systemd[1]: Started libpod-conmon-8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce.scope.
Feb 1 04:45:26 localhost systemd[1]: Started libcrun container.
Feb 1 04:45:26 localhost podman[294438]: 2026-02-01 09:45:26.277263003 +0000 UTC m=+0.038020708 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:45:26 localhost podman[294438]: 2026-02-01 09:45:26.377208633 +0000 UTC m=+0.137966348 container init 8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_swartz, maintainer=Guillaume Abrioux , architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-12-08T17:28:53Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, ceph=True, io.openshift.tags=rhceph ceph)
Feb 1 04:45:26 localhost podman[294438]: 2026-02-01 09:45:26.387936472 +0000 UTC m=+0.148694187 container start 8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_swartz, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, com.redhat.component=rhceph-container, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 04:45:26 localhost podman[294438]: 2026-02-01 09:45:26.388241661 +0000 UTC m=+0.148999396 container attach 8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_swartz, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z)
Feb 1 04:45:26 localhost cool_swartz[294453]: 167 167
Feb 1 04:45:26 localhost systemd[1]: libpod-8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce.scope: Deactivated successfully.
Feb 1 04:45:26 localhost podman[294438]: 2026-02-01 09:45:26.393954027 +0000 UTC m=+0.154711762 container died 8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_swartz, vcs-type=git, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:45:26 localhost podman[294458]: 2026-02-01 09:45:26.497699113 +0000 UTC m=+0.095544535 container remove 8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_swartz, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=)
Feb 1 04:45:26 localhost systemd[1]: libpod-conmon-8845a93f49f65e4f841704a1c9a365a60755ac7f4609c4e79e91864aaa91adce.scope: Deactivated successfully.
Feb 1 04:45:26 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604213 (monmap changed)...
Feb 1 04:45:26 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604213 (monmap changed)...
Feb 1 04:45:26 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:45:26 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:45:26 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:45:26 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:45:26 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:26 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:26 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:45:27 localhost nova_compute[274651]: 2026-02-01 09:45:27.140 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:45:27 localhost nova_compute[274651]: 2026-02-01 09:45:27.143 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:45:27 localhost nova_compute[274651]: 2026-02-01 09:45:27.143 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:45:27 localhost nova_compute[274651]: 2026-02-01 09:45:27.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:45:27 localhost nova_compute[274651]: 2026-02-01 09:45:27.193 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:45:27 localhost nova_compute[274651]: 2026-02-01 09:45:27.194 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:45:27 localhost systemd[1]: var-lib-containers-storage-overlay-1c9d7098fddae1324338f1762624b95538b320010767a22297a52d4f905a332c-merged.mount: Deactivated successfully.
Feb 1 04:45:27 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:27 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 1 04:45:27 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:45:27 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:27 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:27 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 1 04:45:27 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 1 04:45:27 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:45:27 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:45:28 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:28 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:28 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:28 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)... Feb 1 04:45:28 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:28 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 1 04:45:28 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:45:29 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Feb 1 04:45:29 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... Feb 1 04:45:29 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:45:29 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:45:29 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:29 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:29 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Feb 1 04:45:29 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:29 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)... Feb 1 04:45:29 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:29 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 1 04:45:29 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:45:30 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... Feb 1 04:45:30 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... 
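The recurring pair `handle_open ignoring open from mon.np0005604213` / `failed to return metadata ... (2) No such file or directory` suggests the mgr has no metadata entry yet for a mon that is mid-redeploy; the "(2)" is errno ENOENT, not a literal missing file. A hedged way to reproduce the same lookup from the CLI:

```python
# Sketch: repeat the mgr's metadata lookup by hand. A missing entry
# (ENOENT, matching the "(2) No such file or directory" above) surfaces
# as a non-zero exit status; assumes an admin keyring is available.
import json
import subprocess

res = subprocess.run(
    ["ceph", "mon", "metadata", "np0005604213", "--format", "json"],
    capture_output=True, text=True,
)
if res.returncode != 0:
    print("no metadata for mon.np0005604213 yet:", res.stderr.strip())
else:
    print(json.loads(res.stdout).get("ceph_version"))
```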
Feb 1 04:45:30 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:45:30 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:45:30 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:30 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:30 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... Feb 1 04:45:30 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... Feb 1 04:45:30 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:45:30 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:45:31 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:31 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... Feb 1 04:45:31 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:31 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:45:31 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:45:31 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:31 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:31 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:45:31 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:45:31 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:31 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:31 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:31 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:31 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:31 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:31 localhost ceph-mon[286721]: 
mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:31 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Feb 1 04:45:31 localhost openstack_network_exporter[239441]: ERROR 09:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:45:31 localhost openstack_network_exporter[239441]: Feb 1 04:45:31 localhost openstack_network_exporter[239441]: ERROR 09:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:45:31 localhost openstack_network_exporter[239441]: Feb 1 04:45:31 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604215 (monmap changed)... Feb 1 04:45:31 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604215 (monmap changed)... Feb 1 04:45:31 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain Feb 1 04:45:31 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain Feb 1 04:45:32 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... Feb 1 04:45:32 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:45:32 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:32 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)... Feb 1 04:45:32 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:32 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain Feb 1 04:45:32 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:45:32 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:32 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:32 localhost nova_compute[274651]: 2026-02-01 09:45:32.195 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:32 localhost nova_compute[274651]: 2026-02-01 09:45:32.198 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:32 localhost nova_compute[274651]: 2026-02-01 09:45:32.198 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:45:32 localhost nova_compute[274651]: 2026-02-01 09:45:32.198 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:32 localhost nova_compute[274651]: 2026-02-01 09:45:32.220 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:45:32 localhost nova_compute[274651]: 2026-02-01 09:45:32.221 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:45:32 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Feb 1 04:45:32 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... Feb 1 04:45:32 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005604215.localdomain Feb 1 04:45:32 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005604215.localdomain Feb 1 04:45:32 localhost podman[294476]: 2026-02-01 09:45:32.752055355 +0000 UTC m=+0.100961632 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64) Feb 1 04:45:32 localhost podman[294476]: 2026-02-01 09:45:32.7964794 +0000 UTC m=+0.145385677 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 1 04:45:32 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:45:33 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:33 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)... Feb 1 04:45:33 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:33 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain Feb 1 04:45:33 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 1 04:45:33 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:33 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:33 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:33 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Feb 1 04:45:33 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Feb 1 04:45:33 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... Feb 1 04:45:33 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005604215.localdomain Feb 1 04:45:33 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005604215.localdomain Feb 1 04:45:34 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:34 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)... 
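The `podman healthcheck run` transient unit a few entries above exits cleanly ("Deactivated successfully") and the matching `health_status` event records healthy. Reproducing that cycle by hand is straightforward; the inspect template path is an assumption (older podman spells it `.State.Healthcheck.Status`):

```python
# Sketch: what the transient systemd unit above does -- run the
# container's healthcheck, then read back the recorded status.
# Container name is taken from the log entries above.
import subprocess

name = "openstack_network_exporter"
subprocess.run(["podman", "healthcheck", "run", name], check=True)
status = subprocess.run(
    ["podman", "inspect", "--format", "{{.State.Health.Status}}", name],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(name, "is", status)  # expect "healthy", as in the event above
```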
Feb 1 04:45:34 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:34 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 1 04:45:34 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain Feb 1 04:45:34 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:34 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:34 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)... Feb 1 04:45:34 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)... Feb 1 04:45:34 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain Feb 1 04:45:34 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain Feb 1 04:45:35 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:35 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)... Feb 1 04:45:35 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:35 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain Feb 1 04:45:35 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:45:35 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:35 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:35 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:35 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:35 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Feb 1 04:45:35 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)... Feb 1 04:45:35 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)... 
Feb 1 04:45:35 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:45:35 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:45:36 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:45:36 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:36 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:36 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)... Feb 1 04:45:36 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:36 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:45:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:45:37 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:37 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:37 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:37 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:37 localhost nova_compute[274651]: 2026-02-01 09:45:37.222 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:37 localhost nova_compute[274651]: 2026-02-01 09:45:37.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:37 localhost nova_compute[274651]: 2026-02-01 09:45:37.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:45:37 localhost nova_compute[274651]: 2026-02-01 09:45:37.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:37 localhost nova_compute[274651]: 2026-02-01 09:45:37.225 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:37 localhost nova_compute[274651]: 2026-02-01 09:45:37.230 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
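The ovsdbapp entries above repeat a fixed cycle: roughly 5 s idle, send an inactivity probe, enter IDLE, then return to ACTIVE on the next `[POLLIN]` wakeup. A toy model of that cycle, mirroring only the logged state names and interval; this is illustrative, not the ovs library API:

```python
# Toy model of the inactivity-probe cycle in the ovsdbapp entries above.
PROBE_INTERVAL_MS = 5000  # matches the "idle 5004 ms" messages

def step(state: str, idle_ms: int, pollin: bool) -> str:
    if pollin:
        return "ACTIVE"                    # reply arrived -> back to ACTIVE
    if state == "ACTIVE" and idle_ms >= PROBE_INTERVAL_MS:
        print("sending inactivity probe")  # cf. reconnect.py:117 above
        return "IDLE"
    return state

s = step("ACTIVE", 5004, pollin=False)     # -> IDLE, probe sent
s = step(s, 5050, pollin=True)             # -> ACTIVE on [POLLIN]
assert s == "ACTIVE"
```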
Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.264173) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137264241, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1374, "num_deletes": 250, "total_data_size": 3800210, "memory_usage": 3902312, "flush_reason": "Manual Compaction"} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137283606, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2227760, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14774, "largest_seqno": 16142, "table_properties": {"data_size": 2221662, "index_size": 3179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 17003, "raw_average_key_size": 23, "raw_value_size": 2208016, "raw_average_value_size": 2987, "num_data_blocks": 137, "num_entries": 739, "num_filter_entries": 739, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939109, "oldest_key_time": 1769939109, "file_creation_time": 1769939137, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 19467 microseconds, and 8100 cpu microseconds. Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
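The rocksdb `EVENT_LOG_v1` payloads above are plain JSON after the marker, so flush and compaction activity can be pulled straight out of the journal. A small parser; the regex is an assumption about the exact line shape seen in this log:

```python
# Sketch: extract the JSON payload from rocksdb EVENT_LOG_v1 lines.
import json
import re

MARKER = re.compile(r"EVENT_LOG_v1 (\{.*\})")

def parse_event(line: str):
    m = MARKER.search(line)
    return json.loads(m.group(1)) if m else None

ev = parse_event('... rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137264241, '
                 '"job": 9, "event": "flush_started", "num_entries": 1374}')
assert ev["event"] == "flush_started" and ev["job"] == 9
```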
Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.283654) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2227760 bytes OK Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.283677) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.286523) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.286552) EVENT_LOG_v1 {"time_micros": 1769939137286542, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.286572) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3793078, prev total WAL file size 3793827, number of live WAL files 2. Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.287744) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353039' seq:72057594037927935, type:22 .. '6D6772737461740033373630' seq:0, type:0; will stop at (end) Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2175KB)], [21(18MB)] Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137287792, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 21622423, "oldest_snapshot_seqno": -1} Feb 1 04:45:37 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:37 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10727 keys, 19443421 bytes, temperature: kUnknown Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137422059, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 19443421, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19382373, "index_size": 33107, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26885, "raw_key_size": 286365, "raw_average_key_size": 26, "raw_value_size": 19199410, 
"raw_average_value_size": 1789, "num_data_blocks": 1265, "num_entries": 10727, "num_filter_entries": 10727, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939137, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.422315) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 19443421 bytes Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.423888) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.9 rd, 144.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 18.5 +0.0 blob) out(18.5 +0.0 blob), read-write-amplify(18.4) write-amplify(8.7) OK, records in: 11256, records dropped: 529 output_compression: NoCompression Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.423918) EVENT_LOG_v1 {"time_micros": 1769939137423905, "job": 10, "event": "compaction_finished", "compaction_time_micros": 134344, "compaction_time_cpu_micros": 44576, "output_level": 6, "num_output_files": 1, "total_output_size": 19443421, "num_input_records": 11256, "num_output_records": 10727, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137424259, "job": 10, "event": "table_file_deletion", "file_number": 23} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137426147, "job": 10, "event": "table_file_deletion", "file_number": 21} Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.287662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.426245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 
04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.426252) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.426255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.426259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:45:37 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:45:37.426263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:45:37 localhost ceph-mgr[278591]: [volumes INFO mgr_util] scanning for idle connections.. Feb 1 04:45:37 localhost ceph-mgr[278591]: [volumes INFO mgr_util] cleaning up connections: [] Feb 1 04:45:37 localhost ceph-mgr[278591]: [volumes INFO mgr_util] scanning for idle connections.. Feb 1 04:45:37 localhost ceph-mgr[278591]: [volumes INFO mgr_util] cleaning up connections: [] Feb 1 04:45:37 localhost ceph-mgr[278591]: [volumes INFO mgr_util] scanning for idle connections.. Feb 1 04:45:37 localhost ceph-mgr[278591]: [volumes INFO mgr_util] cleaning up connections: [] Feb 1 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:45:37 localhost podman[294496]: 2026-02-01 09:45:37.714772874 +0000 UTC m=+0.075712817 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute) Feb 1 04:45:37 localhost podman[294496]: 2026-02-01 09:45:37.751883663 +0000 UTC 
m=+0.112823586 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:45:37 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
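The amplification figures reported by rocksdb compaction job 10 above can be checked against the byte counts in the same events: write-amplify is output bytes over the newly flushed L0 bytes, and read-write-amplify adds the bytes read on input. A worked check using the exact numbers from the log:

```python
# Worked check of the "write-amplify(8.7)" and "read-write-amplify(18.4)"
# figures in the compaction_finished summary above.
l0_input = 2_227_760        # table #23, the freshly flushed L0 file
total_input = 21_622_423    # "input_data_size" for job 10 (L0 + L6)
output = 19_443_421         # "total_output_size" for job 10

print(round(output / l0_input, 1))                  # 8.7  -> write-amplify
print(round((total_input + output) / l0_input, 1))  # 18.4 -> read-write-amplify
```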
Feb 1 04:45:38 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:38 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:38 localhost ceph-mgr[278591]: [progress INFO root] update: starting ev fae25ee6-03b6-4bc5-987a-3d2eb49ef1d7 (Updating node-proxy deployment (+4 -> 4)) Feb 1 04:45:38 localhost ceph-mgr[278591]: [progress INFO root] complete: finished ev fae25ee6-03b6-4bc5-987a-3d2eb49ef1d7 (Updating node-proxy deployment (+4 -> 4)) Feb 1 04:45:38 localhost ceph-mgr[278591]: [progress INFO root] Completed event fae25ee6-03b6-4bc5-987a-3d2eb49ef1d7 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Feb 1 04:45:39 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:39 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:39 localhost nova_compute[274651]: 2026-02-01 09:45:39.301 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:39 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:39 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:39 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:45:39 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' Feb 1 04:45:39 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:39 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:39 localhost nova_compute[274651]: 2026-02-01 09:45:39.390 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Triggering sync for uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 1 04:45:39 localhost nova_compute[274651]: 2026-02-01 09:45:39.391 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:45:39 localhost nova_compute[274651]: 2026-02-01 09:45:39.391 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:45:39 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB 
used, 41 GiB / 42 GiB avail Feb 1 04:45:39 localhost nova_compute[274651]: 2026-02-01 09:45:39.548 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:45:40 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:40 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:40 localhost nova_compute[274651]: 2026-02-01 09:45:40.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:41 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:45:41 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:41 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:41 localhost nova_compute[274651]: 2026-02-01 09:45:41.285 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:41 localhost nova_compute[274651]: 2026-02-01 09:45:41.285 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:45:41 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:41 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail Feb 1 04:45:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:45:41.709 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:45:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:45:41.709 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:45:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:45:41.710 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:45:42 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:42 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.229 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.231 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.232 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.232 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.251 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.252 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:42 localhost nova_compute[274651]: 2026-02-01 09:45:42.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:42 localhost ceph-mgr[278591]: [progress INFO root] Writing back 50 completed events Feb 1 04:45:43 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from 
mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect) Feb 1 04:45:43 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.267 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.296 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.296 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.296 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.297 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:45:43 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 1 04:45:43 localhost 
ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:43 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:43 localhost ceph-mgr[278591]: log_channel(audit) log [DBG] : from='client.34462 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 1 04:45:43 localhost ceph-mgr[278591]: [cephadm INFO root] Reconfig service osd.default_drive_group
Feb 1 04:45:43 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Feb 1 04:45:43 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:45:43 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3638642379' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.781 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.846 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:45:43 localhost nova_compute[274651]: 2026-02-01 09:45:43.846 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.064 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.067 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11496MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.067 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.068 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:45:44 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 1 04:45:44 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.317 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.318 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.318 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.409 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.506 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.506 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.524 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.566 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 1 04:45:44 localhost ceph-mon[286721]: Reconfig service osd.default_drive_group
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:44 localhost nova_compute[274651]: 2026-02-01 09:45:44.630 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:45:45 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:45:45 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/915385531' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:45:45 localhost nova_compute[274651]: 2026-02-01 09:45:45.077 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:45:45 localhost nova_compute[274651]: 2026-02-01 09:45:45.083 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:45:45 localhost nova_compute[274651]: 2026-02-01 09:45:45.103 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:45:45 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 1 04:45:45 localhost nova_compute[274651]: 2026-02-01 09:45:45.106 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:45:45 localhost nova_compute[274651]: 2026-02-01 09:45:45.106 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:45:45 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 1 04:45:45 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:45 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:45 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:45 localhost ceph-mgr[278591]: [progress INFO root] update: starting ev f22507f2-233c-4110-b1ee-7475611e9caa (Updating node-proxy deployment (+4 -> 4))
Feb 1 04:45:45 localhost ceph-mgr[278591]: [progress INFO root] complete: finished ev f22507f2-233c-4110-b1ee-7475611e9caa (Updating node-proxy deployment (+4 -> 4))
Feb 1 04:45:45 localhost ceph-mgr[278591]: [progress INFO root] Completed event f22507f2-233c-4110-b1ee-7475611e9caa (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 1 04:45:45 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:45 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:45 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:45:45 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:45 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:45:45 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:45:46 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:45:46 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 1 04:45:46 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.289 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.290 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.291 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:45:46 localhost podman[294649]:
Feb 1 04:45:46 localhost podman[294649]: 2026-02-01 09:45:46.453636293 +0000 UTC m=+0.053102892 container create 83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_pasteur, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, build-date=2025-12-08T17:28:53Z, ceph=True, vcs-type=git)
Feb 1 04:45:46 localhost systemd[1]: Started libpod-conmon-83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd.scope.
Feb 1 04:45:46 localhost systemd[1]: Started libcrun container.
Feb 1 04:45:46 localhost podman[294649]: 2026-02-01 09:45:46.517523835 +0000 UTC m=+0.116990424 container init 83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_pasteur, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.41.4, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git)
Feb 1 04:45:46 localhost podman[294649]: 2026-02-01 09:45:46.526105878 +0000 UTC m=+0.125572477 container start 83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_pasteur, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, release=1764794109, maintainer=Guillaume Abrioux , version=7)
Feb 1 04:45:46 localhost podman[294649]: 2026-02-01 09:45:46.526337676 +0000 UTC m=+0.125804285 container attach 83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_pasteur, build-date=2025-12-08T17:28:53Z, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64)
Feb 1 04:45:46 localhost fervent_pasteur[294663]: 167 167
Feb 1 04:45:46 localhost podman[294649]: 2026-02-01 09:45:46.433631358 +0000 UTC m=+0.033097957 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:45:46 localhost systemd[1]: libpod-83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd.scope: Deactivated successfully.
Feb 1 04:45:46 localhost podman[294649]: 2026-02-01 09:45:46.534873728 +0000 UTC m=+0.134340337 container died 83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_pasteur, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , version=7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 1 04:45:46 localhost podman[294668]: 2026-02-01 09:45:46.621101176 +0000 UTC m=+0.077245103 container remove 83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_pasteur, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1764794109, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_CLEAN=True, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux )
Feb 1 04:45:46 localhost systemd[1]: libpod-conmon-83b0259e7fb362530116a3bea3c074612bfcff27197380f5711baf4d70908dcd.scope: Deactivated successfully.
Feb 1 04:45:46 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:45:46 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.840 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:45:46 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:45:46 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.843 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.843 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:45:46 localhost nova_compute[274651]: 2026-02-01 09:45:46.844 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:45:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:45:46 localhost podman[294710]: 2026-02-01 09:45:46.994046431 +0000 UTC m=+0.071511987 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:45:47 localhost podman[294710]: 2026-02-01 09:45:47.03340701 +0000 UTC m=+0.110872506 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:45:47 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:45:47 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 1 04:45:47 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.253 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.255 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.255 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.255 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.310 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.311 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:45:47 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:47 localhost podman[294765]:
Feb 1 04:45:47 localhost podman[294765]: 2026-02-01 09:45:47.415902287 +0000 UTC m=+0.074778798 container create 7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_shockley, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, name=rhceph, release=1764794109, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:45:47 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:47 localhost ceph-mgr[278591]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 1 04:45:47 localhost systemd[1]: Started libpod-conmon-7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7.scope.
Feb 1 04:45:47 localhost ceph-mgr[278591]: [progress INFO root] Writing back 50 completed events
Feb 1 04:45:47 localhost systemd[1]: var-lib-containers-storage-overlay-72092f9068297a57d06d43379689b7c01502d4a4b087c74f52021adffad07407-merged.mount: Deactivated successfully.
Feb 1 04:45:47 localhost podman[294765]: 2026-02-01 09:45:47.385405381 +0000 UTC m=+0.044281922 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:45:47 localhost systemd[1]: Started libcrun container.
Feb 1 04:45:47 localhost podman[294765]: 2026-02-01 09:45:47.502320732 +0000 UTC m=+0.161197273 container init 7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_shockley, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:45:47 localhost podman[294765]: 2026-02-01 09:45:47.512651489 +0000 UTC m=+0.171527980 container start 7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_shockley, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1764794109, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=)
Feb 1 04:45:47 localhost podman[294765]: 2026-02-01 09:45:47.512849195 +0000 UTC m=+0.171725746 container attach 7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_shockley, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, build-date=2025-12-08T17:28:53Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 04:45:47 localhost confident_shockley[294780]: 167 167
Feb 1 04:45:47 localhost systemd[1]: libpod-7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7.scope: Deactivated successfully.
Feb 1 04:45:47 localhost podman[294765]: 2026-02-01 09:45:47.516623061 +0000 UTC m=+0.175499552 container died 7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_shockley, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, release=1764794109, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, ceph=True)
Feb 1 04:45:47 localhost systemd[1]: var-lib-containers-storage-overlay-a97a47ed1854d693644fa3b08484a3187296454c316ef9cfd43af9a2e7674f2f-merged.mount: Deactivated successfully.
Feb 1 04:45:47 localhost podman[294785]: 2026-02-01 09:45:47.604332734 +0000 UTC m=+0.075475159 container remove 7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_shockley, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, ceph=True, io.buildah.version=1.41.4)
Feb 1 04:45:47 localhost systemd[1]: libpod-conmon-7a9f77dea497762cba53f2db6f6588a8e00035f8e72b85e91cea79b7cdb71cd7.scope: Deactivated successfully.
Feb 1 04:45:47 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:47 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:47 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:47 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:47 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:45:47 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:45:47 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.813 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:45:47 localhost ceph-mgr[278591]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:45:47 localhost ceph-mgr[278591]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.884 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.885 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.886 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:45:47 localhost nova_compute[274651]: 2026-02-01 09:45:47.886 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 1 04:45:48 localhost nova_compute[274651]: 2026-02-01 09:45:48.032 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 1 04:45:48 localhost nova_compute[274651]: 2026-02-01 09:45:48.033 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:45:48 localhost nova_compute[274651]: 2026-02-01 09:45:48.034 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 1 04:45:48 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e87 e87: 6 total, 6 up, 6 in
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr handle_mgr_map I was active but no longer am
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn e: '/usr/bin/ceph-mgr'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 0: '/usr/bin/ceph-mgr'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 1: '-n'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 2: 'mgr.np0005604212.oynhpm'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 3: '-f'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 4: '--setuser'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 5: 'ceph'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 6: '--setgroup'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 7: 'ceph'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 8: '--default-log-to-file=false'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 9: '--default-log-to-journald=true'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn 10: '--default-log-to-stderr=false'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr respawn exe_path /proc/self/exe
Feb 1 04:45:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:48.363+0000 7f91f254c640 -1 mgr handle_mgr_map I was active but no longer am
Feb 1 04:45:48 localhost systemd[1]: session-70.scope: Deactivated successfully.
Feb 1 04:45:48 localhost systemd[1]: session-70.scope: Consumed 12.098s CPU time.
Feb 1 04:45:48 localhost systemd-logind[759]: Session 70 logged out. Waiting for processes to exit.
Feb 1 04:45:48 localhost systemd-logind[759]: Removed session 70.
Feb 1 04:45:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: ignoring --setuser ceph since I am not root
Feb 1 04:45:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: ignoring --setgroup ceph since I am not root
Feb 1 04:45:48 localhost ceph-mgr[278591]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Feb 1 04:45:48 localhost ceph-mgr[278591]: pidfile_write: ignore empty --pid-file
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'alerts'
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'balancer'
Feb 1 04:45:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:48.563+0000 7f7f75797140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 1 04:45:48 localhost ceph-mgr[278591]: mgr[py] Loading python module 'cephadm'
Feb 1 04:45:48 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:48.638+0000 7f7f75797140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 1 04:45:48 localhost nova_compute[274651]: 2026-02-01 09:45:48.693 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:45:48 localhost sshd[294832]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm'
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 1 04:45:48 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:45:48 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/3579560302' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:45:48 localhost ceph-mon[286721]: Activating manager daemon np0005604215.uhhqtv
Feb 1 04:45:48 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/3579560302' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 1 04:45:48 localhost ceph-mon[286721]: Manager daemon np0005604215.uhhqtv is now available
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 1 04:45:48 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 1 04:45:48 localhost systemd-logind[759]: New session 71 of user ceph-admin.
Feb 1 04:45:48 localhost systemd[1]: Started Session 71 of User ceph-admin.
Feb 1 04:45:49 localhost ceph-mgr[278591]: mgr[py] Loading python module 'crash'
Feb 1 04:45:49 localhost nova_compute[274651]: 2026-02-01 09:45:49.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:45:49 localhost ceph-mgr[278591]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 1 04:45:49 localhost ceph-mgr[278591]: mgr[py] Loading python module 'dashboard'
Feb 1 04:45:49 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:49.303+0000 7f7f75797140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 1 04:45:49 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:49 localhost ceph-mgr[278591]: mgr[py] Loading python module 'devicehealth'
Feb 1 04:45:49 localhost ceph-mgr[278591]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 1 04:45:49 localhost ceph-mgr[278591]: mgr[py] Loading python module 'diskprediction_local'
Feb 1 04:45:49 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:49.871+0000 7f7f75797140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 1 04:45:49 localhost systemd[1]: tmp-crun.ogY5bJ.mount: Deactivated successfully.
Feb 1 04:45:49 localhost podman[294949]: 2026-02-01 09:45:49.925590308 +0000 UTC m=+0.102318094 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, release=1764794109, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: from numpy import show_config as show_numpy_config
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:50.019+0000 7f7f75797140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'influx'
Feb 1 04:45:50 localhost podman[294949]: 2026-02-01 09:45:50.033919235 +0000 UTC m=+0.210647071 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:50.081+0000 7f7f75797140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'insights'
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'iostat'
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'k8sevents'
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:50.209+0000 7f7f75797140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'localpool'
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'mds_autoscaler'
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'mirroring'
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'nfs'
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 1 04:45:50 localhost ceph-mgr[278591]: mgr[py] Loading python module 'orchestrator'
Feb 1 04:45:50 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:50.984+0000 7f7f75797140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'osd_perf_query'
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.144+0000 7f7f75797140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.211+0000 7f7f75797140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'osd_support'
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.267+0000 7f7f75797140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'pg_autoscaler'
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'progress'
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.335+0000 7f7f75797140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'prometheus'
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.399+0000 7f7f75797140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:51 localhost ceph-mon[286721]: [01/Feb/2026:09:45:49] ENGINE Bus STARTING
Feb 1 04:45:51 localhost ceph-mon[286721]: [01/Feb/2026:09:45:49] ENGINE Serving on https://172.18.0.108:7150
Feb 1 04:45:51 localhost ceph-mon[286721]: [01/Feb/2026:09:45:49] ENGINE Client ('172.18.0.108', 56488) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:45:51 localhost ceph-mon[286721]: [01/Feb/2026:09:45:49] ENGINE Serving on http://172.18.0.108:8765
Feb 1 04:45:51 localhost ceph-mon[286721]: [01/Feb/2026:09:45:49] ENGINE Bus STARTED
Feb 1 04:45:51 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 1 04:45:51 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 1 04:45:51 localhost ceph-mon[286721]: Cluster is now healthy
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:45:51 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 1 04:45:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.709+0000 7f7f75797140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'rbd_support'
Feb 1 04:45:51 localhost systemd[1]: tmp-crun.wiQZ6E.mount: Deactivated successfully.
Feb 1 04:45:51 localhost podman[295147]: 2026-02-01 09:45:51.732236756 +0000 UTC m=+0.084689232 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 1 04:45:51 localhost podman[295147]: 2026-02-01 09:45:51.762252578 +0000 UTC m=+0.114705054 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:45:51 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'restful'
Feb 1 04:45:51 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:51.795+0000 7f7f75797140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 1 04:45:51 localhost ceph-mgr[278591]: mgr[py] Loading python module 'rgw'
Feb 1 04:45:52 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:52.132+0000 7f7f75797140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'rook'
Feb 1 04:45:52 localhost nova_compute[274651]: 2026-02-01 09:45:52.342 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:45:52 localhost nova_compute[274651]: 2026-02-01 09:45:52.344 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:45:52 localhost nova_compute[274651]: 2026-02-01 09:45:52.344 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:45:52 localhost nova_compute[274651]: 2026-02-01 09:45:52.344 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:45:52 localhost nova_compute[274651]: 2026-02-01
09:45:52.345 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:52 localhost nova_compute[274651]: 2026-02-01 09:45:52.346 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' 
entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:45:52 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Module rook has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:52.579+0000 7f7f75797140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'selftest' Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'snap_schedule' Feb 1 04:45:52 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:52.641+0000 7f7f75797140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'stats' Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'status' Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Module status has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'telegraf' Feb 1 04:45:52 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:52.843+0000 7f7f75797140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:52.908+0000 7f7f75797140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 1 04:45:52 localhost ceph-mgr[278591]: mgr[py] Loading python module 'telemetry' Feb 1 04:45:52 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election Feb 1 04:45:52 localhost ceph-mon[286721]: paxos.2).electionLogic(52) init, last seen epoch 52 Feb 1 04:45:52 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:45:52 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:45:53 localhost ceph-mgr[278591]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-mgr[278591]: mgr[py] Loading python module 'test_orchestrator' Feb 1 04:45:53 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:53.049+0000 7f7f75797140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-mgr[278591]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-mgr[278591]: mgr[py] Loading python module 'volumes' Feb 1 04:45:53 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:53.223+0000 7f7f75797140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost 
ceph-mgr[278591]: mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:53.439+0000 7f7f75797140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-mgr[278591]: mgr[py] Loading python module 'zabbix' Feb 1 04:45:53 localhost ceph-mgr[278591]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604212-oynhpm[278587]: 2026-02-01T09:45:53.501+0000 7f7f75797140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 1 04:45:53 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea79415600 mon_map magic: 0 from mon.2 v2:172.18.0.103:3300/0 Feb 1 04:45:53 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3247174031 Feb 1 04:45:53 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea794151e0 mon_map magic: 0 from mon.2 v2:172.18.0.103:3300/0 Feb 1 04:45:53 localhost podman[236886]: time="2026-02-01T09:45:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:45:53 localhost podman[236886]: @ - - [01/Feb/2026:09:45:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:45:53 localhost podman[236886]: @ - - [01/Feb/2026:09:45:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18801 "" "Go-http-client/1.1" Feb 1 04:45:54 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3247174031 Feb 1 04:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:45:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:45:54 localhost podman[295851]: 2026-02-01 09:45:54.732948308 +0000 UTC m=+0.087705574 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller) Feb 1 04:45:54 localhost podman[295850]: 2026-02-01 09:45:54.798377447 +0000 UTC m=+0.155606279 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:45:54 localhost podman[295851]: 2026-02-01 09:45:54.832382943 +0000 UTC m=+0.187140149 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Feb 1 04:45:54 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:45:54 localhost podman[295850]: 2026-02-01 09:45:54.887518625 +0000 UTC m=+0.244747497 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:45:54 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:45:55 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3247174031 Feb 1 04:45:56 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3247174031 Feb 1 04:45:57 localhost nova_compute[274651]: 2026-02-01 09:45:57.347 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:57 localhost nova_compute[274651]: 2026-02-01 09:45:57.349 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:45:57 localhost nova_compute[274651]: 2026-02-01 09:45:57.349 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:45:57 localhost nova_compute[274651]: 2026-02-01 09:45:57.349 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:57 localhost nova_compute[274651]: 2026-02-01 09:45:57.377 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:45:57 localhost nova_compute[274651]: 2026-02-01 09:45:57.378 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:45:57 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.108:6810/3247174031 Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating 
np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604212 in quorum (ranks 0,1,2) Feb 1 04:45:58 localhost ceph-mon[286721]: Health check failed: 1/4 mons down, quorum np0005604211,np0005604215,np0005604212 (MON_DOWN) Feb 1 04:45:58 localhost ceph-mon[286721]: Health detail: HEALTH_WARN 1/4 mons down, quorum np0005604211,np0005604215,np0005604212 Feb 1 04:45:58 localhost ceph-mon[286721]: [WRN] MON_DOWN: 1/4 mons down, quorum np0005604211,np0005604215,np0005604212 Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604213 (rank 3) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum) Feb 1 04:45:58 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:58 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604212@2(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:45:58 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:45:59 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election Feb 1 04:45:59 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election Feb 1 04:45:59 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election Feb 1 04:45:59 localhost ceph-mon[286721]: mon.np0005604211 calling monitor election Feb 1 04:45:59 localhost ceph-mon[286721]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604212,np0005604213 in quorum (ranks 0,1,2,3) Feb 1 04:45:59 localhost ceph-mon[286721]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005604211,np0005604215,np0005604212) Feb 1 04:45:59 localhost ceph-mon[286721]: Cluster is now healthy Feb 1 04:45:59 localhost ceph-mon[286721]: overall HEALTH_OK Feb 1 04:45:59 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:45:59 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:45:59 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:00 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)... 
Feb 1 04:46:00 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain Feb 1 04:46:00 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:00 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:00 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:00 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:01 localhost podman[295971]: Feb 1 04:46:01 localhost podman[295971]: 2026-02-01 09:46:01.050551441 +0000 UTC m=+0.078615705 container create b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_morse, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, architecture=x86_64, version=7) Feb 1 04:46:01 localhost systemd[1]: Started libpod-conmon-b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059.scope. Feb 1 04:46:01 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:46:01 localhost podman[295971]: 2026-02-01 09:46:01.018197708 +0000 UTC m=+0.046261982 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:01 localhost systemd[1]: Started libcrun container. 
Feb 1 04:46:01 localhost podman[295971]: 2026-02-01 09:46:01.154864205 +0000 UTC m=+0.182928469 container init b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_morse, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 04:46:01 localhost podman[295971]: 2026-02-01 09:46:01.167585306 +0000 UTC m=+0.195649580 container start b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_morse, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:46:01 localhost podman[295971]: 2026-02-01 09:46:01.167908376 +0000 UTC m=+0.195972680 container attach b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_morse, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, 
version=7, name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1764794109) Feb 1 04:46:01 localhost angry_morse[295986]: 167 167 Feb 1 04:46:01 localhost systemd[1]: libpod-b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059.scope: Deactivated successfully. Feb 1 04:46:01 localhost podman[295971]: 2026-02-01 09:46:01.173871299 +0000 UTC m=+0.201935583 container died b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_morse, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:46:01 localhost podman[295991]: 2026-02-01 09:46:01.276714808 +0000 UTC m=+0.094362400 container remove b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_morse, build-date=2025-12-08T17:28:53Z, distribution-scope=public, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, ceph=True, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_BRANCH=main, RELEASE=main, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Feb 1 04:46:01 localhost systemd[1]: libpod-conmon-b40732c5f16fa59683b6b7376951ec8ac2d50e80e1bbb4ccac9606158a921059.scope: Deactivated successfully. Feb 1 04:46:01 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)... Feb 1 04:46:01 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:01 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)... Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:01 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain Feb 1 04:46:01 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 1 04:46:01 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:01 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 1 04:46:01 localhost openstack_network_exporter[239441]: ERROR 09:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:46:01 localhost openstack_network_exporter[239441]: Feb 1 04:46:01 localhost openstack_network_exporter[239441]: ERROR 09:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:46:01 localhost openstack_network_exporter[239441]: Feb 1 04:46:02 localhost podman[296059]: Feb 1 04:46:02 localhost podman[296059]: 2026-02-01 09:46:02.029440367 +0000 UTC m=+0.071193039 container create 121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_panini, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, ceph=True, build-date=2025-12-08T17:28:53Z, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7) Feb 1 04:46:02 localhost systemd[1]: var-lib-containers-storage-overlay-5deb992158c1ff4dcbb48a72ecf1dedd1b7d91c8737451c8dc164e47bb384311-merged.mount: Deactivated successfully. Feb 1 04:46:02 localhost systemd[1]: Started libpod-conmon-121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712.scope. Feb 1 04:46:02 localhost systemd[1]: Started libcrun container. Feb 1 04:46:02 localhost podman[296059]: 2026-02-01 09:46:01.995317478 +0000 UTC m=+0.037070190 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:02 localhost podman[296059]: 2026-02-01 09:46:02.103615084 +0000 UTC m=+0.145367756 container init 121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_panini, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , RELEASE=main, io.buildah.version=1.41.4) Feb 1 04:46:02 localhost podman[296059]: 2026-02-01 09:46:02.115035375 +0000 UTC m=+0.156788057 container start 121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_panini, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, name=rhceph, release=1764794109, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, 
io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:46:02 localhost podman[296059]: 2026-02-01 09:46:02.115513489 +0000 UTC m=+0.157266161 container attach 121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_panini, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:46:02 localhost condescending_panini[296074]: 167 167 Feb 1 04:46:02 localhost systemd[1]: libpod-121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712.scope: Deactivated successfully. Feb 1 04:46:02 localhost podman[296059]: 2026-02-01 09:46:02.121825894 +0000 UTC m=+0.163578576 container died 121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_panini, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, release=1764794109, ceph=True) Feb 1 04:46:02 localhost podman[296079]: 2026-02-01 09:46:02.21938672 +0000 UTC m=+0.088056076 container remove 121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=condescending_panini, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, vcs-type=git, 
io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, release=1764794109, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, distribution-scope=public) Feb 1 04:46:02 localhost systemd[1]: libpod-conmon-121b3c4f268d3dfe72e8dd2518a5d95f2995227f7190283e77ff928915373712.scope: Deactivated successfully. Feb 1 04:46:02 localhost nova_compute[274651]: 2026-02-01 09:46:02.380 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:02 localhost nova_compute[274651]: 2026-02-01 09:46:02.382 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:02 localhost nova_compute[274651]: 2026-02-01 09:46:02.383 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:46:02 localhost nova_compute[274651]: 2026-02-01 09:46:02.383 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:02 localhost nova_compute[274651]: 2026-02-01 09:46:02.418 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:02 localhost nova_compute[274651]: 2026-02-01 09:46:02.419 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:02 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)... Feb 1 04:46:02 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain Feb 1 04:46:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:46:02 localhost podman[296139]: 2026-02-01 09:46:02.992794914 +0000 UTC m=+0.095129183 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1769056855, distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 1 04:46:03 localhost podman[296139]: 2026-02-01 09:46:03.013394786 +0000 UTC m=+0.115729095 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.) Feb 1 04:46:03 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:46:03 localhost systemd[1]: var-lib-containers-storage-overlay-10298f605adcc579e74aaa740181fd77d1f45f5bddca80ea08a35cbc2de5e4a4-merged.mount: Deactivated successfully. Feb 1 04:46:03 localhost podman[296172]: Feb 1 04:46:03 localhost podman[296172]: 2026-02-01 09:46:03.083281183 +0000 UTC m=+0.087123567 container create e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tu, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7) Feb 1 04:46:03 localhost systemd[1]: Started libpod-conmon-e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e.scope. 
Feb 1 04:46:03 localhost podman[296172]: 2026-02-01 09:46:03.048287728 +0000 UTC m=+0.052130102 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:03 localhost systemd[1]: Started libcrun container. Feb 1 04:46:03 localhost podman[296172]: 2026-02-01 09:46:03.170528233 +0000 UTC m=+0.174370577 container init e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tu, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1764794109, version=7, GIT_CLEAN=True) Feb 1 04:46:03 localhost podman[296172]: 2026-02-01 09:46:03.181407397 +0000 UTC m=+0.185249741 container start e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tu, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:46:03 localhost podman[296172]: 2026-02-01 09:46:03.181709346 +0000 UTC m=+0.185551690 container attach e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tu, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, 
url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., release=1764794109, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 1 04:46:03 localhost amazing_tu[296191]: 167 167 Feb 1 04:46:03 localhost systemd[1]: libpod-e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e.scope: Deactivated successfully. Feb 1 04:46:03 localhost podman[296172]: 2026-02-01 09:46:03.184550453 +0000 UTC m=+0.188392767 container died e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tu, version=7, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:46:03 localhost podman[296196]: 2026-02-01 09:46:03.286083651 +0000 UTC m=+0.085670521 container remove e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tu, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured 
and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, ceph=True, com.redhat.component=rhceph-container) Feb 1 04:46:03 localhost systemd[1]: libpod-conmon-e7b4bed1a8b3edb103b4384fdb75693bc04f03463157d4511a8e167a68562e8e.scope: Deactivated successfully. Feb 1 04:46:03 localhost ceph-mon[286721]: Saving service mon spec with placement label:mon Feb 1 04:46:03 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:03 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:03 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:03 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)... Feb 1 04:46:03 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 1 04:46:03 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain Feb 1 04:46:04 localhost systemd[1]: var-lib-containers-storage-overlay-5e8cc42c57fe85d15f8f51c98b4442cd1555aeac2ce2d56f1a0b13d4b5915af1-merged.mount: Deactivated successfully. Feb 1 04:46:04 localhost podman[296271]: Feb 1 04:46:04 localhost podman[296271]: 2026-02-01 09:46:04.147584171 +0000 UTC m=+0.065385080 container create 32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_roentgen, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7) Feb 1 04:46:04 localhost systemd[1]: Started libpod-conmon-32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121.scope. Feb 1 04:46:04 localhost systemd[1]: Started libcrun container. 
Feb 1 04:46:04 localhost podman[296271]: 2026-02-01 09:46:04.205105777 +0000 UTC m=+0.122906716 container init 32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_roentgen, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 04:46:04 localhost systemd[1]: tmp-crun.mPQhw9.mount: Deactivated successfully. Feb 1 04:46:04 localhost podman[296271]: 2026-02-01 09:46:04.220506701 +0000 UTC m=+0.138307640 container start 32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_roentgen, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, build-date=2025-12-08T17:28:53Z, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public) Feb 1 04:46:04 localhost podman[296271]: 2026-02-01 09:46:04.220891763 +0000 UTC m=+0.138692722 container attach 32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_roentgen, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, RELEASE=main, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1764794109) Feb 1 04:46:04 localhost dazzling_roentgen[296286]: 167 167 Feb 1 04:46:04 localhost systemd[1]: libpod-32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121.scope: Deactivated successfully. Feb 1 04:46:04 localhost podman[296271]: 2026-02-01 09:46:04.223752681 +0000 UTC m=+0.141553660 container died 32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_roentgen, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7) Feb 1 04:46:04 localhost podman[296271]: 2026-02-01 09:46:04.126608367 +0000 UTC m=+0.044409326 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:04 localhost podman[296291]: 2026-02-01 09:46:04.315837549 +0000 UTC m=+0.080307598 container remove 32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_roentgen, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-12-08T17:28:53Z, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, com.redhat.component=rhceph-container, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:46:04 localhost systemd[1]: libpod-conmon-32e9af478af9a54e27eac733ade3be50ee0b578ffa45d9ab082c38c4ca662121.scope: Deactivated successfully. Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:04 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)... Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:04 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:04 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)... Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:04 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:04 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain Feb 1 04:46:04 localhost podman[296361]: Feb 1 04:46:04 localhost podman[296361]: 2026-02-01 09:46:04.985680652 +0000 UTC m=+0.059068775 container create 955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_thompson, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, 
org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109) Feb 1 04:46:05 localhost systemd[1]: Started libpod-conmon-955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8.scope. Feb 1 04:46:05 localhost systemd[1]: Started libcrun container. Feb 1 04:46:05 localhost podman[296361]: 2026-02-01 09:46:05.046063666 +0000 UTC m=+0.119451779 container init 955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_thompson, version=7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1764794109, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:46:05 localhost podman[296361]: 2026-02-01 09:46:04.955160094 +0000 UTC m=+0.028548257 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:05 localhost podman[296361]: 2026-02-01 09:46:05.054776664 +0000 UTC m=+0.128164787 container start 955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_thompson, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, RELEASE=main, version=7, name=rhceph, distribution-scope=public, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Feb 1 04:46:05 localhost podman[296361]: 2026-02-01 09:46:05.055130985 +0000 UTC m=+0.128519158 container attach 
955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_thompson, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, build-date=2025-12-08T17:28:53Z, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, io.openshift.expose-services=, version=7) Feb 1 04:46:05 localhost agitated_thompson[296376]: 167 167 Feb 1 04:46:05 localhost podman[296361]: 2026-02-01 09:46:05.05889714 +0000 UTC m=+0.132285323 container died 955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_thompson, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=) Feb 1 04:46:05 localhost systemd[1]: var-lib-containers-storage-overlay-4bf5f3c0aeda0151529734c1789fb66104cdbf008724d29e9670e0a6e981ba10-merged.mount: Deactivated successfully. Feb 1 04:46:05 localhost systemd[1]: libpod-955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8.scope: Deactivated successfully. Feb 1 04:46:05 localhost systemd[1]: var-lib-containers-storage-overlay-b9a49ca3c3792e0854a45d47d9b3f2288dce66b5747dfb44c061493bbcba34e6-merged.mount: Deactivated successfully. 
Feb 1 04:46:05 localhost podman[296381]: 2026-02-01 09:46:05.171623142 +0000 UTC m=+0.099413434 container remove 955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_thompson, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-type=git, GIT_CLEAN=True) Feb 1 04:46:05 localhost systemd[1]: libpod-conmon-955110a1064415ebe5d4ca21ae6ecf5e29e207d467cda04d8f19d9bdbae53de8.scope: Deactivated successfully. Feb 1 04:46:05 localhost podman[296450]: Feb 1 04:46:05 localhost podman[296450]: 2026-02-01 09:46:05.869956661 +0000 UTC m=+0.082290889 container create fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_newton, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:46:05 localhost systemd[1]: Started libpod-conmon-fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07.scope. Feb 1 04:46:05 localhost systemd[1]: Started libcrun container. 
Feb 1 04:46:05 localhost podman[296450]: 2026-02-01 09:46:05.92689587 +0000 UTC m=+0.139230078 container init fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_newton, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:46:05 localhost podman[296450]: 2026-02-01 09:46:05.936638748 +0000 UTC m=+0.148972926 container start fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_newton, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, release=1764794109, io.buildah.version=1.41.4, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-12-08T17:28:53Z, distribution-scope=public) Feb 1 04:46:05 localhost podman[296450]: 2026-02-01 09:46:05.83672105 +0000 UTC m=+0.049055308 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:05 localhost podman[296450]: 2026-02-01 09:46:05.936932427 +0000 UTC m=+0.149266665 container attach fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_newton, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=7, 
maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, ceph=True) Feb 1 04:46:05 localhost quizzical_newton[296465]: 167 167 Feb 1 04:46:05 localhost systemd[1]: libpod-fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07.scope: Deactivated successfully. Feb 1 04:46:05 localhost podman[296450]: 2026-02-01 09:46:05.939978722 +0000 UTC m=+0.152312930 container died fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_newton, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:46:06 localhost podman[296470]: 2026-02-01 09:46:06.031578734 +0000 UTC m=+0.082603107 container remove fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_newton, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=) Feb 1 04:46:06 localhost systemd[1]: libpod-conmon-fe2eb8ca5aac2cc9e2776a0517d1f065dd05134262cda1ab7bdb81f780c65e07.scope: Deactivated successfully. Feb 1 04:46:06 localhost systemd[1]: var-lib-containers-storage-overlay-d5368cc1e74df812e413a5a13e2a3102f005b915ebb9bf7b04be3d2d0d4bf41c-merged.mount: Deactivated successfully. Feb 1 04:46:06 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:06 localhost ceph-mon[286721]: Reconfiguring mon.np0005604212 (monmap changed)... Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:46:06 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:06 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:07 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)... 
Feb 1 04:46:07 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain Feb 1 04:46:07 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:07 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:07 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 1 04:46:07 localhost nova_compute[274651]: 2026-02-01 09:46:07.420 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:07 localhost nova_compute[274651]: 2026-02-01 09:46:07.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:07 localhost nova_compute[274651]: 2026-02-01 09:46:07.424 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:46:07 localhost nova_compute[274651]: 2026-02-01 09:46:07.424 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:07 localhost nova_compute[274651]: 2026-02-01 09:46:07.441 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:07 localhost nova_compute[274651]: 2026-02-01 09:46:07.442 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:08 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)... Feb 1 04:46:08 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:46:08 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:08 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:08 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:08 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:08 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 1 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
Feb 1 04:46:08 localhost podman[296487]: 2026-02-01 09:46:08.73905902 +0000 UTC m=+0.090590474 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 1 04:46:08 localhost podman[296487]: 2026-02-01 09:46:08.753431891 +0000 UTC m=+0.104963355 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:46:08 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:46:09 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)... Feb 1 04:46:09 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:46:09 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:09 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:09 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:09 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:09 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:09 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:10 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e12 handle_command mon_command({"prefix": "mgr stat", "format": "json"} v 0) Feb 1 04:46:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2918964831' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch Feb 1 04:46:10 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... Feb 1 04:46:10 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:46:10 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:10 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:10 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:10 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:11 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:46:11 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... 
Feb 1 04:46:11 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:46:11 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:11 localhost ceph-mon[286721]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:46:11 localhost ceph-mon[286721]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:46:11 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e88 e88: 6 total, 6 up, 6 in Feb 1 04:46:12 localhost systemd[1]: session-71.scope: Deactivated successfully. Feb 1 04:46:12 localhost systemd[1]: session-71.scope: Consumed 9.502s CPU time. Feb 1 04:46:12 localhost systemd-logind[759]: Session 71 logged out. Waiting for processes to exit. Feb 1 04:46:12 localhost systemd-logind[759]: Removed session 71. Feb 1 04:46:12 localhost sshd[296506]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:46:12 localhost nova_compute[274651]: 2026-02-01 09:46:12.442 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:12 localhost nova_compute[274651]: 2026-02-01 09:46:12.443 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:12 localhost nova_compute[274651]: 2026-02-01 09:46:12.444 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:46:12 localhost nova_compute[274651]: 2026-02-01 09:46:12.444 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:12 localhost nova_compute[274651]: 2026-02-01 09:46:12.444 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:12 localhost nova_compute[274651]: 2026-02-01 09:46:12.446 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:12 localhost ceph-mon[286721]: Reconfiguring mon.np0005604213 (monmap changed)... Feb 1 04:46:12 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain Feb 1 04:46:12 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/1843935985' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:46:12 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:46:12 localhost ceph-mon[286721]: Activating manager daemon np0005604211.cuflqz Feb 1 04:46:12 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 1 04:46:12 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. 
Feb 1 04:46:12 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:12.647186) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:46:12 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Feb 1 04:46:12 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172647257, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1766, "num_deletes": 253, "total_data_size": 7165358, "memory_usage": 7353744, "flush_reason": "Manual Compaction"} Feb 1 04:46:12 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Feb 1 04:46:12 localhost ceph-mon[286721]: Manager daemon np0005604211.cuflqz is now available Feb 1 04:46:12 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch Feb 1 04:46:12 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch Feb 1 04:46:12 localhost systemd-logind[759]: New session 72 of user ceph-admin. Feb 1 04:46:12 localhost systemd[1]: Started Session 72 of User ceph-admin. Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939173139397, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4271920, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16147, "largest_seqno": 17908, "table_properties": {"data_size": 4264410, "index_size": 4207, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 19590, "raw_average_key_size": 21, "raw_value_size": 4247752, "raw_average_value_size": 4740, "num_data_blocks": 177, "num_entries": 896, "num_filter_entries": 896, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939137, "oldest_key_time": 1769939137, "file_creation_time": 1769939172, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 492267 microseconds, and 8500 cpu microseconds. Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.139456) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4271920 bytes OK
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.139484) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.141103) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.141130) EVENT_LOG_v1 {"time_micros": 1769939173141123, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.141153) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 7156247, prev total WAL file size 7445306, number of live WAL files 2.
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.142590) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323630' seq:72057594037927935, type:22 .. '6B760031353132' seq:0, type:0; will stop at (end)
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4171KB)], [24(18MB)]
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939173142615, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 23715341, "oldest_snapshot_seqno": -1}
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11111 keys, 22775173 bytes, temperature: kUnknown
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939173232267, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 22775173, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22711166, "index_size": 35106, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297350, "raw_average_key_size": 26, "raw_value_size": 22520986, "raw_average_value_size": 2026, "num_data_blocks": 1333, "num_entries": 11111, "num_filter_entries": 11111, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939173, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.232569) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 22775173 bytes
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.234367) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 264.2 rd, 253.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 18.5 +0.0 blob) out(21.7 +0.0 blob), read-write-amplify(10.9) write-amplify(5.3) OK, records in: 11623, records dropped: 512 output_compression: NoCompression
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.234384) EVENT_LOG_v1 {"time_micros": 1769939173234377, "job": 12, "event": "compaction_finished", "compaction_time_micros": 89761, "compaction_time_cpu_micros": 42086, "output_level": 6, "num_output_files": 1, "total_output_size": 22775173, "num_input_records": 11623, "num_output_records": 11111, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939173234815, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939173236338, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.142512) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.236369) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.236373) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.236374) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.236375) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:13.236377) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:13 localhost systemd[1]: tmp-crun.t0OXAI.mount: Deactivated successfully.
Feb 1 04:46:13 localhost podman[296618]: 2026-02-01 09:46:13.565559216 +0000 UTC m=+0.064346847 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, distribution-scope=public, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-12-08T17:28:53Z, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:46:13 localhost podman[296618]: 2026-02-01 09:46:13.659361547 +0000 UTC m=+0.158149198 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, release=1764794109, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7)
Feb 1 04:46:14 localhost ceph-mon[286721]: [01/Feb/2026:09:46:13] ENGINE Bus STARTING
Feb 1 04:46:14 localhost ceph-mon[286721]: [01/Feb/2026:09:46:13] ENGINE Serving on http://172.18.0.105:8765
Feb 1 04:46:14 localhost ceph-mon[286721]: [01/Feb/2026:09:46:13] ENGINE Serving on https://172.18.0.105:7150
Feb 1 04:46:14 localhost ceph-mon[286721]: [01/Feb/2026:09:46:13] ENGINE Bus STARTED
Feb 1 04:46:14 localhost ceph-mon[286721]: [01/Feb/2026:09:46:13] ENGINE Client ('172.18.0.105', 33394) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.284799) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174284880, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 297, "num_deletes": 251, "total_data_size": 935403, "memory_usage": 953848, "flush_reason": "Manual Compaction"}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174289197, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 622124, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17913, "largest_seqno": 18205, "table_properties": {"data_size": 620159, "index_size": 204, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5516, "raw_average_key_size": 19, "raw_value_size": 616084, "raw_average_value_size": 2200, "num_data_blocks": 10, "num_entries": 280, "num_filter_entries": 280, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939173, "oldest_key_time": 1769939173, "file_creation_time": 1769939174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 4413 microseconds, and 1537 cpu microseconds.
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
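The JOB 12 summary above reports read-write-amplify(10.9) and write-amplify(5.3); both follow directly from byte counts logged in the same span: the L0 input #26 is 4,271,920 bytes, total input_data_size is 23,715,341 bytes, and the L6 output #27 is 22,775,173 bytes. (The recurring "max score 0.25" is consistent with one L0 file against a level0_file_num_compaction_trigger of 4, RocksDB's default; the option itself is not logged here, so that reading is an assumption.) A short check of the amplification arithmetic (values transcribed from the EVENT_LOG_v1 entries above):

    # Reproduce RocksDB's amplification figures for JOB 12.
    l0_in = 4_271_920       # input 000026.sst (level 0)
    total_in = 23_715_341   # "input_data_size": L0 + L6 inputs
    out = 22_775_173        # output 000027.sst (level 6)

    write_amp = out / l0_in             # 5.33  -> logged as 5.3
    rw_amp = (total_in + out) / l0_in   # 10.88 -> logged as 10.9
    print(write_amp, rw_amp)

The same arithmetic applied to JOB 14 below (input_data_size 23,397,297 bytes, L0 input 622,124 bytes, output 19,748,944 bytes) reproduces its logged write-amplify(31.7) and read-write-amplify(69.4).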
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.289226) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 622124 bytes OK
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.289242) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.290945) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.290957) EVENT_LOG_v1 {"time_micros": 1769939174290953, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.290974) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 933193, prev total WAL file size 933193, number of live WAL files 2.
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.291486) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(607KB)], [27(21MB)]
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174291625, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 23397297, "oldest_snapshot_seqno": -1}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 10876 keys, 19748944 bytes, temperature: kUnknown
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174409618, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 19748944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19688950, "index_size": 31733, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27205, "raw_key_size": 292945, "raw_average_key_size": 26, "raw_value_size": 19505116, "raw_average_value_size": 1793, "num_data_blocks": 1188, "num_entries": 10876, "num_filter_entries": 10876, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.409940) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 19748944 bytes
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.412592) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 198.1 rd, 167.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 21.7 +0.0 blob) out(18.8 +0.0 blob), read-write-amplify(69.4) write-amplify(31.7) OK, records in: 11391, records dropped: 515 output_compression: NoCompression
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.412622) EVENT_LOG_v1 {"time_micros": 1769939174412609, "job": 14, "event": "compaction_finished", "compaction_time_micros": 118090, "compaction_time_cpu_micros": 50407, "output_level": 6, "num_output_files": 1, "total_output_size": 19748944, "num_input_records": 11391, "num_output_records": 10876, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174413005, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174415861, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.291430) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.415895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.415900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.415903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.415906) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:14 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:46:14.415909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:46:15 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 1 04:46:15 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 1 04:46:15 localhost ceph-mon[286721]: Cluster is now healthy
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:15 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:46:16 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:46:16 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 1 04:46:16 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:46:16 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:46:16 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:16 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:16 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:16 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
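The repeated osd_memory_target failures at 04:46:16 are a straight units story, visible from the logged numbers alone: the autotuner computed per-OSD targets of 877,246,668 and 877,243,801 bytes (the "836.6M" in the Adjusting lines is that value in MiB), while the enforced minimum of 939,524,096 bytes is exactly 896 MiB, so every set attempt was rejected. The conversions, as a small Python check (constants copied from the log entries above):

    MIB = 1024 * 1024

    for target in (877_246_668, 877_243_801):  # values cephadm tried to set
        print(f"{target / MIB:.1f} MiB")       # both print 836.6
    print(939_524_096 / MIB)                   # 896.0, the floor in the error
    assert 877_246_668 < 939_524_096           # hence "below minimum" in the log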
Feb 1 04:46:17 localhost podman[297109]: 2026-02-01 09:46:17.193704498 +0000 UTC m=+0.087256540 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:46:17 localhost podman[297109]: 2026-02-01 09:46:17.205348136 +0000 UTC m=+0.098900178 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:46:17 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:46:17 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:17 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:17 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:17 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:17 localhost nova_compute[274651]: 2026-02-01 09:46:17.445 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:46:18 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:19 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:46:20 localhost ceph-mon[286721]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 1 04:46:20 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 1 04:46:20 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:46:20 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:46:20 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:20 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:20 localhost ceph-mon[286721]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 1 04:46:20 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:46:20 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 1 04:46:21 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:46:22 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:22 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:22 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 1 04:46:22 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:46:22 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 1 04:46:22 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:22 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:22 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:46:22 localhost nova_compute[274651]: 2026-02-01 09:46:22.449 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:22 localhost nova_compute[274651]: 2026-02-01 09:46:22.450 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:22 localhost nova_compute[274651]: 2026-02-01 09:46:22.451 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:46:22 localhost nova_compute[274651]: 2026-02-01 09:46:22.451 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:46:22 localhost nova_compute[274651]: 2026-02-01 09:46:22.473 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:22 localhost nova_compute[274651]: 2026-02-01 09:46:22.474 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:46:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:46:22 localhost podman[297540]: 2026-02-01 09:46:22.727449816 +0000 UTC m=+0.085825397 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 1 04:46:22 localhost podman[297540]: 2026-02-01 09:46:22.737832885 +0000 UTC m=+0.096208496 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:46:22 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:46:23 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)...
Feb 1 04:46:23 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 1 04:46:23 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:23 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:23 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:23 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:23 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:23 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:46:23 localhost podman[236886]: time="2026-02-01T09:46:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:46:23 localhost podman[236886]: @ - - [01/Feb/2026:09:46:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:46:23 localhost podman[236886]: @ - - [01/Feb/2026:09:46:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18794 "" "Go-http-client/1.1"
Feb 1 04:46:24 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)...
Feb 1 04:46:24 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 1 04:46:24 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:24 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:24 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:24 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:24 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:46:25 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 1 04:46:25 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 1 04:46:25 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:25 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:25 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:46:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:46:25 localhost podman[297557]: 2026-02-01 09:46:25.724261908 +0000 UTC m=+0.081228406 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 04:46:25 localhost systemd[1]: tmp-crun.Uycnua.mount: Deactivated successfully.
Feb 1 04:46:25 localhost podman[297558]: 2026-02-01 09:46:25.793770733 +0000 UTC m=+0.147089649 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 1 04:46:25 localhost podman[297557]: 2026-02-01 09:46:25.857091517 +0000 UTC m=+0.214058005 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 04:46:25 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:46:25 localhost podman[297558]: 2026-02-01 09:46:25.875373849 +0000 UTC m=+0.228692795 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:46:25 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:46:26 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 1 04:46:26 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 1 04:46:26 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:26 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:26 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:46:26 localhost ceph-mon[286721]: mon.np0005604212@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:46:27 localhost ceph-mon[286721]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 1 04:46:27 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 1 04:46:27 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:27 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:27 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:46:27 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:27 localhost nova_compute[274651]: 2026-02-01 09:46:27.474 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:27 localhost nova_compute[274651]: 2026-02-01 09:46:27.479 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:27 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea794151e0 mon_map magic: 0 from mon.2 v2:172.18.0.103:3300/0
Feb 1 04:46:27 localhost ceph-mon[286721]: mon.np0005604212@2(peon) e13 my rank is now 1 (was 2)
Feb 1 04:46:27 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 1 04:46:27 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 1 04:46:27 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea81750000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 1 04:46:27 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election
Feb 1 04:46:27 localhost ceph-mon[286721]: paxos.1).electionLogic(56) init, last seen epoch 56
Feb 1 04:46:27 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:46:30 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 handle_auth_request failed to assign global_id
Feb 1 04:46:30 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 handle_auth_request failed to assign global_id
Feb 1 04:46:31 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 handle_auth_request failed to assign global_id
Feb 1 04:46:31 localhost openstack_network_exporter[239441]: ERROR 09:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:46:31 localhost openstack_network_exporter[239441]:
Feb 1 04:46:31 localhost openstack_network_exporter[239441]: ERROR 09:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:46:31 localhost openstack_network_exporter[239441]:
Feb 1 04:46:32 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 handle_auth_request failed to assign global_id
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.481 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.512 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.512 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.512 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.513 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.513 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:32 localhost nova_compute[274651]: 2026-02-01 09:46:32.514 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:32 localhost ceph-mon[286721]: paxos.1).electionLogic(57) init, last seen epoch 57, mid-election, bumping
Feb 1 04:46:32 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:46:32 localhost ceph-mon[286721]: mon.np0005604212@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:46:32 localhost ceph-mon[286721]: mon.np0005604212@1(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 1 04:46:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:46:33 localhost podman[297694]: 2026-02-01 09:46:33.201889831 +0000 UTC m=+0.084168286 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z)
Feb 1 04:46:33 localhost podman[297694]: 2026-02-01 09:46:33.211471406 +0000 UTC m=+0.093749881 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1769056855, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=)
Feb 1 04:46:33 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:46:33 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election
Feb 1 04:46:33 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:33 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:33 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:33 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:46:33 localhost ceph-mon[286721]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604215,np0005604213
Feb 1 04:46:33 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 1 04:46:33 localhost ceph-mon[286721]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005604215,np0005604213
Feb 1 04:46:33 localhost ceph-mon[286721]: mon.np0005604212 (rank 1) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Feb 1 04:46:33 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election
Feb 1 04:46:33 localhost ceph-mon[286721]: mon.np0005604215 is new leader, mons np0005604215,np0005604212,np0005604213 in quorum (ranks 0,1,2)
Feb 1 04:46:33 localhost ceph-mon[286721]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604215,np0005604213)
Feb 1 04:46:33 localhost ceph-mon[286721]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 1 04:46:33 localhost ceph-mon[286721]: stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 1 04:46:33 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:33 localhost ceph-mon[286721]: Removed label mon from host np0005604211.localdomain
Feb 1 04:46:35 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:35 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:35 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:35 localhost ceph-mon[286721]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:46:35 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:46:36 localhost ceph-mon[286721]: Removed label mgr from host np0005604211.localdomain
Feb 1 04:46:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 1 04:46:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 1 04:46:36 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:36 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:36 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:46:36 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:46:36 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:36 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:46:36 localhost podman[298032]:
Feb 1 04:46:36 localhost podman[298032]: 2026-02-01 09:46:36.962560424 +0000 UTC m=+0.065008928 container create f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:46:37 localhost systemd[1]: Started libpod-conmon-f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424.scope.
Feb 1 04:46:37 localhost systemd[1]: Started libcrun container.
Feb 1 04:46:37 localhost podman[298032]: 2026-02-01 09:46:36.930255582 +0000 UTC m=+0.032704096 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:46:37 localhost podman[298032]: 2026-02-01 09:46:37.045734508 +0000 UTC m=+0.148182982 container init f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:46:37 localhost podman[298032]: 2026-02-01 09:46:37.057226541 +0000 UTC m=+0.159675005 container start f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, GIT_BRANCH=main, name=rhceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, vcs-type=git, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, version=7, maintainer=Guillaume Abrioux , distribution-scope=public)
Feb 1 04:46:37 localhost podman[298032]: 2026-02-01 09:46:37.057356815 +0000 UTC m=+0.159805299 container attach f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1764794109, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 1 04:46:37 localhost sweet_faraday[298047]: 167 167
Feb 1 04:46:37 localhost podman[298032]: 2026-02-01 09:46:37.061231875 +0000 UTC m=+0.163680339 container died f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, vcs-type=git, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 1 04:46:37 localhost systemd[1]: libpod-f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424.scope: Deactivated successfully.
Feb 1 04:46:37 localhost ceph-mon[286721]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 1 04:46:37 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 1 04:46:37 localhost ceph-mon[286721]: Removed label _admin from host np0005604211.localdomain
Feb 1 04:46:37 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:37 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:37 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:46:37 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:46:37 localhost podman[298052]: 2026-02-01 09:46:37.182824549 +0000 UTC m=+0.105519501 container remove f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_faraday, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, ceph=True)
Feb 1 04:46:37 localhost systemd[1]: libpod-conmon-f8c29488469071efc227f2d2eed3108cd7dd96c8e15f21ae383511e0aa9d1424.scope: Deactivated successfully.
Feb 1 04:46:37 localhost nova_compute[274651]: 2026-02-01 09:46:37.515 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:37 localhost nova_compute[274651]: 2026-02-01 09:46:37.547 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:46:37 localhost nova_compute[274651]: 2026-02-01 09:46:37.547 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:46:37 localhost nova_compute[274651]: 2026-02-01 09:46:37.548 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:46:37 localhost nova_compute[274651]: 2026-02-01 09:46:37.549 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:37 localhost nova_compute[274651]: 2026-02-01 09:46:37.549 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:46:37 localhost systemd[1]: tmp-crun.WbKy8r.mount: Deactivated successfully.
Feb 1 04:46:37 localhost systemd[1]: var-lib-containers-storage-overlay-cff12c585023fde13f20e9af34a00b15972a850a0d492bfb180a295086e191d8-merged.mount: Deactivated successfully.
Feb 1 04:46:38 localhost podman[298122]:
Feb 1 04:46:38 localhost podman[298122]: 2026-02-01 09:46:38.036039434 +0000 UTC m=+0.085802177 container create 7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_pike, maintainer=Guillaume Abrioux , distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, vcs-type=git, io.buildah.version=1.41.4, name=rhceph, com.redhat.component=rhceph-container)
Feb 1 04:46:38 localhost systemd[1]: Started libpod-conmon-7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4.scope.
Feb 1 04:46:38 localhost systemd[1]: Started libcrun container.
Feb 1 04:46:38 localhost podman[298122]: 2026-02-01 09:46:38.001829524 +0000 UTC m=+0.051592297 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:46:38 localhost podman[298122]: 2026-02-01 09:46:38.107602382 +0000 UTC m=+0.157365135 container init 7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_pike, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 1 04:46:38 localhost podman[298122]: 2026-02-01 09:46:38.11730156 +0000 UTC m=+0.167064313 container start 7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_pike, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , architecture=x86_64, release=1764794109, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Feb 1 04:46:38 localhost podman[298122]: 2026-02-01 09:46:38.117663781 +0000 UTC m=+0.167426524 container attach 7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_pike, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, ceph=True, io.buildah.version=1.41.4, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, distribution-scope=public, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109)
Feb 1 04:46:38 localhost pedantic_pike[298138]: 167 167
Feb 1 04:46:38 localhost systemd[1]: libpod-7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4.scope: Deactivated successfully.
Feb 1 04:46:38 localhost podman[298122]: 2026-02-01 09:46:38.125903354 +0000 UTC m=+0.175666127 container died 7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_pike, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , RELEASE=main, io.buildah.version=1.41.4, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, version=7, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 1 04:46:38 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 1 04:46:38 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 1 04:46:38 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:38 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:38 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:38 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:46:38 localhost podman[298143]: 2026-02-01 09:46:38.226221765 +0000 UTC m=+0.091174811 container remove 7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_pike, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:46:38 localhost systemd[1]: libpod-conmon-7050ec531bcdde2716c717bfcaf2b777593d57869ccce6e5cf8beeb22bd5d3f4.scope: Deactivated successfully.
Feb 1 04:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:46:38 localhost systemd[1]: var-lib-containers-storage-overlay-7ed5301180fef9f4125c370d9a60a7f58b73497dd545124abd515462511f11e2-merged.mount: Deactivated successfully.
Feb 1 04:46:38 localhost systemd[1]: tmp-crun.YWzadx.mount: Deactivated successfully.
Feb 1 04:46:38 localhost podman[298202]: 2026-02-01 09:46:38.990093626 +0000 UTC m=+0.098937370 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:46:39 localhost podman[298202]: 2026-02-01 09:46:39.001676071 +0000 UTC m=+0.110519845 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:46:39 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:46:39 localhost podman[298233]:
Feb 1 04:46:39 localhost podman[298233]: 2026-02-01 09:46:39.084932959 +0000 UTC m=+0.079307197 container create 0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_leakey, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, description=Red Hat Ceph Storage 7, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:46:39 localhost systemd[1]: Started libpod-conmon-0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25.scope.
Feb 1 04:46:39 localhost podman[298233]: 2026-02-01 09:46:39.052667378 +0000 UTC m=+0.047041646 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:46:39 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:46:39 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:46:39 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:39 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:39 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:46:39 localhost systemd[1]: Started libcrun container.
Feb 1 04:46:39 localhost podman[298233]: 2026-02-01 09:46:39.173101487 +0000 UTC m=+0.167475715 container init 0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_leakey, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vcs-type=git, build-date=2025-12-08T17:28:53Z, version=7, GIT_CLEAN=True, release=1764794109, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Feb 1 04:46:39 localhost podman[298233]: 2026-02-01 09:46:39.182627809 +0000 UTC m=+0.177002007 container start 0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_leakey, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph)
Feb 1 04:46:39 localhost podman[298233]: 2026-02-01 09:46:39.182835586 +0000 UTC m=+0.177209874 container attach 0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_leakey, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.openshift.expose-services=, name=rhceph, version=7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7)
Feb 1 04:46:39 localhost interesting_leakey[298249]: 167 167
Feb 1 04:46:39 localhost systemd[1]: libpod-0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25.scope: Deactivated successfully.
Feb 1 04:46:39 localhost podman[298233]: 2026-02-01 09:46:39.189303854 +0000 UTC m=+0.183678062 container died 0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_leakey, ceph=True, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public)
Feb 1 04:46:39 localhost podman[298254]: 2026-02-01 09:46:39.270116446 +0000 UTC m=+0.070112484 container remove 0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_leakey, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, RELEASE=main, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z)
Feb 1 04:46:39 localhost systemd[1]: libpod-conmon-0c76b8326ea465776b6b86d1a751d1e1ade3566fd037f1ea9dd917113b870b25.scope: Deactivated successfully.
Feb 1 04:46:39 localhost systemd[1]: var-lib-containers-storage-overlay-94f5c73cf6910d6efa879b8990a29f69dbb6c8b96a06afeadb1bf73bedc0a973-merged.mount: Deactivated successfully.
Feb 1 04:46:40 localhost podman[298329]:
Feb 1 04:46:40 localhost podman[298329]: 2026-02-01 09:46:40.121692597 +0000 UTC m=+0.076109902 container create e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_shaw, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, release=1764794109, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, vendor=Red Hat, Inc.)
Feb 1 04:46:40 localhost systemd[1]: Started libpod-conmon-e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1.scope.
Feb 1 04:46:40 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:46:40 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:46:40 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:40 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:40 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:46:40 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:46:40 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:46:40 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:46:40 localhost systemd[1]: Started libcrun container.
Feb 1 04:46:40 localhost podman[298329]: 2026-02-01 09:46:40.091072555 +0000 UTC m=+0.045489880 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:46:40 localhost podman[298329]: 2026-02-01 09:46:40.199288473 +0000 UTC m=+0.153705768 container init e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_shaw, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, release=1764794109, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, build-date=2025-12-08T17:28:53Z)
Feb 1 04:46:40 localhost podman[298329]: 2026-02-01 09:46:40.209757785 +0000 UTC m=+0.164175080 container start e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_shaw, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:46:40 localhost podman[298329]: 2026-02-01 09:46:40.210046434 +0000 UTC m=+0.164463729 container attach e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_shaw, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, RELEASE=main, architecture=x86_64, name=rhceph, build-date=2025-12-08T17:28:53Z)
Feb 1 04:46:40 localhost bold_shaw[298344]: 167 167
Feb 1 04:46:40 localhost systemd[1]: libpod-e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1.scope: Deactivated successfully.
Feb 1 04:46:40 localhost podman[298329]: 2026-02-01 09:46:40.213735347 +0000 UTC m=+0.168152652 container died e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_shaw, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux )
Feb 1 04:46:40 localhost podman[298349]: 2026-02-01 09:46:40.309601054 +0000 UTC m=+0.086510101 container remove e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_shaw, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1764794109, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git)
Feb 1 04:46:40 localhost systemd[1]: libpod-conmon-e62c7bcf9844266283da4bc1a556c188bb77cb5c8aec278e369f9cf33ae9fce1.scope: Deactivated successfully.
Feb 1 04:46:40 localhost systemd[1]: var-lib-containers-storage-overlay-5960856bfba5fe44d4e14e14cf23233c34c041f3dadd093925af5ea889f72191-merged.mount: Deactivated successfully.
Feb 1 04:46:40 localhost podman[298421]:
Feb 1 04:46:40 localhost podman[298421]: 2026-02-01 09:46:40.99957356 +0000 UTC m=+0.061274115 container create 0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_lamport, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1764794109, description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True)
Feb 1 04:46:41 localhost systemd[1]: Started libpod-conmon-0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1.scope.
Feb 1 04:46:41 localhost systemd[1]: Started libcrun container.
Feb 1 04:46:41 localhost podman[298421]: 2026-02-01 09:46:41.065305631 +0000 UTC m=+0.127006236 container init 0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_lamport, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, architecture=x86_64, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True)
Feb 1 04:46:41 localhost podman[298421]: 2026-02-01 09:46:40.967650028 +0000 UTC m=+0.029350603 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:46:41 localhost podman[298421]: 2026-02-01 09:46:41.075970949 +0000 UTC m=+0.137671494 container start 0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_lamport, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, ceph=True, version=7, name=rhceph, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container)
Feb 1 04:46:41 localhost podman[298421]: 2026-02-01 09:46:41.076491555 +0000 UTC m=+0.138192110 container attach 0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_lamport, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, release=1764794109, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:46:41 localhost infallible_lamport[298436]: 167 167
Feb 1 04:46:41 localhost systemd[1]: libpod-0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1.scope: Deactivated successfully.
Feb 1 04:46:41 localhost podman[298421]: 2026-02-01 09:46:41.080306182 +0000 UTC m=+0.142006767 container died 0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_lamport, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z)
Feb 1 04:46:41 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:46:41 localhost podman[298441]: 2026-02-01 09:46:41.180856534 +0000 UTC m=+0.087007476 container remove 0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_lamport, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:46:41 localhost systemd[1]: libpod-conmon-0cf2a7d1735c62c365279d4655e18fa69745d5117af0a42a6559547319a7f3c1.scope: Deactivated successfully.
Feb 1 04:46:41 localhost nova_compute[274651]: 2026-02-01 09:46:41.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:46:41 localhost nova_compute[274651]: 2026-02-01 09:46:41.273 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:41 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:46:41 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz'
Feb 1 04:46:41 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:46:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:46:41.710 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:46:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:46:41.710 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:46:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:46:41.711 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:46:41 localhost podman[298511]:
Feb 1 04:46:41 localhost podman[298511]: 2026-02-01 09:46:41.837437343 +0000 UTC m=+0.079612009 container create 1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_diffie, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, vcs-type=git, name=rhceph, distribution-scope=public, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main) Feb 1 04:46:41 localhost systemd[1]: Started libpod-conmon-1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84.scope. Feb 1 04:46:41 localhost systemd[1]: Started libcrun container. Feb 1 04:46:41 localhost podman[298511]: 2026-02-01 09:46:41.90404731 +0000 UTC m=+0.146221976 container init 1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_diffie, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, ceph=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container) Feb 1 04:46:41 localhost podman[298511]: 2026-02-01 09:46:41.807043028 +0000 UTC m=+0.049217764 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:46:41 localhost podman[298511]: 2026-02-01 09:46:41.913016637 +0000 UTC m=+0.155191323 container start 1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_diffie, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, 
CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:46:41 localhost podman[298511]: 2026-02-01 09:46:41.913300876 +0000 UTC m=+0.155475572 container attach 1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_diffie, io.openshift.expose-services=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, ceph=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph) Feb 1 04:46:41 localhost happy_diffie[298527]: 167 167 Feb 1 04:46:41 localhost systemd[1]: libpod-1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84.scope: Deactivated successfully. 
Feb 1 04:46:41 localhost podman[298511]: 2026-02-01 09:46:41.91508577 +0000 UTC m=+0.157260436 container died 1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_diffie, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 04:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-fe6457ff03eaa1b296b8388c591b770a6cc2d301dd0ae9f23025188642671af8-merged.mount: Deactivated successfully. Feb 1 04:46:41 localhost systemd[1]: tmp-crun.Geh805.mount: Deactivated successfully. Feb 1 04:46:41 localhost systemd[1]: var-lib-containers-storage-overlay-6c1d8b998425f4dfaf78e7b662f4a7701cb3c702fb317693ba7ae43718388378-merged.mount: Deactivated successfully. Feb 1 04:46:42 localhost podman[298532]: 2026-02-01 09:46:42.0054705 +0000 UTC m=+0.078025700 container remove 1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_diffie, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, release=1764794109, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7) Feb 1 04:46:42 localhost systemd[1]: libpod-conmon-1fca071c68b50327600b21aba3207d2707f741f221345695c706ef9c57baaa84.scope: Deactivated successfully. 
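[Annotation] The create → init → start → attach → died → remove sequence recorded above is the complete lifecycle of a short-lived cephadm helper container (podman auto-named these infallible_lamport and happy_diffie); the "167 167" lines are the containers' stdout, and the whole round trip takes well under a second. A minimal sketch for watching the same event stream live, assuming a local podman 4.x on this host (the --format fields are standard podman event template fields, not taken from this log):

# Sketch: tail podman container lifecycle events like the create/init/start/
# attach/died/remove records above. Streams until interrupted.
import subprocess

subprocess.run(
    [
        "podman", "events",
        "--since", "5m",                      # look back far enough to catch short-lived helpers
        "--filter", "type=container",         # only container lifecycle events
        "--format", "{{.Time}} {{.Status}} {{.Name}} {{.Image}}",
    ],
    check=False,  # non-zero on Ctrl-C is expected for a streaming command
)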
Feb 1 04:46:42 localhost nova_compute[274651]: 2026-02-01 09:46:42.550 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:42 localhost nova_compute[274651]: 2026-02-01 09:46:42.552 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:42 localhost nova_compute[274651]: 2026-02-01 09:46:42.553 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:46:42 localhost nova_compute[274651]: 2026-02-01 09:46:42.553 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:42 localhost nova_compute[274651]: 2026-02-01 09:46:42.575 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:42 localhost nova_compute[274651]: 2026-02-01 09:46:42.576 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:43 localhost ceph-mon[286721]: Reconfiguring mon.np0005604212 (monmap changed)... Feb 1 04:46:43 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:43 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)... Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:43 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:43 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 1 04:46:43 localhost nova_compute[274651]: 2026-02-01 09:46:43.272 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:44 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)... 
Feb 1 04:46:44 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain Feb 1 04:46:44 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:44 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:44 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.292 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.293 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.293 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.293 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.294 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:46:44 localhost ceph-mon[286721]: mon.np0005604212@1(peon) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:46:44 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3190989409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.736 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.858 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:46:44 localhost nova_compute[274651]: 2026-02-01 09:46:44.859 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.111 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.113 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11507MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", 
"vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.114 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.115 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:46:45 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)... Feb 1 04:46:45 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain Feb 1 04:46:45 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:45 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:45 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)... Feb 1 04:46:45 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:45 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:45 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.205 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.206 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.207 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.247 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:46:45 localhost ceph-mon[286721]: mon.np0005604212@1(peon) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:46:45 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/992751241' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.710 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.717 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.761 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.764 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:46:45 localhost nova_compute[274651]: 2026-02-01 09:46:45.765 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:46:46 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:46:46 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:46 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:46 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)... Feb 1 04:46:46 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:46 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:46 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain Feb 1 04:46:46 localhost nova_compute[274651]: 2026-02-01 09:46:46.762 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:46 localhost nova_compute[274651]: 2026-02-01 09:46:46.762 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:47 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:47 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:47 localhost ceph-mon[286721]: Reconfiguring mon.np0005604213 (monmap changed)... 
Feb 1 04:46:47 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:46:47 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.576 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.578 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.578 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.578 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.619 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.620 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
Feb 1 04:46:47 localhost podman[298592]: 2026-02-01 09:46:47.691733522 +0000 UTC m=+0.055804106 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:46:47 localhost podman[298592]: 2026-02-01 09:46:47.70436512 +0000 UTC m=+0.068435704 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:46:47 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
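[Annotation] Each "Started /usr/bin/podman healthcheck run <id>" / "Deactivated successfully" pair above is a systemd timer firing a transient unit that executes the container's configured healthcheck test (here '/openstack/healthcheck podman_exporter', per the config_data in the event), and the exec_died record is that check process exiting. The same check can be triggered by hand; a sketch, assuming the container_name from the log:

# Sketch: run a container's configured healthcheck once, the same way the
# transient systemd unit in the log does. Exit status 0 means "healthy".
import subprocess

result = subprocess.run(["podman", "healthcheck", "run", "podman_exporter"])
print("healthy" if result.returncode == 0 else f"unhealthy (rc={result.returncode})")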
Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.723 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.724 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.724 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:46:47 localhost nova_compute[274651]: 2026-02-01 09:46:47.724 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:46:48 localhost nova_compute[274651]: 2026-02-01 09:46:48.134 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:46:48 localhost nova_compute[274651]: 2026-02-01 09:46:48.150 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:46:48 localhost nova_compute[274651]: 2026-02-01 09:46:48.151 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:46:48 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 1 04:46:48 localhost ceph-mon[286721]: Added label _no_schedule to host np0005604211.localdomain Feb 1 04:46:48 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:48 localhost ceph-mon[286721]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604211.localdomain Feb 1 04:46:48 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:48 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:48 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)... Feb 1 04:46:48 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:48 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:46:48 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain Feb 1 04:46:49 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:49 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:49 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)... Feb 1 04:46:49 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 1 04:46:49 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain Feb 1 04:46:49 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:49 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:49 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 1 04:46:50 localhost nova_compute[274651]: 2026-02-01 09:46:50.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:50 localhost nova_compute[274651]: 2026-02-01 09:46:50.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:46:50 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)... 
Feb 1 04:46:50 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"}]': finished Feb 1 04:46:50 localhost ceph-mon[286721]: Removed host np0005604211.localdomain Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:50 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:46:51 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:46:51 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)... 
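[Annotation] The repeated "auth get-or-create" dispatches above are cephadm (acting through mgr.np0005604211.cuflqz) regenerating daemon keyrings as it reconfigures each daemon after the monmap change and drains host np0005604211. The mon_command JSON maps directly onto the ordinary CLI form; a sketch of the equivalent call for the mds entity shown, assuming admin credentials on the host:

# Sketch: CLI equivalent of the mon_command JSON above. get-or-create is
# idempotent when the entity already exists with matching caps (it just
# returns the existing key), which is why cephadm can re-issue it on every
# reconfigure pass.
import subprocess

subprocess.run(
    [
        "ceph", "auth", "get-or-create", "mds.mds.np0005604215.rwvxvg",
        "mon", "profile mds",
        "osd", "allow rw tag cephfs *=*",
        "mds", "allow",
    ],
    check=True,
)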
Feb 1 04:46:51 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain Feb 1 04:46:51 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:51 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' Feb 1 04:46:51 localhost ceph-mon[286721]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:51 localhost ceph-mon[286721]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:46:52 localhost nova_compute[274651]: 2026-02-01 09:46:52.620 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:52 localhost nova_compute[274651]: 2026-02-01 09:46:52.622 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:46:52 localhost nova_compute[274651]: 2026-02-01 09:46:52.622 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:46:52 localhost nova_compute[274651]: 2026-02-01 09:46:52.623 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:52 localhost nova_compute[274651]: 2026-02-01 09:46:52.653 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:46:52 localhost nova_compute[274651]: 2026-02-01 09:46:52.654 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:46:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:46:53 localhost podman[298614]: 2026-02-01 09:46:53.719564517 +0000 UTC m=+0.079401952 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent) Feb 1 04:46:53 localhost podman[298614]: 2026-02-01 09:46:53.74923409 +0000 UTC m=+0.109071515 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:46:53 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:46:53 localhost podman[236886]: time="2026-02-01T09:46:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:46:53 localhost podman[236886]: @ - - [01/Feb/2026:09:46:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:46:54 localhost podman[236886]: @ - - [01/Feb/2026:09:46:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18805 "" "Go-http-client/1.1" Feb 1 04:46:56 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:46:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:46:56 localhost podman[298632]: 2026-02-01 09:46:56.723198684 +0000 UTC m=+0.086071167 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:46:56 localhost podman[298632]: 2026-02-01 09:46:56.73446363 +0000 UTC m=+0.097336123 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:46:56 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:46:56 localhost systemd[1]: tmp-crun.vRzQkZ.mount: Deactivated successfully. Feb 1 04:46:56 localhost podman[298633]: 2026-02-01 09:46:56.826853061 +0000 UTC m=+0.187446505 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:46:56 localhost podman[298633]: 2026-02-01 09:46:56.890508528 +0000 UTC m=+0.251101972 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
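The node_exporter flags captured above narrow the systemd collector to a handful of units via --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service. A quick sketch of which unit names that expression admits; node_exporter matches the pattern against the full unit name (re.fullmatch approximates that anchoring here), and the sample names beyond the ones visible in this log are illustrative:

import re

# Pattern copied verbatim from the node_exporter command line above.
UNIT_INCLUDE = r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service"

SAMPLE_UNITS = [
    "edpm_ovn_metadata_agent.service",  # edpm_.* branch (hypothetical name)
    "ovs-vswitchd.service",             # ovs.* branch
    "openvswitch.service",
    "virtqemud.service",                # virt.* branch
    "rsyslog.service",
    "sshd.service",                     # matches no branch -> skipped
]

for unit in SAMPLE_UNITS:
    verdict = "collected" if re.fullmatch(UNIT_INCLUDE, unit) else "skipped"
    print(f"{unit:36} {verdict}")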
'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 1 04:46:56 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:46:57 localhost nova_compute[274651]: 2026-02-01 09:46:57.654 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:46:57 localhost nova_compute[274651]: 2026-02-01 09:46:57.656 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:47:01 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:47:01 localhost openstack_network_exporter[239441]: ERROR 09:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:47:01 localhost openstack_network_exporter[239441]:
Feb 1 04:47:01 localhost openstack_network_exporter[239441]: ERROR 09:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:47:01 localhost openstack_network_exporter[239441]:
Feb 1 04:47:02 localhost nova_compute[274651]: 2026-02-01 09:47:02.656 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:47:02 localhost nova_compute[274651]: 2026-02-01 09:47:02.658 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:47:02 localhost nova_compute[274651]: 2026-02-01 09:47:02.659 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:47:02 localhost nova_compute[274651]: 2026-02-01 09:47:02.659 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:47:02 localhost nova_compute[274651]: 2026-02-01 09:47:02.698 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:47:02 localhost nova_compute[274651]: 2026-02-01 09:47:02.698 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.528 12 DEBUG
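The ovsdbapp DEBUG lines just above show the OVS IDL keepalive cycle against the local ovsdb-server: after roughly five seconds with no traffic on the connection (idle 5002 ms), the client sends an inactivity probe and drops the session from ACTIVE to IDLE until the reply arrives, then transitions back to ACTIVE. The probe is a plain JSON-RPC echo request (RFC 7047), which can be reproduced against the endpoint named in the log; a sketch, assuming ovsdb-server is still listening on tcp:127.0.0.1:6640 as shown above:

import json
import socket

# Endpoint taken from the reconnect messages above.
HOST, PORT = "127.0.0.1", 6640

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    # RFC 7047 "echo": the server replies with the same params and id.
    sock.sendall(json.dumps(
        {"method": "echo", "params": ["probe"], "id": "probe"}
    ).encode())
    print(sock.recv(4096).decode())
    # Expected shape: {"id":"probe","result":["probe"],"error":null}

The interleaved openstack_network_exporter errors (call(dpif-netdev/pmd-perf-show): please specify an existing datapath) are unrelated to the IDL traffic: the dpif-netdev/* appctl commands only report on the userspace (netdev) datapath, so they fail this way on a node where Open vSwitch runs the kernel datapath and has no PMD threads to show.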
ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.533 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a586800-d8f6-4311-829b-c84a6a64b0af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.529411', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f66ed3c8-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': 'f0d7d92463412bc06e7ade128e69df781dc99fb5cf03835c73fe20b1085f37b8'}]}, 'timestamp': '2026-02-01 09:47:03.534782', '_unique_id': 'ef6172a6454847bd8345881a71673732'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 
12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.536 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.537 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.538 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5a26ac92-9ee4-4a90-bb6a-6a4bce6004e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.538236', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f66f727e-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '59f34c09b7472db4b028e25b5c8bd7e1befb3b376e71f9c6d9dd015a744b1683'}]}, 'timestamp': '2026-02-01 09:47:03.538800', '_unique_id': 'a48d218b5c864c59a9b878102c758aa4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.539 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.541 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.573 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.573 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6867ae4b-27d3-4010-b19e-4a8f6a21aa83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.541599', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f674c7ba-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '14a8e8d0a36102da8129b66124c5b3c860ec88fa932fd583ffdf1ca7f5051370'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.541599', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f674d9d0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': 'a2c92c2fd5ee32167476bbe66fe726ce6950e4d1126e64a7dea1368b36134f64'}]}, 'timestamp': '2026-02-01 09:47:03.574228', '_unique_id': '877ac3288b66425aa438eb6fa57a0d26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.575 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.577 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.594 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 13050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '86710b37-8bc5-4d38-8ccf-c29b03f76921', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13050000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:47:03.577398', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f6780682-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.813702496, 'message_signature': '39f0474f2f0476ba9dc2a78fc2730d54d319690612d7862d2345e365c5ffa328'}]}, 'timestamp': '2026-02-01 09:47:03.595140', '_unique_id': '7157bd4cc3ee4cf4a00b88b26344f252'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.596 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b7d4157-df41-43e7-9825-5959f2fa1fd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.598372', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67a62a6-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.817850914, 'message_signature': 'e001af1928cae6d26a0aba6c4f7af83960faee95eafef7a732e5b9a28479cdfc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.598372', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67a7732-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.817850914, 'message_signature': '74508de1ca799ba7f9729fda49f5b7e72ae01f091f76e2f856f442fc937f4855'}]}, 'timestamp': '2026-02-01 09:47:03.610977', '_unique_id': '588324ce85de4393b21ed8a91a8920bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError -> kombu.exceptions.OperationalError traceback omitted ...]
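The chained traceback above is the generic signature of an unreachable RabbitMQ broker: the raw socket error ([Errno 111] Connection refused) is re-raised by kombu as OperationalError, with the original ConnectionRefusedError attached as __cause__. A minimal sketch that reproduces the same chain outside the agent, assuming a placeholder broker URL with nothing listening on that port; the URL and credentials below are illustrative, not taken from this deployment's configuration:

    # Reproduces the ConnectionRefusedError -> kombu.exceptions.OperationalError
    # chain seen in the log, using kombu directly (no oslo.messaging involved).
    import kombu

    conn = kombu.Connection("amqp://guest:guest@localhost:5672//")  # placeholder URL
    try:
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        print(exc)                  # [Errno 111] Connection refused
        print(type(exc.__cause__))  # <class 'ConnectionRefusedError'>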
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97be2a69-7566-4df0-9d41-8269bbd82237', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.614266', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f67b10ca-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': 'f5a58957239f31dbf636563753ea36bee8fa1b45456057bebf5dd044be23746e'}]}, 'timestamp': '2026-02-01 09:47:03.614938', '_unique_id': 'a62275c4d05f4e0282c255e6670a06b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError -> kombu.exceptions.OperationalError traceback omitted ...]
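The Payload=... fragment in each of these records is the repr of a Python dict (single-quoted strings, None), not JSON, so json.loads will reject it; ast.literal_eval parses it safely. A small sketch for recovering the dropped samples from a saved copy of this log; the regex and function name here are illustrative, not part of ceilometer or oslo.messaging:

    import ast
    import re

    # The literal dict sits between "Payload=" and the appended exception text.
    PAYLOAD_RE = re.compile(r"Payload=(\{.*\}): kombu\.exceptions\.OperationalError")

    def samples_from_record(record):
        """Return the ceilometer samples embedded in one 'Could not send' record."""
        m = PAYLOAD_RE.search(record)
        if not m:
            return []
        payload = ast.literal_eval(m.group(1))  # literals only; no code is executed
        return payload["payload"]["samples"]

    # Example: inspect what was lost from a record.
    # for s in samples_from_record(record):
    #     print(s["counter_name"], s["counter_volume"], s["resource_id"])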
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost 
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.619 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a005354f-02a8-48c3-bf03-e738b023a087', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.619327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67bd032-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '5ded1cc6563657028adae0af36f1718288864d04b0dc9aa31297c71dab5a9d24'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.619327', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67bdf3c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': 'e69fc7b96e4a6e597a333d657759142741016234699e5a9c9baff2e2355d14d6'}]}, 'timestamp': '2026-02-01 09:47:03.620158', '_unique_id': '0fda19e7498a4be5a87aad2f4ee9f6e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError -> kombu.exceptions.OperationalError traceback omitted ...]
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.622 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.623 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d42ad6f-c7ff-4db4-b571-a483b6319e8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.622746', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67c537c-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.817850914, 'message_signature': '2afa64c08422904c3044a14d84911951ad5be7fa8307ee7b1198bede8e3419e7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.622746', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67c6128-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.817850914, 'message_signature': '02a51865a050cad83ac9d2c0a9b56c4662459dec3487af4f3f26313b41192d14'}]}, 'timestamp': '2026-02-01 09:47:03.623451', '_unique_id': 'bf5e344eef60444db7af0c2831ea1034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError -> kombu.exceptions.OperationalError traceback omitted ...]
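Errno 111 means the TCP connection attempt was actively refused: no process was accepting connections on the broker port when the agent connected, which is why every sample batch in this window fails identically. A quick reachability check from the affected host, assuming the default AMQP port 5672 and a placeholder broker hostname (substitute the host from the agent's transport_url):

    import socket

    BROKER = ("rabbitmq.example.com", 5672)  # placeholder host; 5672 is the AMQP default

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(3)
        try:
            sock.connect(BROKER)
            print("AMQP port is accepting connections")
        except OSError as exc:
            # Prints '[Errno 111] Connection refused' while the broker is down.
            print(f"connect failed: {exc}")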
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.625 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.625 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.625 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a2f5565-e0bb-49c8-adfa-4707d3acda90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.625296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67cb768-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': 'ece12348ad9ebbb1c1fab5b11312faa0e8cb8d0867fc14ba58fc5eebb1ec5198'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.625296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67cc3c0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '05588ce2c80c9e85c724435a858ebe87c78980707df39779f2985441e09215b3'}]}, 'timestamp': '2026-02-01 09:47:03.625962', '_unique_id': '2382129c97df4515a394e6713a0b127e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[... identical ConnectionRefusedError -> kombu.exceptions.OperationalError traceback omitted ...]
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.626 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.628 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '78ffadf1-e86d-489f-8345-d5589b36431c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.628199', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f67d2c66-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': 'ec69282db3c4e96fe65310a4306b4f12479e26a787c62b08c39553ff81ad6891'}]}, 'timestamp': '2026-02-01 09:47:03.628729', '_unique_id': '59321a8a00284a86838ee8955bbf1613'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.629 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.631 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.631 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.631 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0d51ec01-1d9b-4745-8f95-aea5763330a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.631264', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f67da0e2-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '00be97ef2e8c0457e1e1bf5343151ddb09dc5a81b7b8951e5e97a616e3c6674a'}]}, 'timestamp': '2026-02-01 09:47:03.631643', '_unique_id': '6ac88549ed774b71b263ca2c084a3735'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.633 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '43555d0e-5272-48d8-b6f5-74b02f2b22b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.633413', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f67df7b8-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '643f67ddec21884c925e73c04f7c8bf6c55a9f92671f29fb5f547ec6ded8d577'}]}, 'timestamp': '2026-02-01 09:47:03.633917', '_unique_id': 'cdfe554f63f84b478d7a08921d8e1700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a9094ab2-9c2c-4c4e-ae4e-edef834efce9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.635865', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67e5438-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '128a8560e9eabd08521e37f3c559fd55c2f5d6d53c34d5677480577a19f19794'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.635865', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67e604a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '2081d481edeb0ed405629c5e9784e9bbbf9e5566f240980af4c6efb5fa91cdfc'}]}, 'timestamp': '2026-02-01 09:47:03.636519', '_unique_id': '51913ba911914d3b819e1645da1f6c1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.637 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:47:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2684aa5c-a330-4915-8ec9-f30b5019f5a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.638864', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67ec9b8-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.817850914, 'message_signature': 'c91bf42f04f720b1efd773523f167bea178c4f2acfc8036a8a952e839aeaa9e9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.638864', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67ed6f6-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.817850914, 'message_signature': 'bb154749967da894a2ecda790527f39eedbaeecb5107fee8b41978422e5eea19'}]}, 'timestamp': '2026-02-01 09:47:03.639564', '_unique_id': '9e517769863549cd9c0501c866b57246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
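The frames show oslo.messaging's rabbit driver (impl_rabbit Connection.__init__ calling ensure_connection) handing reconnection to kombu, which retries via retry_over_time and finally re-raises the socket error as a library error. The failure can be reproduced with kombu alone; a small sketch, assuming a placeholder broker URL standing in for the agent's configured transport_url:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Placeholder URL: substitute the deployment's real transport_url.
    url = "amqp://guest:guest@rabbitmq.example.com:5672//"

    try:
        with Connection(url, connect_timeout=2) as conn:
            # ensure_connection() drives retry_over_time(); on exhaustion the
            # underlying ConnectionRefusedError is re-raised as OperationalError
            # (kombu's _reraise_as_library_errors, line 450 in the frames above).
            conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print(f"broker unreachable: {exc}")  # [Errno 111] Connection refused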
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.642 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79bfdba1-2003-4b44-8e07-492bfbf37651', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.641398', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f67f2c14-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '9d8dbd4abddb9d95f2fe6a7a3fa41d3aeb47fce2e1d1a815c7adc019b5f5d733'}]}, 'timestamp': '2026-02-01 09:47:03.641757', '_unique_id': 'b19708534206432d82eead07d91db279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
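Each "Could not send notification to notifications." record is one Notifier call at the SAMPLE priority: the polling manager batches the cycle's samples into a telemetry.polling event and hands it to oslo.messaging, which logs the full payload when the send fails rather than propagating the error to the pollster. A stripped-down sketch of that publish path, with a placeholder broker URL and an empty payload (not ceilometer's actual code):

    from oslo_config import cfg
    import oslo_messaging

    # Placeholder transport_url; the agent reads its own from ceilometer.conf.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@rabbitmq.example.com:5672/")
    notifier = oslo_messaging.Notifier(
        transport,
        publisher_id="ceilometer.polling",   # matches the payload dumps above
        driver="messagingv2",
        topics=["notifications"])

    # sample() emits at priority SAMPLE; with the broker down, the notify
    # layer logs "Could not send notification" and the samples are dropped.
    notifier.sample({}, "telemetry.polling", {"samples": []})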
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.643 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.643 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.643 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.643 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b58d3926-e07a-4b97-adba-2950484afaa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.643774', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f67f8a6a-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '094a6d68164c482c361e8e1936e21a8612e38cc1990e31e55af6c9565d5a4aca'}]}, 'timestamp': '2026-02-01 09:47:03.644267', '_unique_id': '95bd38f743d34f11bcdd7045f89ca8a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
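retry_over_time, present in every dump, is kombu's generic retry loop: call a connection factory, catch the recoverable transport errors, sleep with a growing interval, and re-raise once the retry budget is spent. An illustrative re-implementation of that pattern (not kombu's actual code; the parameter names are approximations):

    import time

    # Illustrative sketch of the retry pattern named in the kombu frames above.
    def retry_over_time(fun, catch, max_retries=3,
                        interval_start=1.0, interval_step=2.0, interval_max=30.0):
        interval = interval_start
        for attempt in range(max_retries + 1):
            try:
                return fun()                 # e.g. a connection factory
            except catch:
                if attempt == max_retries:
                    raise                    # surfaces as OperationalError upstream
                time.sleep(interval)         # back off before the next attempt
                interval = min(interval + interval_step, interval_max)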
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.646 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.646 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b28fdb7-1ee0-4cf4-b96d-6d62d061146f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.646273', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f67feb04-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '34138cdea7a3f1024d8a0c9e3ba9214f954b24fc0bbd7eb5edaf0f33c9b946d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.646273', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f67ffae0-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': 'f11e0534ed988d9f65492202df6edbff2d181a1eaf6b5287f3f7ea373184bbf6'}]}, 'timestamp': '2026-02-01 09:47:03.647085', '_unique_id': 'a461b143bb904af389eb106f5384608e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
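The same pair of tracebacks repeats for every pollster in the cycle, so the useful signal is no longer another stack but which meters are losing samples and how many. A quick triage pass over a captured extract of this log (the file name is a placeholder):

    import re
    from collections import Counter

    dropped = Counter()
    meter = re.compile(r"'counter_name': '([^']+)'")
    # Placeholder path: point this at the captured journal extract.
    with open("ceilometer_agent_compute.log", encoding="utf-8") as fh:
        for line in fh:
            if "Could not send notification" in line:
                dropped.update(meter.findall(line))  # meters in the dumped payload
    for name, count in dropped.most_common():
        print(f"{count:4d} dropped sample(s) for {name}")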
Payload={'message_id': 'b1ef312d-296e-4492-a613-8e548ab25ff9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:47:03.648599', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f68043d8-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '30c29ea5eaa8d475ba624f0ccc4567a74cf47a8cfddadce29b52760eae41bfee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:47:03.648599', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f6804f54-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.761055297, 'message_signature': '43522d166bdc3ee05c3decc22b1b1845d790c30d729b5a39b0ab90da5aca051a'}]}, 'timestamp': '2026-02-01 09:47:03.649193', '_unique_id': '9729452ff5534fd6b83ff713acecb113'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.649 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.651 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
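The paired tracebacks, joined by "The above exception was the direct cause of the following exception", are produced by kombu's _reraise_as_library_errors context manager, which converts transport-level errors into its own exception type with `raise ... from exc`. A self-contained sketch of that chaining pattern (the class and function names here are illustrative stand-ins, not kombu's real internals):

    class OperationalError(Exception):
        """Stand-in for kombu.exceptions.OperationalError."""

    def connect():
        # Stand-in for the socket connect that fails in amqp/transport.py.
        raise ConnectionRefusedError(111, "Connection refused")

    try:
        try:
            connect()
        except OSError as exc:
            # `from exc` links the two exceptions; Python then prints both
            # tracebacks with the "direct cause" separator seen in the log.
            raise OperationalError(str(exc)) from exc
    except OperationalError as err:
        assert isinstance(err.__cause__, ConnectionRefusedError)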
Payload={'message_id': '3ddd8403-8c8e-4777-935b-7a956e2b8f5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:47:03.650969', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f680a3dc-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.813702496, 'message_signature': '8d85f8c5e48d948488201607ac19f4544db1a69dfad60b39ec69bfae6f8afbd7'}]}, 'timestamp': '2026-02-01 09:47:03.651438', '_unique_id': 'ad2398c6a0b74370a2827074e7d5c06a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:47:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0f55a29-7a5d-4448-a95f-74bacdf8595c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.653017', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f680eff4-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '2e61c5e84d240bc713590b7c4dcba54fd22b5d73460e210463cf1ea5d4c9ca74'}]}, 'timestamp': '2026-02-01 09:47:03.653342', '_unique_id': 'e6ac42a429a74a339ee12d1927835fc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.653 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.654 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
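Every dropped notification carries the same envelope: event_type 'telemetry.polling' with a 'samples' list in which each sample records a counter name, type, unit, volume, and resource id; the payload that follows is one more instance. A sketch of pulling the measurements out of such a dict for local inspection (the payload below is abbreviated from one of the payloads above, not complete):

    # Abbreviated copy of a dropped telemetry.polling payload from the log.
    payload = {
        "event_type": "telemetry.polling",
        "payload": {
            "samples": [
                {
                    "counter_name": "network.outgoing.packets.drop",
                    "counter_type": "cumulative",
                    "counter_unit": "packet",
                    "counter_volume": 0,
                    "resource_id": "instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46",
                },
            ],
        },
    }

    for sample in payload["payload"]["samples"]:
        print(sample["counter_name"], sample["counter_volume"],
              sample["counter_unit"], sample["resource_id"])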
Payload={'message_id': 'df86d025-7f07-4b08-bb05-8349ffd192fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:47:03.654710', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'f68131c6-ff52-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11317.748845112, 'message_signature': '4a39395760a97bdf67bb1c449c099d9a946a0111c76399a483028be663c8d099'}]}, 'timestamp': '2026-02-01 09:47:03.655029', '_unique_id': '25c37710c1d6483c840ddca7f7d5e77f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:47:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:47:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:47:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:47:03 localhost podman[298680]: 2026-02-01 09:47:03.731656232 +0000 UTC m=+0.088485302 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, vendor=Red Hat, Inc.) Feb 1 04:47:03 localhost podman[298680]: 2026-02-01 09:47:03.745368283 +0000 UTC m=+0.102197353 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public) Feb 1 04:47:03 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:47:06 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:07 localhost nova_compute[274651]: 2026-02-01 09:47:07.699 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Feb 1 04:47:07 localhost nova_compute[274651]: 2026-02-01 09:47:07.701 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Feb 1 04:47:07 localhost nova_compute[274651]: 2026-02-01 09:47:07.701 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 Feb 1 04:47:07 localhost nova_compute[274651]: 2026-02-01 09:47:07.702 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 Feb 1 04:47:07 localhost nova_compute[274651]: 2026-02-01 09:47:07.730 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:47:07 localhost nova_compute[274651]: 2026-02-01 09:47:07.731 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 Feb 1 04:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
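The "Started /usr/bin/podman healthcheck run ..." line is a systemd transient unit firing the container's periodic healthcheck: the podman health_status event that follows reports the result, and "Deactivated successfully" closes the unit, the same sequence already visible above for openstack_network_exporter. The check can also be invoked by hand; a sketch, driven from Python purely for illustration (the container ID is the one from the log line above):

    import subprocess

    # Container ID of ceilometer_agent_compute, taken from the log.
    cid = "2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691"

    # `podman healthcheck run` executes the healthcheck command configured
    # for the container and exits 0 when the check passes.
    result = subprocess.run(
        ["podman", "healthcheck", "run", cid],
        capture_output=True,
        text=True,
    )
    print("healthy" if result.returncode == 0 else "unhealthy",
          result.stderr.strip())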
Feb 1 04:47:09 localhost podman[298701]: 2026-02-01 09:47:09.708678216 +0000 UTC m=+0.070735656 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 1 04:47:09 localhost podman[298701]: 2026-02-01 09:47:09.749672476 +0000 UTC m=+0.111729926 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2) Feb 1 04:47:09 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:47:11 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:12 localhost nova_compute[274651]: 2026-02-01 09:47:12.732 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:12 localhost nova_compute[274651]: 2026-02-01 09:47:12.734 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:12 localhost nova_compute[274651]: 2026-02-01 09:47:12.735 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:47:12 localhost nova_compute[274651]: 2026-02-01 09:47:12.735 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:12 localhost nova_compute[274651]: 2026-02-01 09:47:12.775 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:12 localhost nova_compute[274651]: 2026-02-01 09:47:12.776 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:16 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:17 localhost nova_compute[274651]: 2026-02-01 09:47:17.777 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:17 localhost nova_compute[274651]: 2026-02-01 09:47:17.778 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:17 localhost nova_compute[274651]: 2026-02-01 09:47:17.779 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:47:17 localhost nova_compute[274651]: 2026-02-01 09:47:17.779 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:17 localhost nova_compute[274651]: 2026-02-01 09:47:17.780 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:17 localhost nova_compute[274651]: 2026-02-01 
09:47:17.783 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:47:18 localhost systemd[1]: tmp-crun.bIcXA9.mount: Deactivated successfully. Feb 1 04:47:18 localhost podman[298720]: 2026-02-01 09:47:18.73338333 +0000 UTC m=+0.090584676 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:47:18 localhost podman[298720]: 2026-02-01 09:47:18.7415387 +0000 UTC m=+0.098740046 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:47:18 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
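The nova_compute ovsdbapp DEBUG lines repeating every five seconds ("idle 500x ms, sending inactivity probe", "entering IDLE", then "entering ACTIVE") are the OVSDB IDL keepalive: after roughly 5000 ms with no traffic on tcp:127.0.0.1:6640 the client sends an echo probe and transitions back to ACTIVE once the reply wakes the poll loop. A minimal sketch of that same loop, assuming the ovs Python bindings and the stock schema file path:

    import ovs.db.idl
    import ovs.poller

    # Load the OVSDB schema and subscribe to all tables/columns.
    helper = ovs.db.idl.SchemaHelper("/usr/share/openvswitch/vswitch.ovsschema")
    helper.register_all()

    idl = ovs.db.idl.Idl("tcp:127.0.0.1:6640", helper)

    while True:
        idl.run()                  # process RPC traffic, including probe replies
        poller = ovs.poller.Poller()
        idl.wait(poller)           # arms the ~5000 ms inactivity-probe timeout
        poller.block()             # the "[POLLIN] on fd ..." wakeups in the log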
Feb 1 04:47:21 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:22 localhost nova_compute[274651]: 2026-02-01 09:47:22.784 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:22 localhost nova_compute[274651]: 2026-02-01 09:47:22.785 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:22 localhost nova_compute[274651]: 2026-02-01 09:47:22.785 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:47:22 localhost nova_compute[274651]: 2026-02-01 09:47:22.785 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:22 localhost nova_compute[274651]: 2026-02-01 09:47:22.806 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:22 localhost nova_compute[274651]: 2026-02-01 09:47:22.807 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:23 localhost podman[236886]: time="2026-02-01T09:47:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:47:23 localhost podman[236886]: @ - - [01/Feb/2026:09:47:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:47:24 localhost podman[236886]: @ - - [01/Feb/2026:09:47:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18799 "" "Go-http-client/1.1" Feb 1 04:47:24 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e89 e89: 6 total, 6 up, 6 in Feb 1 04:47:24 localhost ceph-mon[286721]: Activating manager daemon np0005604209.isqrps Feb 1 04:47:24 localhost ceph-mon[286721]: Manager daemon np0005604211.cuflqz is unresponsive, replacing it with standby daemon np0005604209.isqrps Feb 1 04:47:24 localhost sshd[298742]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:47:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
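The two access-log lines from podman[236886] above (GET /v4.9.3/libpod/containers/json and .../containers/stats) are the libpod REST API service answering prometheus-podman-exporter over the socket mounted into the podman_exporter container (CONTAINER_HOST=unix:///run/podman/podman.sock in its config_data). The equivalent of the first query, sketched under the assumption that the podman-py client package is installed:

    from podman import PodmanClient

    # Same socket the exporter uses, per its config_data above.
    with PodmanClient(base_url="unix:///run/podman/podman.sock") as client:
        # Mirrors GET /libpod/containers/json?all=true
        for ctr in client.containers.list(all=True):
            print(ctr.name, ctr.status)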
Feb 1 04:47:24 localhost podman[298744]: 2026-02-01 09:47:24.714118087 +0000 UTC m=+0.065604728 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:47:24 localhost podman[298744]: 2026-02-01 09:47:24.724455815 +0000 UTC m=+0.075942476 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:47:24 localhost systemd-logind[759]: New session 73 of user ceph-admin. Feb 1 04:47:24 localhost systemd[1]: Started Session 73 of User ceph-admin. Feb 1 04:47:24 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:47:25 localhost ceph-mon[286721]: Manager daemon np0005604209.isqrps is now available Feb 1 04:47:25 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch Feb 1 04:47:25 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished Feb 1 04:47:25 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch Feb 1 04:47:25 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished Feb 1 04:47:25 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch Feb 1 04:47:25 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch Feb 1 04:47:25 localhost podman[298874]: 2026-02-01 09:47:25.81769971 +0000 UTC m=+0.097519939 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , release=1764794109, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 04:47:25 
localhost podman[298874]: 2026-02-01 09:47:25.939765004 +0000 UTC m=+0.219585233 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) Feb 1 04:47:26 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:26 localhost ceph-mon[286721]: removing stray HostCache host record np0005604211.localdomain.devices.0 Feb 1 04:47:26 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:47:26 localhost systemd[1]: tmp-crun.nk46kR.mount: Deactivated successfully. 
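The ceph-mon entries above record a manager failover: np0005604211.cuflqz is declared unresponsive, the standby np0005604209.isqrps takes over, and cephadm then deletes the dead host's cached device records via the config-key del commands dispatched to the mons. One quick way to confirm which mgr is active after such an event, sketched with the ceph CLI (assumes an admin keyring on the host):

    import json, subprocess

    stat = json.loads(subprocess.run(
        ["ceph", "mgr", "stat", "--format", "json"],
        capture_output=True, text=True, check=True).stdout)
    # Expect active_name np0005604209.isqrps after the failover above.
    print("active:", stat["active_name"], "available:", stat["available"])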
Feb 1 04:47:26 localhost podman[299032]: 2026-02-01 09:47:26.897996968 +0000 UTC m=+0.097550801 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:47:26 localhost podman[299032]: 2026-02-01 09:47:26.91139686 +0000 UTC m=+0.110950763 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:47:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:47:26 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
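The node_exporter container above runs with --collector.systemd restricted to units matching (edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service, a long list of other collectors disabled, and host port 9100 published. A sketch of scraping just the systemd unit-state series it exposes, assuming the requests package is available:

    import requests

    body = requests.get("http://localhost:9100/metrics", timeout=5).text
    for line in body.splitlines():
        # Only units matching the unit-include regex above are exported.
        if line.startswith("node_systemd_unit_state"):
            print(line)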
Feb 1 04:47:27 localhost podman[299055]: 2026-02-01 09:47:27.018253566 +0000 UTC m=+0.081358603 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:47:27 localhost podman[299055]: 2026-02-01 09:47:27.079391405 +0000 UTC m=+0.142496422 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:47:27 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
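Every health_status/exec_died event in this log carries a config_data label: the container definition edpm_ansible deployed, serialized as a Python-literal dict (single quotes, True rather than true), so it parses with ast.literal_eval rather than json.loads. A sketch recovering it from the ovn_controller container named above:

    import ast, json, subprocess

    raw = subprocess.run(
        ["podman", "inspect", "--format",
         '{{index .Config.Labels "config_data"}}', "ovn_controller"],
        capture_output=True, text=True, check=True).stdout.strip()

    cfg = ast.literal_eval(raw)      # Python-literal syntax, not JSON
    print(json.dumps(cfg["volumes"], indent=2))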
Feb 1 04:47:27 localhost ceph-mon[286721]: Saving service mon spec with placement label:mon Feb 1 04:47:27 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Feb 1 04:47:27 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Feb 1 04:47:27 localhost ceph-mon[286721]: Cluster is now healthy Feb 1 04:47:27 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:27 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:27 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:27 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:27 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:27 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:27 localhost ceph-mon[286721]: [01/Feb/2026:09:47:27] ENGINE Bus STARTING Feb 1 04:47:27 localhost ceph-mon[286721]: [01/Feb/2026:09:47:27] ENGINE Serving on http://172.18.0.200:8765 Feb 1 04:47:27 localhost ceph-mon[286721]: [01/Feb/2026:09:47:27] ENGINE Serving on https://172.18.0.200:7150 Feb 1 04:47:27 localhost ceph-mon[286721]: [01/Feb/2026:09:47:27] ENGINE Bus STARTED Feb 1 04:47:27 localhost ceph-mon[286721]: [01/Feb/2026:09:47:27] ENGINE Client ('172.18.0.200', 32790) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 1 04:47:27 localhost nova_compute[274651]: 2026-02-01 09:47:27.808 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:27 localhost nova_compute[274651]: 2026-02-01 09:47:27.813 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:27 localhost nova_compute[274651]: 2026-02-01 09:47:27.813 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:47:27 localhost nova_compute[274651]: 2026-02-01 09:47:27.813 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:27 localhost nova_compute[274651]: 2026-02-01 09:47:27.845 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:27 localhost nova_compute[274651]: 2026-02-01 09:47:27.846 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 
04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:47:28 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M Feb 1 04:47:28 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:47:28 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M Feb 1 04:47:28 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M Feb 1 04:47:28 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:47:28 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:47:28 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:47:28 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:47:28 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:47:28 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf Feb 1 04:47:30 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:47:30 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:47:30 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:31 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea794151e0 
mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0 Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@1(peon) e14 my rank is now 0 (was 1) Feb 1 04:47:31 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Feb 1 04:47:31 localhost ceph-mgr[278591]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Feb 1 04:47:31 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea7dea2000 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0) Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0) Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election Feb 1 04:47:31 localhost ceph-mon[286721]: paxos.0).electionLogic(62) init, last seen epoch 62 Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 is new leader, mons np0005604212,np0005604213 in quorum (ranks 0,1) Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : monmap epoch 14 Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : last_changed 2026-02-01T09:47:31.128772+0000 Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : created 2026-02-01T07:37:52.883666+0000 Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212 Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213 Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e47: np0005604209.isqrps(active, since 6s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : overall HEALTH_OK Feb 1 04:47:31 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:47:31 localhost 
ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:47:31 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:47:31 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:47:31 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:47:31 localhost ceph-mon[286721]: Remove daemons mon.np0005604215 Feb 1 04:47:31 localhost ceph-mon[286721]: Safe to remove mon.np0005604215: new quorum should be ['np0005604212', 'np0005604213'] (from ['np0005604212', 'np0005604213']) Feb 1 04:47:31 localhost ceph-mon[286721]: Removing monitor np0005604215 from monmap... Feb 1 04:47:31 localhost ceph-mon[286721]: Removing daemon mon.np0005604215 from np0005604215.localdomain -- ports [] Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212 is new leader, mons np0005604212,np0005604213 in quorum (ranks 0,1) Feb 1 04:47:31 localhost ceph-mon[286721]: overall HEALTH_OK Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:47:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:47:31 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:47:31 localhost openstack_network_exporter[239441]: ERROR 09:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:47:31 localhost openstack_network_exporter[239441]: Feb 1 04:47:31 localhost openstack_network_exporter[239441]: ERROR 09:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:47:31 localhost openstack_network_exporter[239441]: Feb 1 04:47:31 localhost podman[299878]: Feb 1 04:47:31 localhost podman[299878]: 2026-02-01 09:47:31.980438694 +0000 UTC m=+0.071256281 container create 528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_wu, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, architecture=x86_64, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, distribution-scope=public, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_BRANCH=main) Feb 1 04:47:32 localhost systemd[1]: Started libpod-conmon-528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef.scope. Feb 1 04:47:32 localhost systemd[1]: Started libcrun container. Feb 1 04:47:32 localhost podman[299878]: 2026-02-01 09:47:32.052568732 +0000 UTC m=+0.143386359 container init 528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_wu, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, name=rhceph) Feb 1 04:47:32 localhost podman[299878]: 2026-02-01 09:47:31.953860327 +0000 UTC m=+0.044677984 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:47:32 localhost podman[299878]: 2026-02-01 09:47:32.062409314 +0000 UTC m=+0.153226941 container start 528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_wu, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.buildah.version=1.41.4, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z) Feb 1 04:47:32 localhost podman[299878]: 2026-02-01 09:47:32.063140098 +0000 UTC m=+0.153957785 container attach 528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_wu, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:47:32 localhost admiring_wu[299893]: 167 167 Feb 1 04:47:32 localhost systemd[1]: libpod-528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef.scope: Deactivated successfully. Feb 1 04:47:32 localhost podman[299878]: 2026-02-01 09:47:32.072270988 +0000 UTC m=+0.163088665 container died 528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_wu, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7) Feb 1 04:47:32 localhost podman[299899]: 2026-02-01 09:47:32.160353766 +0000 UTC m=+0.075099150 container remove 528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_wu, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, 
build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4) Feb 1 04:47:32 localhost systemd[1]: libpod-conmon-528e731807a883c32042f3c9217ee8e8851340ad9f13ac22ab574118dcad62ef.scope: Deactivated successfully. Feb 1 04:47:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:47:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:47:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Feb 1 04:47:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 1 04:47:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:47:32 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:47:32 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:47:32 localhost ceph-mon[286721]: Reconfiguring crash.np0005604212 (monmap changed)... 
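The "Unable to set osd_memory_target ... below minimum 939524096" errors a few entries back are the cephadm memory autotuner at work: it splits the host's RAM budget across the daemons on each node, arrives at roughly 877 MB per OSD, and the mon rejects that value because it falls under the option's hard floor. The arithmetic, as a worked check:

    MiB = 1024 * 1024

    computed = 877_246_668           # value cephadm tried to set (from the log)
    floor    = 939_524_096           # osd_memory_target minimum

    print(round(computed / MiB, 1))  # 836.6 -> the "Adjusting ... to 836.6M" lines
    print(floor // MiB)              # 896   -> the rejected minimum in MiB
    print(computed < floor)          # True  -> hence "error parsing value"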
Feb 1 04:47:32 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain Feb 1 04:47:32 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:32 localhost nova_compute[274651]: 2026-02-01 09:47:32.847 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:32 localhost nova_compute[274651]: 2026-02-01 09:47:32.850 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:32 localhost nova_compute[274651]: 2026-02-01 09:47:32.850 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:47:32 localhost nova_compute[274651]: 2026-02-01 09:47:32.850 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:32 localhost nova_compute[274651]: 2026-02-01 09:47:32.907 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:32 localhost nova_compute[274651]: 2026-02-01 09:47:32.908 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:32 localhost podman[299970]: Feb 1 04:47:32 localhost podman[299970]: 2026-02-01 09:47:32.939671949 +0000 UTC m=+0.136749045 container create bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_nightingale, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64) Feb 1 04:47:32 localhost systemd[1]: Started libpod-conmon-bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257.scope. Feb 1 04:47:32 localhost systemd[1]: var-lib-containers-storage-overlay-2e947dfb718d55dfd8690b8b6771800a3de1f253144ef952c0cfe54af17f0efc-merged.mount: Deactivated successfully. Feb 1 04:47:33 localhost podman[299970]: 2026-02-01 09:47:32.907359986 +0000 UTC m=+0.104437112 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:47:33 localhost systemd[1]: Started libcrun container. 
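admiring_wu above and exciting_nightingale just below are throwaway rhceph containers that live for well under a second: create, init, start, attach, print "167 167", die, remove. That output is the uid/gid pair of the ceph user in Red Hat Ceph Storage images, which cephadm probes with a disposable container; the exact probe command is not in the log, so the reconstruction below is an assumption:

    import subprocess

    # Throwaway container reproducing the "167 167" output captured above
    # (the probed path and stat invocation are assumptions).
    out = subprocess.run(
        ["podman", "run", "--rm",
         "registry.redhat.io/rhceph/rhceph-7-rhel9:latest",
         "stat", "-c", "%u %g", "/var/lib/ceph"],
        capture_output=True, text=True, check=True).stdout.split()

    uid, gid = (int(x) for x in out)
    print(uid, gid)   # expected: 167 167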
Feb 1 04:47:33 localhost podman[299970]: 2026-02-01 09:47:33.028647515 +0000 UTC m=+0.225724611 container init bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_nightingale, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 1 04:47:33 localhost podman[299970]: 2026-02-01 09:47:33.03889683 +0000 UTC m=+0.235973926 container start bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_nightingale, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, release=1764794109, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public)
Feb 1 04:47:33 localhost podman[299970]: 2026-02-01 09:47:33.039244931 +0000 UTC m=+0.236322027 container attach bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_nightingale, vcs-type=git, version=7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, RELEASE=main, build-date=2025-12-08T17:28:53Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 1 04:47:33 localhost exciting_nightingale[299986]: 167 167
Feb 1 04:47:33 localhost systemd[1]: libpod-bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257.scope: Deactivated successfully.
Feb 1 04:47:33 localhost podman[299970]: 2026-02-01 09:47:33.044794821 +0000 UTC m=+0.241871937 container died bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_nightingale, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1764794109, GIT_BRANCH=main)
Feb 1 04:47:33 localhost systemd[1]: var-lib-containers-storage-overlay-d503ec82e8dfe07229a5f534e3e5a0ae9ed52bbf9478528216fc2b4703ac8bff-merged.mount: Deactivated successfully.
Feb 1 04:47:33 localhost podman[299991]: 2026-02-01 09:47:33.14229705 +0000 UTC m=+0.085710007 container remove bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_nightingale, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Feb 1 04:47:33 localhost systemd[1]: libpod-conmon-bd7d7dfb1d57cc8e3581c3999a3e4f5a2702ba647a32a9d4c8aab36245fec257.scope: Deactivated successfully.
Feb 1 04:47:33 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:47:33 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:47:33 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:33 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:47:33 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)...
Feb 1 04:47:33 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:47:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:47:33 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:47:33 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 1 04:47:33 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:47:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:33 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:47:33 localhost podman[300066]: 2026-02-01 09:47:33.97066866 +0000 UTC m=+0.075256284 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, vcs-type=git, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 1 04:47:33 localhost podman[300066]: 2026-02-01 09:47:33.986739564 +0000 UTC m=+0.091327268 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, version=9.7, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 1 04:47:34 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:47:34 localhost podman[300074]:
Feb 1 04:47:34 localhost podman[300074]: 2026-02-01 09:47:34.071731778 +0000 UTC m=+0.162215189 container create d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_mccarthy, release=1764794109, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , version=7, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:47:34 localhost systemd[1]: Started libpod-conmon-d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36.scope.
Feb 1 04:47:34 localhost systemd[1]: Started libcrun container.
Feb 1 04:47:34 localhost podman[300074]: 2026-02-01 09:47:34.126892054 +0000 UTC m=+0.217375505 container init d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_mccarthy, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , RELEASE=main)
Feb 1 04:47:34 localhost podman[300074]: 2026-02-01 09:47:34.049818734 +0000 UTC m=+0.140302175 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:47:34 localhost podman[300074]: 2026-02-01 09:47:34.171879097 +0000 UTC m=+0.262362528 container start d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_mccarthy, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.41.4, version=7, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux )
Feb 1 04:47:34 localhost podman[300074]: 2026-02-01 09:47:34.172366642 +0000 UTC m=+0.262850143 container attach d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_mccarthy, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, release=1764794109, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 04:47:34 localhost objective_mccarthy[300104]: 167 167
Feb 1 04:47:34 localhost systemd[1]: libpod-d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36.scope: Deactivated successfully.
Feb 1 04:47:34 localhost podman[300074]: 2026-02-01 09:47:34.174977682 +0000 UTC m=+0.265461123 container died d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_mccarthy, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, ceph=True, release=1764794109, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container)
Feb 1 04:47:34 localhost podman[300109]: 2026-02-01 09:47:34.286222893 +0000 UTC m=+0.099598243 container remove d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_mccarthy, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7)
Feb 1 04:47:34 localhost systemd[1]: libpod-conmon-d6f63adbef8d51f8ceffa71344459a670c8c7ded927f7a48fc7c54d017584a36.scope: Deactivated successfully.
Feb 1 04:47:34 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:47:34 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:47:34 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:34 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:34 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:47:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:47:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:47:34 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:34 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:47:34 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 1 04:47:34 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:47:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:34 localhost systemd[1]: var-lib-containers-storage-overlay-1b2d9bd8238ad07bc74bba68a93d18d9d4208bb7b53d528f8c4383dd75b002d3-merged.mount: Deactivated successfully.
Feb 1 04:47:35 localhost podman[300185]:
Feb 1 04:47:35 localhost podman[300185]: 2026-02-01 09:47:35.12097867 +0000 UTC m=+0.081542498 container create 39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hawking, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 04:47:35 localhost systemd[1]: Started libpod-conmon-39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae.scope.
Feb 1 04:47:35 localhost systemd[1]: Started libcrun container.
Feb 1 04:47:35 localhost podman[300185]: 2026-02-01 09:47:35.090177533 +0000 UTC m=+0.050741381 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:47:35 localhost podman[300185]: 2026-02-01 09:47:35.191913921 +0000 UTC m=+0.152477739 container init 39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hawking, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container)
Feb 1 04:47:35 localhost podman[300185]: 2026-02-01 09:47:35.20355521 +0000 UTC m=+0.164119028 container start 39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hawking, distribution-scope=public, GIT_BRANCH=main, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7)
Feb 1 04:47:35 localhost podman[300185]: 2026-02-01 09:47:35.203980953 +0000 UTC m=+0.164544771 container attach 39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hawking, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 1 04:47:35 localhost objective_hawking[300200]: 167 167
Feb 1 04:47:35 localhost systemd[1]: libpod-39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae.scope: Deactivated successfully.
Feb 1 04:47:35 localhost podman[300185]: 2026-02-01 09:47:35.207192652 +0000 UTC m=+0.167756500 container died 39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hawking, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, ceph=True, name=rhceph, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, release=1764794109)
Feb 1 04:47:35 localhost podman[300205]: 2026-02-01 09:47:35.318429202 +0000 UTC m=+0.097487729 container remove 39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_hawking, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vcs-type=git, release=1764794109)
Feb 1 04:47:35 localhost systemd[1]: libpod-conmon-39d3aed2c3c400c5508fb0995e728183dbb28d9168151499a7ec0d9f33a41cae.scope: Deactivated successfully.
Feb 1 04:47:35 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)...
Feb 1 04:47:35 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:47:35 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:35 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:35 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:35 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:47:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:47:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:47:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 1 04:47:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:47:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 1 04:47:35 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 1 04:47:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:35 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:35 localhost systemd[1]: tmp-crun.GPS9NO.mount: Deactivated successfully.
Feb 1 04:47:35 localhost systemd[1]: var-lib-containers-storage-overlay-d1d46d0a2be9e64ac0e7330d72cbc051a40108b0a4c25491eebbcef744862715-merged.mount: Deactivated successfully.
Feb 1 04:47:36 localhost podman[300274]:
Feb 1 04:47:36 localhost podman[300274]: 2026-02-01 09:47:36.112032864 +0000 UTC m=+0.075937846 container create b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_carver, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, ceph=True, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, release=1764794109, io.openshift.tags=rhceph ceph)
Feb 1 04:47:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.146605) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939256146690, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2510, "num_deletes": 260, "total_data_size": 7843022, "memory_usage": 8246488, "flush_reason": "Manual Compaction"}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb 1 04:47:36 localhost systemd[1]: Started libpod-conmon-b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc.scope.
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939256167535, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 4794368, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18210, "largest_seqno": 20715, "table_properties": {"data_size": 4784040, "index_size": 6257, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26782, "raw_average_key_size": 22, "raw_value_size": 4761221, "raw_average_value_size": 3957, "num_data_blocks": 273, "num_entries": 1203, "num_filter_entries": 1203, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939174, "oldest_key_time": 1769939174, "file_creation_time": 1769939256, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 20981 microseconds, and 9596 cpu microseconds.
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:47:36 localhost systemd[1]: Started libcrun container.
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.167593) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 4794368 bytes OK
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.167618) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.169868) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.169889) EVENT_LOG_v1 {"time_micros": 1769939256169883, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.169914) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 7830724, prev total WAL file size 7831214, number of live WAL files 2.
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.171304) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353132' seq:72057594037927935, type:22 .. '6C6F676D0033373634' seq:0, type:0; will stop at (end)
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(4682KB)], [30(18MB)]
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939256171380, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 24543312, "oldest_snapshot_seqno": -1}
Feb 1 04:47:36 localhost podman[300274]: 2026-02-01 09:47:36.081711811 +0000 UTC m=+0.045616883 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:47:36 localhost podman[300274]: 2026-02-01 09:47:36.186157453 +0000 UTC m=+0.150062445 container init b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_carver, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, release=1764794109, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4)
Feb 1 04:47:36 localhost frosty_carver[300289]: 167 167
Feb 1 04:47:36 localhost podman[300274]: 2026-02-01 09:47:36.204472356 +0000 UTC m=+0.168377338 container start b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_carver, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, version=7, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:47:36 localhost podman[300274]: 2026-02-01 09:47:36.206644743 +0000 UTC m=+0.170549765 container attach b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_carver, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z)
Feb 1 04:47:36 localhost systemd[1]: libpod-b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc.scope: Deactivated successfully.
Feb 1 04:47:36 localhost podman[300274]: 2026-02-01 09:47:36.210616675 +0000 UTC m=+0.174521687 container died b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_carver, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11522 keys, 24300898 bytes, temperature: kUnknown
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939256279961, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 24300898, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 24233561, "index_size": 37412, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 309272, "raw_average_key_size": 26, "raw_value_size": 24035490, "raw_average_value_size": 2086, "num_data_blocks": 1432, "num_entries": 11522, "num_filter_entries": 11522, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939256, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.280319) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 24300898 bytes
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.282641) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.8 rd, 223.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.6, 18.8 +0.0 blob) out(23.2 +0.0 blob), read-write-amplify(10.2) write-amplify(5.1) OK, records in: 12079, records dropped: 557 output_compression: NoCompression
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.282680) EVENT_LOG_v1 {"time_micros": 1769939256282663, "job": 16, "event": "compaction_finished", "compaction_time_micros": 108692, "compaction_time_cpu_micros": 57121, "output_level": 6, "num_output_files": 1, "total_output_size": 24300898, "num_input_records": 12079, "num_output_records": 11522, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939256283588, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939256286700, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.171232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.286742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.286747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.286749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.286751) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:36 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:36.286753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:36 localhost podman[300294]: 2026-02-01 09:47:36.306962127 +0000 UTC m=+0.086915683 container remove b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_carver, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, release=1764794109, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 1 04:47:36 localhost systemd[1]: libpod-conmon-b7532e38488ad070dcf1f355841c13fd70b292fa2ea0417f56ec113aa050b3fc.scope: Deactivated successfully.
Feb 1 04:47:36 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 1 04:47:36 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 1 04:47:36 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:36 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:36 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:47:36 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 1 04:47:36 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 1 04:47:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:47:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:47:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 1 04:47:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:47:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:36 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:36 localhost systemd[1]: tmp-crun.OXVoPV.mount: Deactivated successfully.
Feb 1 04:47:36 localhost systemd[1]: var-lib-containers-storage-overlay-829890b3f55313f30de9b0fe6bda68fddcb18c2519a6891d7910970fd65ad131-merged.mount: Deactivated successfully.
Feb 1 04:47:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:47:37 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:47:37 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 1 04:47:37 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 1 04:47:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:37 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:37 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:37 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:47:37 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 1 04:47:37 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:47:37 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:37 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:37 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 1 04:47:37 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)...
Feb 1 04:47:37 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:47:37 localhost nova_compute[274651]: 2026-02-01 09:47:37.909 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:47:37 localhost nova_compute[274651]: 2026-02-01 09:47:37.912 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:47:37 localhost nova_compute[274651]: 2026-02-01 09:47:37.912 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:47:37 localhost nova_compute[274651]: 2026-02-01 09:47:37.913 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:47:37 localhost nova_compute[274651]: 2026-02-01 09:47:37.933 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:47:37 localhost nova_compute[274651]: 2026-02-01 09:47:37.934 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:47:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:47:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:47:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 1 04:47:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:47:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:38 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:38 localhost sshd[300312]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:47:39 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:39 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:39 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:47:39 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)...
Feb 1 04:47:39 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 1 04:47:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:47:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:47:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 1 04:47:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:47:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:39 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:47:40 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:47:40 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 1 04:47:40 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:47:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 1 04:47:40 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 1 04:47:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:40 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:40 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:40 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:40 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:47:40 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 1 04:47:40 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 1 04:47:40 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:40 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:40 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:47:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:47:40 localhost podman[300314]: 2026-02-01 09:47:40.736245461 +0000 UTC m=+0.093935270 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 1 04:47:40 localhost podman[300314]: 2026-02-01 09:47:40.774486066 +0000 UTC m=+0.132175845 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:47:40 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:47:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:47:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 1 04:47:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:47:41 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 1 04:47:41 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 1 04:47:41 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:41 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:41 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:47:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:47:41.711 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:47:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:47:41.712 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:47:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:47:41.712 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Feb 1 04:47:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:47:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:42 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:42 localhost sshd[300332]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:47:42 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 1 04:47:42 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 1 04:47:42 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:42 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:42 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:47:42 localhost nova_compute[274651]: 2026-02-01 09:47:42.935 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:47:42 localhost nova_compute[274651]: 2026-02-01 09:47:42.974 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:47:42 localhost nova_compute[274651]: 2026-02-01 09:47:42.974 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:47:42 localhost nova_compute[274651]: 2026-02-01 09:47:42.974 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:47:42 localhost nova_compute[274651]: 2026-02-01 09:47:42.976 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:47:42 localhost nova_compute[274651]: 2026-02-01 09:47:42.977 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:47:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:43 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:43 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Feb 1 04:47:43 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:47:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:43 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:43 localhost nova_compute[274651]: 2026-02-01 09:47:43.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:43 localhost nova_compute[274651]: 2026-02-01 09:47:43.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:43 localhost nova_compute[274651]: 2026-02-01 09:47:43.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:47:43 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)...
Feb 1 04:47:43 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 1 04:47:43 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:43 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:43 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:44 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)...
Feb 1 04:47:44 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 1 04:47:44 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:44 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:44 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.374744) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939264374811, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 544, "num_deletes": 251, "total_data_size": 479030, "memory_usage": 489432, "flush_reason": "Manual Compaction"}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939264380166, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 422698, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20716, "largest_seqno": 21259, "table_properties": {"data_size": 419518, "index_size": 1099, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8510, "raw_average_key_size": 21, "raw_value_size": 412854, "raw_average_value_size": 1029, "num_data_blocks": 44, "num_entries": 401, "num_filter_entries": 401, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939256, "oldest_key_time": 1769939256, "file_creation_time": 1769939264, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 5477 microseconds, and 2383 cpu microseconds.
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.380221) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 422698 bytes OK
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.380249) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.382162) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.382199) EVENT_LOG_v1 {"time_micros": 1769939264382191, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.382223) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 475775, prev total WAL file size 475775, number of live WAL files 2.
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.383131) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(412KB)], [33(23MB)]
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939264383192, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 24723596, "oldest_snapshot_seqno": -1}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 11401 keys, 21045812 bytes, temperature: kUnknown
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939264470299, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 21045812, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20980915, "index_size": 35292, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28549, "raw_key_size": 307475, "raw_average_key_size": 26, "raw_value_size": 20786532, "raw_average_value_size": 1823, "num_data_blocks": 1341, "num_entries": 11401, "num_filter_entries": 11401, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939264, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.470649) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 21045812 bytes
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.472564) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 283.6 rd, 241.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 23.2 +0.0 blob) out(20.1 +0.0 blob), read-write-amplify(108.3) write-amplify(49.8) OK, records in: 11923, records dropped: 522 output_compression: NoCompression
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.472586) EVENT_LOG_v1 {"time_micros": 1769939264472576, "job": 18, "event": "compaction_finished", "compaction_time_micros": 87173, "compaction_time_cpu_micros": 54331, "output_level": 6, "num_output_files": 1, "total_output_size": 21045812, "num_input_records": 11923, "num_output_records": 11401, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939264472723, "job": 18, "event": "table_file_deletion", "file_number": 35}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939264475194, "job": 18, "event": "table_file_deletion", "file_number": 33}
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.382965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.475282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.475289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.475292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.475296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:44 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:47:44.475299) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 1 04:47:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:44 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.296 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.297 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:47:45 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 1 04:47:45 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 1 04:47:45 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:45 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:45 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:47:45 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 1 04:47:45 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 1 04:47:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:47:45 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4251366698' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.747 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:47:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.825 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:47:45 localhost nova_compute[274651]: 2026-02-01 09:47:45.826 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:47:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.027 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.028 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11456MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.029 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.029 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.114 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.115 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.116 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 04:47:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.169 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:47:46 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:46 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:47:46 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1600835519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.595 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.601 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.621 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.624 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 1 04:47:46 localhost nova_compute[274651]: 2026-02-01 09:47:46.624 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:47 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:47 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:47:47 localhost ceph-mon[286721]: Deploying daemon mon.np0005604215 on np0005604215.localdomain
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:47:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 1 04:47:47 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:47:47 localhost nova_compute[274651]: 2026-02-01 09:47:47.621 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:47 localhost nova_compute[274651]: 2026-02-01 09:47:47.622 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:47 localhost nova_compute[274651]: 2026-02-01 09:47:47.622 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:47 localhost nova_compute[274651]: 2026-02-01 09:47:47.978 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:47:48 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:48 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:48 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:47:48 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:47:48 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:47:48 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:47:48 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:47:48 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:47:48 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:47:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:47:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:47:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:47:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:47:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.296 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.296 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.297 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.368 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.369 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.369 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:47:49 localhost nova_compute[274651]: 2026-02-01 09:47:49.369 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:47:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:47:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:47:49 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:47:49 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:47:49 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:47:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 1 04:47:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 1 04:47:49 localhost podman[300698]: 2026-02-01 09:47:49.734053147 +0000 UTC m=+0.090862385 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:47:49 localhost podman[300698]: 2026-02-01 09:47:49.745591782 +0000 UTC m=+0.102401050 container exec_died
06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:49 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:50 localhost nova_compute[274651]: 2026-02-01 09:47:50.221 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:47:50 localhost nova_compute[274651]: 2026-02-01 09:47:50.236 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:47:50 localhost nova_compute[274651]: 2026-02-01 09:47:50.236 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:47:50 localhost nova_compute[274651]: 2026-02-01 09:47:50.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:47:50 localhost nova_compute[274651]: 2026-02-01 09:47:50.269 274655 DEBUG oslo_service.periodic_task [None 
req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).monmap v14 adding/updating np0005604215 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:50 localhost ceph-mgr[278591]: ms_deliver_dispatch: unhandled message 0x55ea7dea22c0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0) Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0) Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:50 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election Feb 1 04:47:50 localhost ceph-mon[286721]: paxos.0).electionLogic(64) init, last seen epoch 64 Feb 1 04:47:50 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:47:51 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:51 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:51 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:47:52 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:52 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:52 localhost nova_compute[274651]: 2026-02-01 09:47:52.981 
274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:52 localhost nova_compute[274651]: 2026-02-01 09:47:52.983 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:47:52 localhost nova_compute[274651]: 2026-02-01 09:47:52.984 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:47:52 localhost nova_compute[274651]: 2026-02-01 09:47:52.984 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:53 localhost nova_compute[274651]: 2026-02-01 09:47:53.018 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:53 localhost nova_compute[274651]: 2026-02-01 09:47:53.019 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:47:53 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:53 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:53 localhost podman[236886]: time="2026-02-01T09:47:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:47:53 localhost podman[236886]: @ - - [01/Feb/2026:09:47:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:47:54 localhost podman[236886]: @ - - [01/Feb/2026:09:47:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18807 "" "Go-http-client/1.1" Feb 1 04:47:54 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:54 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:54 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:47:54 localhost ceph-mds[277455]: mds.beacon.mds.np0005604212.tkdkxt missed beacon ack from the monitors Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 is new leader, mons np0005604212,np0005604213 in quorum (ranks 0,1) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : monmap epoch 15 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : fsid 
33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : last_changed 2026-02-01T09:47:50.388496+0000 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : created 2026-02-01T07:37:52.883666+0000 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604215 Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e47: np0005604209.isqrps(active, since 31s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005604212,np0005604213 (MON_DOWN) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604212,np0005604213 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps'] Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005604212,np0005604213 Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : mon.np0005604215 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:55 localhost ceph-mon[286721]: 
mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 1 04:47:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604212 is new leader, mons np0005604212,np0005604213 in quorum (ranks 0,1) Feb 1 04:47:55 localhost ceph-mon[286721]: Health check failed: 1/3 mons down, quorum np0005604212,np0005604213 (MON_DOWN) Feb 1 04:47:55 localhost ceph-mon[286721]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604212,np0005604213 Feb 1 04:47:55 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Feb 1 04:47:55 localhost ceph-mon[286721]: stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm Feb 1 04:47:55 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:47:55 localhost ceph-mon[286721]: stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps'] Feb 1 04:47:55 localhost ceph-mon[286721]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005604212,np0005604213 Feb 1 04:47:55 localhost ceph-mon[286721]: mon.np0005604215 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Feb 1 04:47:55 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:55 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:55 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:55 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:47:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:47:55 localhost podman[300757]: 2026-02-01 09:47:55.619268787 +0000 UTC m=+0.056371154 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 1 04:47:55 localhost podman[300757]: 2026-02-01 09:47:55.628212822 +0000 UTC m=+0.065315209 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:47:55 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 
0) Feb 1 04:47:56 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 1 04:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:47:57 localhost podman[301097]: 2026-02-01 09:47:57.138510442 +0000 UTC m=+0.079148495 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 04:47:57 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:47:57 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:47:57 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf Feb 1 04:47:57 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:47:57 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:47:57 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:57 localhost podman[301097]: 2026-02-01 09:47:57.185446225 +0000 UTC m=+0.126084268 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:47:57 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:47:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:47:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:47:57 localhost podman[301121]: 2026-02-01 09:47:57.262058631 +0000 UTC m=+0.094905119 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 1 04:47:57 localhost podman[301121]: 2026-02-01 09:47:57.332516937 +0000 UTC m=+0.165363395 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:47:57 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:47:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 calling monitor election Feb 1 04:47:57 localhost ceph-mon[286721]: paxos.0).electionLogic(66) init, last seen epoch 66 Feb 1 04:47:57 localhost ceph-mon[286721]: mon.np0005604212@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : mon.np0005604212 is new leader, mons np0005604212,np0005604213,np0005604215 in quorum (ranks 0,1,2) Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : monmap epoch 15 Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : last_changed 2026-02-01T09:47:50.388496+0000 Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : created 2026-02-01T07:37:52.883666+0000 Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212 Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213 Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604215 Feb 1 04:47:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e47: np0005604209.isqrps(active, since 33s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604212,np0005604213) Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 
daemon(s) not managed by cephadm Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:47:57 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps'] Feb 1 04:47:57 localhost podman[301198]: Feb 1 04:47:57 localhost podman[301198]: 2026-02-01 09:47:57.871963044 +0000 UTC m=+0.066618999 container create 1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_villani, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:47:57 localhost systemd[1]: Started libpod-conmon-1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3.scope. Feb 1 04:47:57 localhost systemd[1]: Started libcrun container. 
Feb 1 04:47:57 localhost podman[301198]: 2026-02-01 09:47:57.84093739 +0000 UTC m=+0.035593425 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:47:57 localhost podman[301198]: 2026-02-01 09:47:57.951594403 +0000 UTC m=+0.146250368 container init 1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_villani, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main) Feb 1 04:47:57 localhost podman[301198]: 2026-02-01 09:47:57.964480918 +0000 UTC m=+0.159136883 container start 1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_villani, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Feb 1 04:47:57 localhost podman[301198]: 2026-02-01 09:47:57.964742696 +0000 UTC m=+0.159398651 container attach 1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_villani, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.description=Red Hat 
Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux , version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True) Feb 1 04:47:57 localhost systemd[1]: libpod-1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3.scope: Deactivated successfully. Feb 1 04:47:57 localhost bold_villani[301213]: 167 167 Feb 1 04:47:57 localhost podman[301198]: 2026-02-01 09:47:57.97425931 +0000 UTC m=+0.168915275 container died 1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 1 04:47:58 localhost nova_compute[274651]: 2026-02-01 09:47:58.019 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:47:58 localhost podman[301218]: 2026-02-01 09:47:58.081589329 +0000 UTC m=+0.093077812 container remove 1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_villani, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, ceph=True, 
RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:47:58 localhost systemd[1]: libpod-conmon-1818e158d1cb094037ac031c451b33e93e80f72fad57b697e1b32eb3e1cbe6d3.scope: Deactivated successfully. Feb 1 04:47:58 localhost systemd[1]: var-lib-containers-storage-overlay-507fdc1727de3ea4cdb7de86d3197c77f47e67c1396877452496e866d6aad08e-merged.mount: Deactivated successfully. Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:47:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212 calling monitor election Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604215 calling monitor election Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604213 calling monitor election Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212 is new leader, mons np0005604212,np0005604213,np0005604215 in quorum (ranks 0,1,2) Feb 1 04:47:58 localhost ceph-mon[286721]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604212,np0005604213) Feb 1 04:47:58 localhost ceph-mon[286721]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:47:58 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Feb 1 04:47:58 localhost ceph-mon[286721]: stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm Feb 1 04:47:58 localhost ceph-mon[286721]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:47:58 localhost ceph-mon[286721]: stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps'] Feb 1 04:47:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Feb 1 04:47:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:47:58 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:47:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0) 
Feb 1 04:47:58 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch Feb 1 04:47:58 localhost podman[301288]: Feb 1 04:47:58 localhost podman[301288]: 2026-02-01 09:47:58.876807551 +0000 UTC m=+0.078955038 container create 1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_ritchie, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:47:58 localhost systemd[1]: Started libpod-conmon-1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79.scope. Feb 1 04:47:58 localhost systemd[1]: Started libcrun container. 
Feb 1 04:47:58 localhost podman[301288]: 2026-02-01 09:47:58.943384229 +0000 UTC m=+0.145531716 container init 1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_ritchie, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, release=1764794109, maintainer=Guillaume Abrioux ) Feb 1 04:47:58 localhost podman[301288]: 2026-02-01 09:47:58.846312183 +0000 UTC m=+0.048459710 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:47:58 localhost podman[301288]: 2026-02-01 09:47:58.952918871 +0000 UTC m=+0.155066348 container start 1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_ritchie, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, RELEASE=main) Feb 1 04:47:58 localhost podman[301288]: 2026-02-01 09:47:58.953239361 +0000 UTC m=+0.155386838 container attach 1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_ritchie, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, release=1764794109, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.) Feb 1 04:47:58 localhost youthful_ritchie[301303]: 167 167 Feb 1 04:47:58 localhost systemd[1]: libpod-1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79.scope: Deactivated successfully. Feb 1 04:47:58 localhost podman[301288]: 2026-02-01 09:47:58.957461961 +0000 UTC m=+0.159609448 container died 1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_ritchie, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, maintainer=Guillaume Abrioux ) Feb 1 04:47:59 localhost podman[301308]: 2026-02-01 09:47:59.059480158 +0000 UTC m=+0.087608145 container remove 1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_ritchie, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, 
CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7) Feb 1 04:47:59 localhost systemd[1]: libpod-conmon-1a25c9a65da07958a1d3d1a00c7e086e5e27847f0554d26460712387284d8f79.scope: Deactivated successfully. Feb 1 04:47:59 localhost systemd[1]: var-lib-containers-storage-overlay-677d0243e05871558661d6ebcf7bc07b2cea8169b97c796c93322b319dfe4e84-merged.mount: Deactivated successfully. Feb 1 04:47:59 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:59 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:59 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 1 04:47:59 localhost ceph-mon[286721]: Reconfiguring osd.1 (monmap changed)... Feb 1 04:47:59 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain Feb 1 04:47:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:47:59 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:47:59 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:47:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Feb 1 04:47:59 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 1 04:47:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:47:59 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:47:59 localhost podman[301383]: Feb 1 04:47:59 localhost podman[301383]: 2026-02-01 09:47:59.91029634 +0000 UTC m=+0.078699491 container create 377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_wu, RELEASE=main, name=rhceph, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, architecture=x86_64, ceph=True, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., release=1764794109) Feb 1 04:47:59 localhost systemd[1]: Started libpod-conmon-377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e.scope. Feb 1 04:47:59 localhost systemd[1]: Started libcrun container. Feb 1 04:47:59 localhost podman[301383]: 2026-02-01 09:47:59.976882477 +0000 UTC m=+0.145285618 container init 377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_wu, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, release=1764794109, CEPH_POINT_RELEASE=, architecture=x86_64) Feb 1 04:47:59 localhost podman[301383]: 2026-02-01 09:47:59.880497943 +0000 UTC m=+0.048901114 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:47:59 localhost podman[301383]: 2026-02-01 09:47:59.98770019 +0000 UTC m=+0.156103331 container start 377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_wu, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, build-date=2025-12-08T17:28:53Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 
04:47:59 localhost podman[301383]: 2026-02-01 09:47:59.987995879 +0000 UTC m=+0.156399020 container attach 377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_wu, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, release=1764794109, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux ) Feb 1 04:47:59 localhost focused_wu[301399]: 167 167 Feb 1 04:47:59 localhost systemd[1]: libpod-377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e.scope: Deactivated successfully. Feb 1 04:47:59 localhost podman[301383]: 2026-02-01 09:47:59.992831327 +0000 UTC m=+0.161234518 container died 377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_wu, maintainer=Guillaume Abrioux , io.openshift.expose-services=, name=rhceph, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, ceph=True, release=1764794109, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 1 04:48:00 localhost podman[301404]: 2026-02-01 09:48:00.097984101 +0000 UTC m=+0.093029432 container remove 377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_wu, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, 
url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, vcs-type=git, ceph=True, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 1 04:48:00 localhost systemd[1]: libpod-conmon-377e50dd1127dfd7701c9917576d20b4d5fb3247085cbcbd297ded0f3f51402e.scope: Deactivated successfully. Feb 1 04:48:00 localhost systemd[1]: var-lib-containers-storage-overlay-3d1ee3d3ec8004331868b1c34d85a242e39d709fc3968c67ab5b100cc9cb9974-merged.mount: Deactivated successfully. Feb 1 04:48:00 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:00 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:00 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 1 04:48:00 localhost ceph-mon[286721]: Reconfiguring osd.4 (monmap changed)... Feb 1 04:48:00 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain Feb 1 04:48:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:48:00 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:48:00 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 1 04:48:00 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:48:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:48:00 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:48:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:48:00 localhost ceph-mon[286721]: 
log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:00 localhost podman[301479]: Feb 1 04:48:00 localhost podman[301479]: 2026-02-01 09:48:00.915583761 +0000 UTC m=+0.078095143 container create 47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_easley, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, release=1764794109, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.expose-services=) Feb 1 04:48:00 localhost systemd[1]: Started libpod-conmon-47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11.scope. Feb 1 04:48:00 localhost systemd[1]: Started libcrun container. Feb 1 04:48:00 localhost podman[301479]: 2026-02-01 09:48:00.878514711 +0000 UTC m=+0.041026123 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:48:00 localhost podman[301479]: 2026-02-01 09:48:00.978659039 +0000 UTC m=+0.141170381 container init 47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_easley, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 04:48:00 localhost podman[301479]: 2026-02-01 09:48:00.989588236 +0000 UTC m=+0.152099578 container start 47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_easley, 
url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, release=1764794109, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, RELEASE=main, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5) Feb 1 04:48:00 localhost podman[301479]: 2026-02-01 09:48:00.989849884 +0000 UTC m=+0.152361236 container attach 47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_easley, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=rhceph, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64) Feb 1 04:48:00 localhost gifted_easley[301494]: 167 167 Feb 1 04:48:00 localhost systemd[1]: libpod-47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11.scope: Deactivated successfully. 
Feb 1 04:48:00 localhost podman[301479]: 2026-02-01 09:48:00.993920919 +0000 UTC m=+0.156432301 container died 47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_easley, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, release=1764794109, name=rhceph, version=7, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 1 04:48:01 localhost podman[301499]: 2026-02-01 09:48:01.097272987 +0000 UTC m=+0.089597996 container remove 47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_easley, name=rhceph, release=1764794109, com.redhat.component=rhceph-container, ceph=True, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.41.4, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7) Feb 1 04:48:01 localhost systemd[1]: libpod-conmon-47b4cfd9ded140207649eb73b1ff9c493ad791a80bb31cc9c7cddc1466d0be11.scope: Deactivated successfully. Feb 1 04:48:01 localhost systemd[1]: var-lib-containers-storage-overlay-ce4b4c72911b474af4e112c6dad88de7e3e189d98daf891522413a3694b55906-merged.mount: Deactivated successfully. 
Feb 1 04:48:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:48:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:48:01 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:48:01 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 1 04:48:01 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:48:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 1 04:48:01 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch Feb 1 04:48:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:48:01 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 1 04:48:01 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)... 
Feb 1 04:48:01 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:01 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 1 04:48:01 localhost openstack_network_exporter[239441]: ERROR 09:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:48:01 localhost openstack_network_exporter[239441]: Feb 1 04:48:01 localhost openstack_network_exporter[239441]: ERROR 09:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:48:01 localhost openstack_network_exporter[239441]: Feb 1 04:48:01 localhost podman[301570]: Feb 1 04:48:01 localhost podman[301570]: 2026-02-01 09:48:01.811365485 +0000 UTC m=+0.076904777 container create 02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_sinoussi, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, release=1764794109, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 04:48:01 localhost systemd[1]: Started libpod-conmon-02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37.scope. Feb 1 04:48:01 localhost systemd[1]: Started libcrun container. 
Feb 1 04:48:01 localhost podman[301570]: 2026-02-01 09:48:01.877944141 +0000 UTC m=+0.143483443 container init 02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_sinoussi, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, release=1764794109, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True) Feb 1 04:48:01 localhost podman[301570]: 2026-02-01 09:48:01.779602147 +0000 UTC m=+0.045141479 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:48:01 localhost podman[301570]: 2026-02-01 09:48:01.888213037 +0000 UTC m=+0.153752339 container start 02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_sinoussi, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-12-08T17:28:53Z, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64) Feb 1 04:48:01 localhost podman[301570]: 2026-02-01 09:48:01.888566868 +0000 UTC m=+0.154106160 container attach 02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_sinoussi, description=Red Hat Ceph Storage 7, release=1764794109, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , 
com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True) Feb 1 04:48:01 localhost fervent_sinoussi[301585]: 167 167 Feb 1 04:48:01 localhost systemd[1]: libpod-02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37.scope: Deactivated successfully. Feb 1 04:48:01 localhost podman[301570]: 2026-02-01 09:48:01.891068105 +0000 UTC m=+0.156607457 container died 02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_sinoussi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc.) 
Feb 1 04:48:01 localhost podman[301590]: 2026-02-01 09:48:01.992736061 +0000 UTC m=+0.087579424 container remove 02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=fervent_sinoussi, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, GIT_BRANCH=main, release=1764794109) Feb 1 04:48:01 localhost systemd[1]: libpod-conmon-02220016ab5b4f0f5bb03409b744ffff08ca8f938b76d3951c244e0f9e93ba37.scope: Deactivated successfully. Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:48:02 localhost systemd[1]: var-lib-containers-storage-overlay-521f931ca72d5e24b931498317ebd8a581b36d3c09000f41dd8178af6598f00a-merged.mount: Deactivated successfully. Feb 1 04:48:02 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)... 
Feb 1 04:48:02 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain Feb 1 04:48:02 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:48:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) 
log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:48:03 localhost nova_compute[274651]: 2026-02-01 09:48:03.023 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 1 04:48:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:48:03 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:48:03 localhost ceph-mon[286721]: Reconfiguring crash.np0005604213 (monmap changed)... 
Feb 1 04:48:03 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:03 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: Reconfig service osd.default_drive_group
Feb 1 04:48:04 localhost ceph-mon[286721]: Reconfiguring osd.0 (monmap changed)...
Feb 1 04:48:04 localhost ceph-mon[286721]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 1 04:48:04 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps'
Feb 1 04:48:04 localhost ceph-mon[286721]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e89 do_prune osdmap full prune enabled
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Activating manager daemon np0005604213.caiaeh
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 e90: 6 total, 6 up, 6 in
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e48: np0005604213.caiaeh(active, starting, since 0.0397891s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).mds e16 all = 0
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).mds e16 all = 0
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).mds e16 all = 0
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).mds e16 all = 1
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata"} : dispatch
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Manager daemon np0005604213.caiaeh is now available
Feb 1 04:48:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:48:04 localhost systemd[1]: session-73.scope: Deactivated successfully.
Feb 1 04:48:04 localhost systemd[1]: session-73.scope: Consumed 16.519s CPU time.
Feb 1 04:48:04 localhost systemd-logind[759]: Session 73 logged out. Waiting for processes to exit.
Feb 1 04:48:04 localhost systemd-logind[759]: Removed session 73.
Feb 1 04:48:04 localhost podman[301607]: 2026-02-01 09:48:04.739299593 +0000 UTC m=+0.087961336 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, distribution-scope=public, vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch
Feb 1 04:48:04 localhost podman[301607]: 2026-02-01 09:48:04.757385549 +0000 UTC m=+0.106047342 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 1 04:48:04 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:48:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} v 0)
Feb 1 04:48:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch
Feb 1 04:48:04 localhost sshd[301626]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:48:04 localhost systemd-logind[759]: New session 74 of user ceph-admin.
Feb 1 04:48:05 localhost systemd[1]: Started Session 74 of User ceph-admin.
Feb 1 04:48:05 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)...
Feb 1 04:48:05 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 1 04:48:05 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:48:05 localhost ceph-mon[286721]: Activating manager daemon np0005604213.caiaeh
Feb 1 04:48:05 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/1659607' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 1 04:48:05 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 1 04:48:05 localhost ceph-mon[286721]: Manager daemon np0005604213.caiaeh is now available
Feb 1 04:48:05 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch
Feb 1 04:48:05 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch
Feb 1 04:48:05 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e49: np0005604213.caiaeh(active, since 1.06946s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 1 04:48:06 localhost systemd[1]: tmp-crun.n2frnK.mount: Deactivated successfully.
Feb 1 04:48:06 localhost podman[301734]: 2026-02-01 09:48:06.068295427 +0000 UTC m=+0.098678545 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, architecture=x86_64, version=7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, maintainer=Guillaume Abrioux , ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, release=1764794109, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:06 localhost podman[301734]: 2026-02-01 09:48:06.206313961 +0000 UTC m=+0.236697109 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, vendor=Red Hat, Inc., distribution-scope=public, release=1764794109, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Cluster is now healthy
Feb 1 04:48:06 localhost ceph-mon[286721]: [01/Feb/2026:09:48:05] ENGINE Bus STARTING
Feb 1 04:48:06 localhost ceph-mon[286721]: [01/Feb/2026:09:48:06] ENGINE Serving on https://172.18.0.107:7150
Feb 1 04:48:06 localhost ceph-mon[286721]: [01/Feb/2026:09:48:06] ENGINE Client ('172.18.0.107', 42754) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 1 04:48:06 localhost ceph-mon[286721]: [01/Feb/2026:09:48:06] ENGINE Serving on http://172.18.0.107:8765
Feb 1 04:48:06 localhost ceph-mon[286721]: [01/Feb/2026:09:48:06] ENGINE Bus STARTED
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:07 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e50: np0005604213.caiaeh(active, since 2s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 1 04:48:07 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 1 04:48:07 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 1 04:48:07 localhost ceph-mon[286721]: Cluster is now healthy
Feb 1 04:48:07 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:07 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:07 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:07 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:07 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:07 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.057 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.059 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.060 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.060 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.062 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.062 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:48:08 localhost nova_compute[274651]: 2026-02-01 09:48:08.066 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 1 04:48:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 1 04:48:09 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 1 04:48:09 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:48:09 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 1 04:48:09 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 1 04:48:09 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 1 04:48:09 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 1 04:48:09 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 1 04:48:09 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 1 04:48:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e51: np0005604213.caiaeh(active, since 4s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 1 04:48:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : Standby manager daemon np0005604209.isqrps started
Feb 1 04:48:10 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:48:10 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:48:10 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 1 04:48:10 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e52: np0005604213.caiaeh(active, since 5s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm, np0005604209.isqrps
Feb 1 04:48:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} v 0)
Feb 1 04:48:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 1 04:48:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:48:10 localhost podman[302528]: 2026-02-01 09:48:10.994875522 +0000 UTC m=+0.096776427 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 1 04:48:11 localhost podman[302528]: 2026-02-01 09:48:11.034666205 +0000 UTC m=+0.136567100 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 1 04:48:11 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:48:11 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:48:11 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:48:11 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 1 04:48:11 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:48:11 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:48:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:12 localhost podman[302706]:
Feb 1 04:48:12 localhost podman[302706]: 2026-02-01 09:48:12.207823818 +0000 UTC m=+0.069780117 container create c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_ellis, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, RELEASE=main)
Feb 1 04:48:12 localhost systemd[1]: Started libpod-conmon-c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18.scope.
Feb 1 04:48:12 localhost systemd[1]: Started libcrun container.
Feb 1 04:48:12 localhost podman[302706]: 2026-02-01 09:48:12.177401232 +0000 UTC m=+0.039357551 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:48:12 localhost podman[302706]: 2026-02-01 09:48:12.280706449 +0000 UTC m=+0.142662748 container init c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_ellis, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, version=7, build-date=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, release=1764794109, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 1 04:48:12 localhost podman[302706]: 2026-02-01 09:48:12.290368946 +0000 UTC m=+0.152325235 container start c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_ellis, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-12-08T17:28:53Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True)
Feb 1 04:48:12 localhost podman[302706]: 2026-02-01 09:48:12.290573432 +0000 UTC m=+0.152529721 container attach c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_ellis, ceph=True, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1764794109, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container)
Feb 1 04:48:12 localhost compassionate_ellis[302721]: 167 167
Feb 1 04:48:12 localhost systemd[1]: libpod-c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18.scope: Deactivated successfully.
Feb 1 04:48:12 localhost podman[302706]: 2026-02-01 09:48:12.294348739 +0000 UTC m=+0.156305058 container died c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_ellis, release=1764794109, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, name=rhceph, RELEASE=main, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 1 04:48:12 localhost podman[302726]: 2026-02-01 09:48:12.386947896 +0000 UTC m=+0.083435987 container remove c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_ellis, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main)
Feb 1 04:48:12 localhost systemd[1]: libpod-conmon-c4e469b448dd166a1fbf6dbf89b758486d2a5d8f5564103f384b1e3ffba7fd18.scope: Deactivated successfully.
Feb 1 04:48:12 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 1 04:48:12 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:48:12 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:48:12 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 1 04:48:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 1 04:48:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:48:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:12 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:13 localhost nova_compute[274651]: 2026-02-01 09:48:13.104 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:48:13 localhost nova_compute[274651]: 2026-02-01 09:48:13.106 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:48:13 localhost nova_compute[274651]: 2026-02-01 09:48:13.106 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5040 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:48:13 localhost nova_compute[274651]: 2026-02-01 09:48:13.106 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:48:13 localhost nova_compute[274651]: 2026-02-01 09:48:13.107 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:48:13 localhost nova_compute[274651]: 2026-02-01 09:48:13.110 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:48:13 localhost systemd[1]: var-lib-containers-storage-overlay-783a3c740f11990bb90a636b6391f1a5cee3aae180f59189ad917d701ff9e210-merged.mount: Deactivated successfully.
Feb 1 04:48:13 localhost podman[302803]:
Feb 1 04:48:13 localhost podman[302803]: 2026-02-01 09:48:13.23623961 +0000 UTC m=+0.050982099 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 1 04:48:13 localhost podman[302803]: 2026-02-01 09:48:13.370124667 +0000 UTC m=+0.184867136 container create 7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lamport, vendor=Red Hat, Inc., name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, CEPH_POINT_RELEASE=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4)
Feb 1 04:48:13 localhost systemd[1]: Started libpod-conmon-7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd.scope.
Feb 1 04:48:13 localhost ceph-mon[286721]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 1 04:48:13 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 1 04:48:13 localhost systemd[1]: Started libcrun container.
Feb 1 04:48:13 localhost podman[302803]: 2026-02-01 09:48:13.451105587 +0000 UTC m=+0.265848056 container init 7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lamport, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True)
Feb 1 04:48:13 localhost podman[302803]: 2026-02-01 09:48:13.459297869 +0000 UTC m=+0.274040348 container start 7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lamport, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container)
Feb 1 04:48:13 localhost podman[302803]: 2026-02-01 09:48:13.459532246 +0000 UTC m=+0.274274715 container attach 7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lamport, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7)
Feb 1 04:48:13 localhost objective_lamport[302818]: 167 167
Feb 1 04:48:13 localhost systemd[1]: libpod-7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd.scope: Deactivated successfully.
Feb 1 04:48:13 localhost podman[302803]: 2026-02-01 09:48:13.463873359 +0000 UTC m=+0.278615838 container died 7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lamport, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z)
Feb 1 04:48:13 localhost podman[302823]: 2026-02-01 09:48:13.556019332 +0000 UTC m=+0.078896636 container remove 7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=objective_lamport, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, maintainer=Guillaume Abrioux , distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, name=rhceph)
Feb 1 04:48:13 localhost systemd[1]: libpod-conmon-7cc7279f31c5b0e155e5aae8f0386d660cfbaad37f2ef092df5006857c0693cd.scope: Deactivated successfully.
Feb 1 04:48:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:13 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:13 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:13 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:13 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 1 04:48:13 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:48:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:13 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:14 localhost systemd[1]: var-lib-containers-storage-overlay-334dcd560eb5150af84c616670da1b6626fa1fe87618bf1b9831efa5722f6207-merged.mount: Deactivated successfully.
Feb 1 04:48:14 localhost ceph-mon[286721]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 1 04:48:14 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:48:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:14 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:15 localhost ceph-mon[286721]: Reconfiguring osd.3 (monmap changed)...
Feb 1 04:48:15 localhost ceph-mon[286721]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 1 04:48:15 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:48:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 1 04:48:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:48:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 1 04:48:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 1 04:48:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:16 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 1 04:48:16 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 1 04:48:16 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:16 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:16 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:48:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:16 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:16 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 1 04:48:16 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:48:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:16 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:17 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 1 04:48:17 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 1 04:48:17 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:17 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:17 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 1 04:48:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:17 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:17 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Feb 1 04:48:17 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:48:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:17 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:18 localhost nova_compute[274651]: 2026-02-01 09:48:18.109 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 1 04:48:18 localhost ceph-mon[286721]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 1 04:48:18 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 1 04:48:18 localhost ceph-mon[286721]: Reconfiguring osd.2 (monmap changed)...
Feb 1 04:48:18 localhost ceph-mon[286721]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 1 04:48:18 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 1 04:48:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 1 04:48:19 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 1 04:48:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:48:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:19 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:20 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:20 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 1 04:48:20 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:48:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 1 04:48:20 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 1 04:48:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:20 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:20 localhost ceph-mon[286721]: Reconfiguring osd.5 (monmap changed)...
Feb 1 04:48:20 localhost ceph-mon[286721]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 1 04:48:20 localhost ceph-mon[286721]: Saving service mon spec with placement label:mon
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:20 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 1 04:48:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:48:20 localhost systemd[1]: tmp-crun.ULjLy9.mount: Deactivated successfully.
Feb 1 04:48:20 localhost podman[302846]: 2026-02-01 09:48:20.735570542 +0000 UTC m=+0.093859928 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:48:20 localhost podman[302846]: 2026-02-01 09:48:20.750387947 +0000 UTC m=+0.108677323 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:48:20 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:48:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.197426) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301197496, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1775, "num_deletes": 256, "total_data_size": 4276054, "memory_usage": 4411752, "flush_reason": "Manual Compaction"}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301215906, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3596481, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21260, "largest_seqno": 23034, "table_properties": {"data_size": 3588748, "index_size": 4366, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19766, "raw_average_key_size": 21, "raw_value_size": 3571672, "raw_average_value_size": 3907, "num_data_blocks": 188, "num_entries": 914, "num_filter_entries": 914, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939264, "oldest_key_time": 1769939264, "file_creation_time": 1769939301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 18590 microseconds, and 8329 cpu microseconds.
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.215971) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3596481 bytes OK
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.216050) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.217858) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.217886) EVENT_LOG_v1 {"time_micros": 1769939301217876, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.217915) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 4267441, prev total WAL file size 4267441, number of live WAL files 2.
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.219211) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353131' seq:72057594037927935, type:22 .. '6B760031373637' seq:0, type:0; will stop at (end)
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3512KB)], [36(20MB)]
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301219267, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 24642293, "oldest_snapshot_seqno": -1}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 11817 keys, 23670791 bytes, temperature: kUnknown
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301338183, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 23670791, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 23602473, "index_size": 37689, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29573, "raw_key_size": 318999, "raw_average_key_size": 26, "raw_value_size": 23400066, "raw_average_value_size": 1980, "num_data_blocks": 1427, "num_entries": 11817, "num_filter_entries": 11817, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.338648) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 23670791 bytes
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.340542) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 206.9 rd, 198.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 20.1 +0.0 blob) out(22.6 +0.0 blob), read-write-amplify(13.4) write-amplify(6.6) OK, records in: 12315, records dropped: 498 output_compression: NoCompression
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.340591) EVENT_LOG_v1 {"time_micros": 1769939301340573, "job": 20, "event": "compaction_finished", "compaction_time_micros": 119112, "compaction_time_cpu_micros": 56951, "output_level": 6, "num_output_files": 1, "total_output_size": 23670791, "num_input_records": 12315, "num_output_records": 11817, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301341232, "job": 20, "event": "table_file_deletion", "file_number": 38}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301344568, "job": 20, "event": "table_file_deletion", "file_number": 36}
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.219124) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.344699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.344708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.344711) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.344714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:48:21 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:48:21.344716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:48:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 1 04:48:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 1 04:48:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 1 04:48:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:48:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 1 04:48:21 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 1 04:48:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:21 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:21 localhost ceph-mon[286721]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 1 04:48:21 localhost ceph-mon[286721]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 1 04:48:21 localhost ceph-mon[286721]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 1 04:48:21 localhost ceph-mon[286721]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain Feb 1 04:48:21 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:21 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:21 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: Reconfiguring mon.np0005604215 (monmap changed)... 
Feb 1 04:48:22 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain Feb 1 04:48:22 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 1 04:48:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 1 04:48:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 1 04:48:23 localhost nova_compute[274651]: 2026-02-01 09:48:23.114 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:23 localhost nova_compute[274651]: 2026-02-01 09:48:23.116 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:23 localhost nova_compute[274651]: 2026-02-01 09:48:23.116 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:48:23 localhost nova_compute[274651]: 2026-02-01 09:48:23.116 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:23 localhost nova_compute[274651]: 2026-02-01 09:48:23.139 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:23 localhost nova_compute[274651]: 2026-02-01 09:48:23.141 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:23 localhost podman[302941]: Feb 1 04:48:23 localhost podman[302941]: 2026-02-01 09:48:23.277952645 +0000 UTC m=+0.076071020 container create a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_sutherland, 
org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux , vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, name=rhceph, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z) Feb 1 04:48:23 localhost systemd[1]: Started libpod-conmon-a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015.scope. Feb 1 04:48:23 localhost systemd[1]: Started libcrun container. Feb 1 04:48:23 localhost podman[302941]: 2026-02-01 09:48:23.246023754 +0000 UTC m=+0.044142169 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 1 04:48:23 localhost podman[302941]: 2026-02-01 09:48:23.361245876 +0000 UTC m=+0.159364251 container init a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_sutherland, ceph=True, release=1764794109, maintainer=Guillaume Abrioux , vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container) Feb 1 04:48:23 localhost podman[302941]: 2026-02-01 09:48:23.372238695 +0000 UTC m=+0.170357080 container start a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_sutherland, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , 
org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, RELEASE=main, version=7) Feb 1 04:48:23 localhost podman[302941]: 2026-02-01 09:48:23.374069381 +0000 UTC m=+0.172187806 container attach a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_sutherland, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=1764794109, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=7, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 04:48:23 localhost sleepy_sutherland[302956]: 167 167 Feb 1 04:48:23 localhost systemd[1]: libpod-a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015.scope: Deactivated successfully. 
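
The short-lived rhceph container above (create → init → start → attach → died → remove, all within ~200 ms) exists only to print "167 167" — the uid/gid pair the ceph user and group map to inside the image. A minimal sketch of how such a probe can be scripted with podman follows; the exact command cephadm runs inside the image is an assumption here, and /var/lib/ceph is only one plausible path to stat — just the image name and the 167/167 result come from this log.

    import subprocess

    def probe_ceph_uid_gid(image="registry.redhat.io/rhceph/rhceph-7-rhel9:latest"):
        # Run a throwaway container and stat a path owned by the ceph user.
        # The stat target is a hypothetical stand-in for cephadm's probe.
        out = subprocess.run(
            ["podman", "run", "--rm", "--entrypoint", "stat", image,
             "-c", "%u %g", "/var/lib/ceph"],
            capture_output=True, text=True, check=True,
        )
        uid, gid = out.stdout.split()
        return int(uid), int(gid)

    print(probe_ceph_uid_gid())  # the log shows the container printing "167 167"
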
Feb 1 04:48:23 localhost podman[302941]: 2026-02-01 09:48:23.376773164 +0000 UTC m=+0.174891579 container died a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_sutherland, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph) Feb 1 04:48:23 localhost podman[302961]: 2026-02-01 09:48:23.475402077 +0000 UTC m=+0.088794162 container remove a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_sutherland, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, release=1764794109, io.buildah.version=1.41.4, name=rhceph, maintainer=Guillaume Abrioux , build-date=2025-12-08T17:28:53Z, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z) Feb 1 04:48:23 localhost systemd[1]: libpod-conmon-a3fefcd17f8824d496363c365ce91ec221e5bc1eda5f11b96f8954980de29015.scope: Deactivated successfully. Feb 1 04:48:23 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 1 04:48:23 localhost ceph-mon[286721]: Reconfiguring mon.np0005604212 (monmap changed)... 
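
Each handle_command/audit pair above is a structured mon command ({"prefix": ...}) dispatched by the mgr on cephadm's behalf. The same commands can be issued programmatically through librados' mon_command(); a minimal sketch, assuming the python3-rados bindings and an admin keyring are available on the host:

    import json
    import rados  # python3-rados, assumed installed alongside the ceph client

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    for cmd in ({"prefix": "auth get", "entity": "mon."},
                {"prefix": "config generate-minimal-conf"}):
        # mon_command takes the command as a JSON string plus an input buffer
        # and returns (retcode, output bytes, status string).
        ret, outbuf, outs = cluster.mon_command(json.dumps(cmd), b"")
        print(cmd["prefix"], "->", ret, (outbuf or outs.encode()).decode())
    cluster.shutdown()

Each such call lands in the cluster audit log exactly like the entries above, just with a client identity in from= instead of the mgr.
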
Feb 1 04:48:23 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 1 04:48:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 1 04:48:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 1 04:48:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 1 04:48:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:48:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 1 04:48:23 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 1 04:48:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 1 04:48:23 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 1 04:48:23 localhost podman[236886]: time="2026-02-01T09:48:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:48:23 localhost podman[236886]: @ - - [01/Feb/2026:09:48:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:48:24 localhost podman[236886]: @ - - [01/Feb/2026:09:48:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18811 "" "Go-http-client/1.1"
Feb 1 04:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-78f391f73eb7bb53fe894612e55c1735b2330e022bd4ff6adde63879bc9d6a2b-merged.mount: Deactivated successfully.
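
The podman[236886] access-log entries above show the libpod REST API being polled over the service socket (the `@` in the client-address field is a unix-socket peer). The same containers/json query can be reproduced with nothing but the standard library; the socket path below is an assumption, since the log records only the HTTP side:

    import http.client
    import json
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """http.client over an AF_UNIX socket instead of TCP."""
        def __init__(self, path):
            super().__init__("localhost")  # host is unused; connect() is overridden
            self._path = path

        def connect(self):
            sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            sock.connect(self._path)
            self.sock = sock

    conn = UnixHTTPConnection("/run/podman/podman.sock")  # assumed socket path
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true&external=false")
    resp = conn.getresponse()
    print(resp.status, len(json.loads(resp.read())), "containers")
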
Feb 1 04:48:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 1 04:48:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 1 04:48:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:24 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:24 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:24 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 1 04:48:24 localhost ceph-mon[286721]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 1 04:48:24 localhost ceph-mon[286721]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 1 04:48:24 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:24 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:48:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:25 localhost ceph-mon[286721]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh'
Feb 1 04:48:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
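
The _set_new_cache_sizes entry above repeats roughly every five seconds for the rest of this capture with identical figures. When auditing a window like this one it helps to reduce those entries to numbers; a small parser, with the pattern built directly from the lines in this log (the input filename is a hypothetical capture of it):

    import re

    PAT = re.compile(r"_set_new_cache_sizes cache_size:(\d+) "
                     r"inc_alloc: (\d+) full_alloc: (\d+) kv_alloc: (\d+)")

    def cache_sizes(lines):
        # Yields (cache_size, inc_alloc, full_alloc, kv_alloc) per matching entry.
        for line in lines:
            m = PAT.search(line)
            if m:
                yield tuple(int(g) for g in m.groups())

    with open("messages") as f:  # hypothetical file holding this log
        for sizes in cache_sizes(f):
            print(sizes)
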
Feb 1 04:48:26 localhost podman[302977]: 2026-02-01 09:48:26.729147323 +0000 UTC m=+0.081841767 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:48:26 localhost podman[302977]: 2026-02-01 09:48:26.765450319 +0000 UTC m=+0.118144753 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:48:26 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:48:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:48:27 localhost podman[302993]: 2026-02-01 09:48:27.731637628 +0000 UTC m=+0.085616143 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:48:27 localhost podman[302993]: 2026-02-01 09:48:27.772882256 +0000 UTC m=+0.126860721 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:48:27 localhost podman[302994]: 2026-02-01 09:48:27.785498164 +0000 UTC m=+0.135313001 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS) Feb 1 04:48:27 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
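
Every `Started /usr/bin/podman healthcheck run <id>` / `Deactivated successfully` pair in this window is a systemd-scheduled execution of the container's configured check (the `'test': '/openstack/healthcheck'` entry in the config_data above). The same check can be run by hand; podman reports health via the exit code, 0 for healthy and non-zero for unhealthy. A sketch, reusing the node_exporter container ID from the log purely as an example:

    import subprocess

    # Container ID copied from the log; substitute any container with a
    # configured healthcheck.
    cid = "385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3"
    rc = subprocess.run(["podman", "healthcheck", "run", cid]).returncode
    print("healthy" if rc == 0 else f"unhealthy (rc={rc})")
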
Feb 1 04:48:27 localhost podman[302994]: 2026-02-01 09:48:27.899470889 +0000 UTC m=+0.249285646 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:48:27 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:48:28 localhost nova_compute[274651]: 2026-02-01 09:48:28.141 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:28 localhost nova_compute[274651]: 2026-02-01 09:48:28.143 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:48:31 localhost openstack_network_exporter[239441]: ERROR 09:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:48:31 localhost openstack_network_exporter[239441]: Feb 1 04:48:31 localhost openstack_network_exporter[239441]: ERROR 09:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:48:31 localhost openstack_network_exporter[239441]: Feb 1 04:48:33 localhost nova_compute[274651]: 2026-02-01 09:48:33.143 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:33 localhost nova_compute[274651]: 2026-02-01 09:48:33.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:33 localhost nova_compute[274651]: 2026-02-01 09:48:33.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:48:33 localhost nova_compute[274651]: 2026-02-01 09:48:33.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:33 localhost nova_compute[274651]: 2026-02-01 09:48:33.145 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:33 localhost nova_compute[274651]: 2026-02-01 09:48:33.149 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:48:35 localhost systemd[1]: tmp-crun.LZwg3z.mount: Deactivated successfully. Feb 1 04:48:35 localhost podman[303040]: 2026-02-01 09:48:35.732878802 +0000 UTC m=+0.095205978 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9) Feb 1 04:48:35 localhost podman[303040]: 2026-02-01 09:48:35.751530576 +0000 UTC m=+0.113857762 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, vcs-type=git, version=9.7, architecture=x86_64) Feb 1 04:48:35 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:48:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:48:38 localhost nova_compute[274651]: 2026-02-01 09:48:38.150 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:38 localhost nova_compute[274651]: 2026-02-01 09:48:38.151 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:38 localhost nova_compute[274651]: 2026-02-01 09:48:38.151 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:48:38 localhost nova_compute[274651]: 2026-02-01 09:48:38.152 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:38 localhost nova_compute[274651]: 2026-02-01 09:48:38.179 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:38 localhost nova_compute[274651]: 2026-02-01 09:48:38.180 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:48:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
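
The ovsdbapp cycle a few entries above (idle ~5000 ms → "sending inactivity probe" → "entering IDLE" → POLLIN → "entering ACTIVE") recurs throughout this capture; it is routine keepalive traffic against the local OVSDB at tcp:127.0.0.1:6640, not a reconnect. A toy state machine mirroring just the transitions visible in these lines; only the ~5 s interval is taken from the log, everything else is illustrative rather than the actual ovs.reconnect implementation:

    import time

    class ProbeFSM:
        PROBE_INTERVAL = 5.0  # matches the "idle 500x ms" messages in the log

        def __init__(self):
            self.state = "ACTIVE"
            self.last_rx = time.monotonic()

        def on_receive(self):
            # Any traffic from the server resets the idle clock.
            self.last_rx = time.monotonic()
            if self.state == "IDLE":
                self.state = "ACTIVE"      # "entering ACTIVE"

        def tick(self, send_probe):
            # Called from the poll loop; after ~5 s of silence, probe once.
            if (self.state == "ACTIVE"
                    and time.monotonic() - self.last_rx >= self.PROBE_INTERVAL):
                send_probe()               # "sending inactivity probe"
                self.state = "IDLE"        # "entering IDLE"

If a probe itself went unanswered, the real FSM would drop the connection and back off; that path never triggers anywhere in this log.
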
Feb 1 04:48:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:48:41.712 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:48:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:48:41.712 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:48:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:48:41.713 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:48:41 localhost podman[303059]: 2026-02-01 09:48:41.727037242 +0000 UTC m=+0.088390708 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Feb 1 04:48:41 localhost podman[303059]: 2026-02-01 09:48:41.741480676 +0000 UTC m=+0.102834092 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 
'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:48:41 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.181 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.182 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.183 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.183 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.204 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.205 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:48:43 localhost nova_compute[274651]: 2026-02-01 09:48:43.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.373 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.373 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.374 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.374 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.375 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:48:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:48:45 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1197085737' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.827 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.911 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:48:45 localhost nova_compute[274651]: 2026-02-01 09:48:45.911 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.151 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.153 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11442MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", 
"vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.153 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.154 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:48:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.219 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.219 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.220 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.255 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:48:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:48:46 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2785224664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.685 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.691 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.710 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.713 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:48:46 localhost nova_compute[274651]: 2026-02-01 09:48:46.713 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:48:47 localhost nova_compute[274651]: 2026-02-01 09:48:47.711 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:48:47 localhost nova_compute[274651]: 2026-02-01 09:48:47.711 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:48:48 localhost nova_compute[274651]: 2026-02-01 09:48:48.205 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:48 localhost nova_compute[274651]: 2026-02-01 09:48:48.208 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:48:48 localhost nova_compute[274651]: 2026-02-01 09:48:48.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task 
ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:48:49 localhost nova_compute[274651]: 2026-02-01 09:48:49.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:48:49 localhost nova_compute[274651]: 2026-02-01 09:48:49.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:48:49 localhost nova_compute[274651]: 2026-02-01 09:48:49.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:48:50 localhost nova_compute[274651]: 2026-02-01 09:48:50.064 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:48:50 localhost nova_compute[274651]: 2026-02-01 09:48:50.065 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:48:50 localhost nova_compute[274651]: 2026-02-01 09:48:50.065 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:48:50 localhost nova_compute[274651]: 2026-02-01 09:48:50.066 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:48:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
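
The req-fc03cb38… request ID above threads through one full pass of nova-compute's periodic tasks in this window (_reclaim_queued_deletes, update_available_resource, _heal_instance_info_cache, and so on). A quick way to tally which tasks fired in a captured window, again as a log-side sketch with a hypothetical input file:

    import re
    from collections import Counter

    # Matches the "Running periodic task ComputeManager.<name>" entries above.
    TASK = re.compile(r"Running periodic task (ComputeManager\.\w+)")

    def count_tasks(lines):
        return Counter(m.group(1) for line in lines for m in TASK.finditer(line))

    with open("messages") as f:  # hypothetical file holding this log
        for task, n in count_tasks(f).most_common():
            print(f"{n:4d} {task}")
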
Feb 1 04:48:51 localhost podman[303122]: 2026-02-01 09:48:51.716451759 +0000 UTC m=+0.080617790 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:48:51 localhost podman[303122]: 2026-02-01 09:48:51.727863879 +0000 UTC m=+0.092029910 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:48:51 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
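
podman_exporter above completes the set of metrics endpoints this node publishes according to the config_data in these entries: 9100 (node_exporter), 9105 (openstack_network_exporter) and 9882 (podman_exporter). The healthchecks only confirm the processes respond; scraping the ports directly is a complementary check. A sketch, with host and timeout as assumptions:

    import urllib.request

    for port in (9100, 9105, 9882):  # ports taken from the config_data above
        try:
            with urllib.request.urlopen(f"http://localhost:{port}/metrics",
                                        timeout=2) as resp:
                print(port, resp.status, len(resp.read()), "bytes")
        except OSError as exc:
            print(port, "unreachable:", exc)
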
Feb 1 04:48:51 localhost nova_compute[274651]: 2026-02-01 09:48:51.940 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 04:48:51 localhost nova_compute[274651]: 2026-02-01 09:48:51.959 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:48:51 localhost nova_compute[274651]: 2026-02-01 09:48:51.960 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 04:48:51 localhost nova_compute[274651]: 2026-02-01 09:48:51.961 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:48:51 localhost nova_compute[274651]: 2026-02-01 09:48:51.961 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:48:53 localhost nova_compute[274651]: 2026-02-01 09:48:53.210 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:48:53 localhost nova_compute[274651]: 2026-02-01 09:48:53.212 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:48:53 localhost nova_compute[274651]: 2026-02-01 09:48:53.212 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:48:53 localhost nova_compute[274651]: 2026-02-01 09:48:53.212 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:48:53 localhost nova_compute[274651]: 2026-02-01 09:48:53.228 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:48:53 localhost nova_compute[274651]: 2026-02-01 09:48:53.229 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:48:53 localhost podman[236886]: time="2026-02-01T09:48:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:48:53 localhost podman[236886]: @ - - [01/Feb/2026:09:48:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:48:54 localhost podman[236886]: @ - - [01/Feb/2026:09:48:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18812 "" "Go-http-client/1.1"
Feb 1 04:48:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:48:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:48:57 localhost podman[303145]: 2026-02-01 09:48:57.698139554 +0000 UTC m=+0.063779682 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 1 04:48:57 localhost podman[303145]: 2026-02-01 09:48:57.706402208 +0000 UTC m=+0.072042376 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 1 04:48:57 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:48:58 localhost nova_compute[274651]: 2026-02-01 09:48:58.229 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:48:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:48:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
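The ovsdbapp lines above trace the OVS IDL keepalive: after roughly 5 s of silence on tcp:127.0.0.1:6640 the client sends an inactivity probe, the connection drops to IDLE, and the probe reply (the POLLIN on fd 20) returns it to ACTIVE within milliseconds. A sketch of flagging a probe that never gets answered, under the assumption that these DEBUG lines are scanned as plain text:

    def unanswered_probe(lines):
        """Return the last 'entering IDLE' line with no later 'entering ACTIVE'."""
        pending = None
        for line in lines:
            if "entering IDLE" in line:
                pending = line
            elif "entering ACTIVE" in line:
                pending = None
        return pending  # None means every probe in the window was answered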
Feb 1 04:48:58 localhost podman[303163]: 2026-02-01 09:48:58.721498021 +0000 UTC m=+0.081476517 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:48:58 localhost podman[303163]: 2026-02-01 09:48:58.73121793 +0000 UTC m=+0.091196396 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:48:58 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:48:58 localhost podman[303164]: 2026-02-01 09:48:58.774666495 +0000 UTC m=+0.130985578 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 1 04:48:58 localhost podman[303164]: 2026-02-01 09:48:58.849422014 +0000 UTC m=+0.205741127 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible)
Feb 1 04:48:58 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
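Each podman event above embeds the container's full edpm_ansible definition as a config_data=... field, which is a Python dict literal rather than JSON (note the single quotes and True). A sketch for recovering it, assuming, as holds in the lines above, that no brace characters occur inside the dict's string values:

    import ast

    def extract_config_data(line):
        start = line.find("config_data={")
        if start < 0:
            return None
        i = line.index("{", start)
        depth = 0
        for j in range(i, len(line)):
            if line[j] == "{":
                depth += 1
            elif line[j] == "}":
                depth -= 1
                if depth == 0:
                    # Naive brace matching; safe here because the values
                    # in these events contain no literal braces.
                    return ast.literal_eval(line[i:j + 1])
        return None

For the ovn_controller record this returns a dict whose 'volumes' list shows the SELinux-labelled bind mounts (:z, :ro,z) backing the healthcheck.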
Feb 1 04:49:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:49:01 localhost openstack_network_exporter[239441]: ERROR 09:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:49:01 localhost openstack_network_exporter[239441]:
Feb 1 04:49:01 localhost openstack_network_exporter[239441]: ERROR 09:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:49:01 localhost openstack_network_exporter[239441]:
Feb 1 04:49:03 localhost nova_compute[274651]: 2026-02-01 09:49:03.232 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:49:03 localhost nova_compute[274651]: 2026-02-01 09:49:03.232 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:49:03 localhost nova_compute[274651]: 2026-02-01 09:49:03.233 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:49:03 localhost nova_compute[274651]: 2026-02-01 09:49:03.233 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:49:03 localhost nova_compute[274651]: 2026-02-01 09:49:03.233 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:49:03 localhost nova_compute[274651]: 2026-02-01 09:49:03.235 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.528 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.529 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.558 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.559 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c9b20d3-008e-442a-98d0-fa9cdaa7b7a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.529424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3df92d24-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '6c42be80b89e406aca0a2d75e0412fadb59eeb00e9b57881a7e0d9389c2057de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.529424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3df94138-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '5e568d100c5322d3535a69ff56ab805bf27e16b1dfe4309163d555f68b0cc873'}]}, 'timestamp': '2026-02-01 09:49:03.560230', '_unique_id': 'fa8feacf65b04b6a8cd3484fbb42a455'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.562 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.563 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.573 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.574 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
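The traceback above is the standard failure chain when the RabbitMQ endpoint is unreachable: ceilometer's notifier asks oslo.messaging for a connection, the rabbit driver calls into kombu, and kombu re-raises the socket-level ConnectionRefusedError as kombu.exceptions.OperationalError. The identical failure repeats below for every sample polled in this interval, since each publish attempt re-dials the broker. A minimal reproduction sketch, assuming nothing is listening on the target port (the URL here is illustrative, not the node's real transport_url):

    import kombu

    conn = kombu.Connection("amqp://guest:guest@127.0.0.1:5672//")
    try:
        # ensure_connection() is the same call that appears in the
        # impl_rabbit.py frames above.
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        print("broker unreachable:", exc)  # [Errno 111] Connection refused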
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.576 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '304a44df-3b3d-4486-8329-090e22aef46c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.563573', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3dfb630a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.783044117, 'message_signature': '4a913db3761b78e8a7358788f44236b00585d42afe402dc498f2cc11f9306ab7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.563573', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3dfb7980-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.783044117, 'message_signature': '9963a28430cba4f8e3ce5f7dc76213a7c4b901229afde415abdf5b7525474850'}]}, 'timestamp': '2026-02-01 09:49:03.574728', '_unique_id': '3c2fe62b46364c6e8525ea07d48e96f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.577 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7dfcfe4-4d3d-46c9-be64-7cc3c58e8e5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.577448', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3dfc882a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': 'b051d51360ec0c3f93b126cd40f748ee76af1fea3cf8ce2ac431aeb12c345e7a'}]}, 'timestamp': '2026-02-01 09:49:03.581703', '_unique_id': '260950200e5a4923beaba24c30f38394'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '6fec5015-9ed5-497b-afc6-22975b5dc0a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.584209', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3dfd00ac-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': '272eb9572599e8464a038b3394dc60d38de0c7b4a214982c3257b6d984fb0ca1'}]}, 'timestamp': '2026-02-01 09:49:03.584728', '_unique_id': '979b1ed6db0c438d90bde4fe37ff2619'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.585 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.587 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.587 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2eb22e79-ff7c-4494-b47a-5dcb3d5184bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.587426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3dfd7e6a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '8742f8ae05b2ba7504564ccb5a752666b13c8901a5d7d8fd20fa463b3066414a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.587426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3dfd92b0-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': 'f86deea2d66e9af9d1f3b6125ff6f45cdba9a4fc37dd9b32029541031e30478c'}]}, 'timestamp': '2026-02-01 09:49:03.588465', '_unique_id': 'b68e178cac3840f3b9bbb3ed5700453a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.589 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.590 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.591 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.591 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.591 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging 
[-] Could not send notification to notifications. Payload={'message_id': '03f8383a-8a69-460b-a5ef-a8c8e8a3db82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.591189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3dfe121c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.783044117, 'message_signature': '2f1c7b94ab43043301cd299d73191897c0edf58b32da8b0b7050c79ee6daeb35'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.591189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3dfe24aa-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.783044117, 'message_signature': 'cec32d28abbb2168f00cf52cb1d16a5d6640b419be1ae49c316c68df768ace5d'}]}, 'timestamp': '2026-02-01 09:49:03.592205', '_unique_id': '9ecd49d1fb4943ceaa43be7b43377525'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR 
oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.594 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.594 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.595 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '93b87fea-c1d7-467d-be88-551bbd8f22ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.594686', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3dfe9944-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.783044117, 'message_signature': '82766895b833fe771ee2cec8362496dc67ee3c042a01641acdaf87f04d914c9c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.594686', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3dfeabd2-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.783044117, 'message_signature': '347bf06f487f8ca524ef215204adec3e0d39e852a19a50810fd69295b6112591'}]}, 'timestamp': '2026-02-01 09:49:03.595626', '_unique_id': 'b8d256aa51ca414b9a5eed0c7eadd266'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.596 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.598 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.598 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.598 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.600 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c041331-41c3-4b69-aef0-5f36c724e682', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.598748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3dff3a02-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': 'd7e438809d65c09d7a15d503868b94d5bea9b726223318f20310b74f9eadff5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.598748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3dff4bfa-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '5c496e2e4dcddf467e1b6a86ec1fe823190c883a30e76c522a9c0adf2b312c3f'}]}, 'timestamp': '2026-02-01 09:49:03.599728', '_unique_id': 'ca163f53418947b09f2a8679ce8de97b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
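The inner traceback above is kombu's connection bring-up path: retry_over_time() keeps calling the connection factory until the TCP connect fails for good, and _reraise_as_library_errors() wraps the final ConnectionRefusedError in kombu.exceptions.OperationalError. A minimal sketch of that path, assuming a local RabbitMQ URL (the real transport_url comes from ceilometer.conf and never appears in this log):

    # Sketch only: reproduce the OperationalError seen above when nothing
    # is listening on the broker port. The URL below is an assumption.
    from kombu import Connection
    from kombu.exceptions import OperationalError

    conn = Connection('amqp://guest:guest@localhost:5672//')
    try:
        # Same entry point as in the traceback; retry once, then give up
        # instead of backing off for minutes.
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        # kombu re-raises the socket-level ConnectionRefusedError as a
        # library error, which is what oslo.messaging ends up logging.
        print('broker unreachable:', exc)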
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06f9d77f-c603-47bf-a8fd-985715fea173', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.602151', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3dffbefa-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': '11053463388491af63ffc4e02c2d8f5a49ab514f90f02c53abdcd31619d570cf'}]}, 'timestamp': '2026-02-01 09:49:03.602730', '_unique_id': '6cf6a8bde2174a9989ca218bc9e69442'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.604 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.605 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
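The two-part traceback logged above is ordinary Python exception chaining: kombu's raise ConnectionError(str(exc)) from exc sets __cause__ on the new exception, and the formatter prints both parts joined by the "direct cause" separator. A self-contained demonstration of the same mechanism:

    # Demonstrates `raise ... from ...`, the mechanism behind the
    # "The above exception was the direct cause of..." separator above.
    import traceback

    class LibraryError(Exception):
        pass

    try:
        try:
            raise ConnectionRefusedError(111, 'Connection refused')
        except ConnectionRefusedError as exc:
            # Same pattern as kombu's _reraise_as_library_errors()
            raise LibraryError(str(exc)) from exc
    except LibraryError:
        # format_exc() renders both tracebacks, separated by the
        # "direct cause of the following exception" line.
        print(traceback.format_exc())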
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '583b7bd6-f30a-4226-8ebc-34a5d71cccd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.605129', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e003074-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': '977ed5b320414bacaed8f61d8ba2ff81f785fa3079b48b2d97a985444237ba4d'}]}, 'timestamp': '2026-02-01 09:49:03.605681', '_unique_id': '93bc99c9a6a24091a74edc0047d63f1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.607 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
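Every one of these failures bottoms out in a plain sock.connect(sa) returning errno 111, i.e. nothing is accepting TCP connections on the broker endpoint (conventionally rabbitmq on port 5672). A quick probe to confirm that, with host and port as assumptions since the configured transport_url is not shown in this log:

    # Minimal TCP probe for the AMQP listener; a refused connection here
    # matches the ConnectionRefusedError in the tracebacks above.
    import socket

    def broker_listening(host='localhost', port=5672, timeout=3.0):
        # Assumed host/port; read the real ones from transport_url.
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False  # errno 111: nothing bound on the port

    print('rabbitmq reachable:', broker_listening())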
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.609 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce64f25a-fd18-47de-8063-d0aab54da3b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.607703', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e009c12-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': '7c3d5d4b0d2ceca9971436ec384dbcf6107981ea09867875e4ab1eb3863f64f7'}]}, 'timestamp': '2026-02-01 09:49:03.608337', '_unique_id': '1e15e5f36e07495bb0c79f329ba1bdf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
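The outer traceback is oslo.messaging's notifier path: notify() hands the message to the AMQP driver, which pulls a connection from the pool and fails while establishing it. A rough sketch of how a SAMPLE-priority notification like these is emitted, assuming a broker URL (ceilometer wires this up from its own configuration, not literally as below):

    # Illustrative oslo.messaging notifier, approximating what
    # ceilometer.polling does when publishing telemetry.polling samples.
    from oslo_config import cfg
    import oslo_messaging

    # Assumed URL; ceilometer reads transport_url from ceilometer.conf.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url='rabbit://guest:guest@localhost:5672/')
    notifier = oslo_messaging.Notifier(
        transport, publisher_id='ceilometer.polling',
        driver='messagingv2', topics=['notifications'])

    # Notifier.sample() maps to 'priority': 'SAMPLE'; with the broker down
    # the failure is logged, as above, rather than raised to the poller.
    notifier.sample({}, 'telemetry.polling', {'samples': []})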
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3702d21f-8dc7-49ce-8da2-76a393851557', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.609851', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e00e69a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': 'ec12617de7c835573a22260ac5c1d4be4907fb6e06c3099a934fd2e2472b201a'}]}, 'timestamp': '2026-02-01 09:49:03.610180', '_unique_id': '1429e841c35f421aaccdcc9d6a70b15d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.611 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
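Note that each failed notification still logs its complete payload, so the samples are recoverable from the log itself: every sample carries counter_name, counter_type, counter_unit, counter_volume and a resource_id (for per-device meters, '<instance uuid>-<device>', e.g. the vda/vdb latencies just above). A hypothetical recovery helper, assuming payload_text holds one "Payload={...}" literal copied from a line like the one below:

    # Hypothetical helper: the payload dicts are plain Python literals,
    # so ast.literal_eval() can rebuild them from the log text.
    import ast

    def summarize(payload_text):
        payload = ast.literal_eval(payload_text)
        for s in payload['payload']['samples']:
            print(f"{s['resource_id']}: {s['counter_name']} = "
                  f"{s['counter_volume']} {s['counter_unit']} "
                  f"({s['counter_type']})")

    # prints lines like:
    # 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda: disk.device.read.latency = 1484399740 ns (cumulative)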
Payload={'message_id': 'cccf489e-f35b-4b63-99e0-74b012fe738e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.611578', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3e0128ee-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '85ba4ea4924f842b9570ec1a3bd9f0bfa38124939e23ad6c4780c49e1c50e8ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.611578', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3e0135aa-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '69348c3e415c1c4a5c602f22898d6c58d56b5e141870d17409a843cd87b5cd48'}]}, 'timestamp': '2026-02-01 09:49:03.612187', '_unique_id': '585ab9060d244afe9240ae30a946e893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.613 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '0668ab3e-7b9d-4ee3-82de-6a3d16d556e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.613586', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e0177a4-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': 'd0bd81b1c43551af5a6525a804e308628bd8bde9802400252c00bc5bd481e11a'}]}, 'timestamp': '2026-02-01 09:49:03.613893', '_unique_id': 'e02c8168111c4feba76a2f732eed84ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.615 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '2e3ffc39-e0bd-46e1-803e-64146dc3ff0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.615342', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e01bbf6-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': '537cce621f74b934ccebad6f241b6a8d7347c4cdabc558f2a2dff54746bb1f2a'}]}, 'timestamp': '2026-02-01 09:49:03.615640', '_unique_id': '3ff01999f314438c8321df02fce80eae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.616 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'de40072a-5a6c-440f-b769-2fcf97795ae3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.617012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3e01fdc8-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': 'a056fff3dc03cabbcaa30005deb8d7206eedc5e7416fade4303840c23f013a73'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.617012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3e020944-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '7e653141c167a8ec00e431b9ad9a729399e72540ec596c508304f08777b18139'}]}, 'timestamp': '2026-02-01 09:49:03.617602', '_unique_id': 'a781340b1e1f4574bcd280062c598be4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ed0a25c7-8b99-4141-b452-682e80eca5ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.619061', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e025476-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': 'b72a0ddf0bd1dcc0f00ed989af6389e855ca3d7878acc7574facd28085dd625b'}]}, 'timestamp': '2026-02-01 09:49:03.619550', '_unique_id': '6fb1573a271d44b4b491b3a56eef9d57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b4181595-71cc-4323-acfd-db247b253e1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:49:03.621022', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '3e0299ea-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.796903743, 'message_signature': '987159a90bc10ee53e97ca59003cfb6bace48e1b960705cc6abd5cca0ad4b2fb'}]}, 'timestamp': '2026-02-01 09:49:03.621320', '_unique_id': 'f448971856c847e492eb1e364730e140'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.621 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.622 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.622 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14753932-7987-40fe-a152-242e7ae19c56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:49:03.622815', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3e02df9a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '698b12a2562851eb6aaf4a4e4d67b582c4ca1421478ed03b7fe1a7bee0fcf8da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:49:03.622815', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3e02eada-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.748865486, 'message_signature': '00672d47a071aaa3aac562b6e0240a08770b70613c53563f21ff0d7e2a3f0bed'}]}, 'timestamp': '2026-02-01 09:49:03.623377', '_unique_id': 'aa29e796ce53453595af8e9d19ccdba1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.623 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '768d8a38-d5c4-441f-915b-530067ac96bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:49:03.624709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '3e04d5de-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.854917457, 'message_signature': '35571258c0a462eba456e24fce9c1f28c299cd8d99dfb292103fdc9ab4e9a277'}]}, 'timestamp': '2026-02-01 09:49:03.635980', '_unique_id': '547b166f23644e9fa6475de64a1a38da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:49:03.636 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.636 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.637 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 13680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '563c866f-defb-4b09-a459-894c70edb088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13680000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:49:03.637676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '3e05246c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11437.854917457, 'message_signature': '09658999cbf727462ee6f79397946d4e48e989cfeb2370d991085d7fb4a5169b'}]}, 'timestamp': '2026-02-01 09:49:03.637965', '_unique_id': '4efd3e0c6c3b40e9899d7799d196b1d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:49:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:49:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:49:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 04:49:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
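The repeated tracebacks above all reduce to the same failure: the AMQP transport's socket connect gets ECONNREFUSED (errno 111) because nothing is listening on the broker port, and kombu's _reraise_as_library_errors context manager wraps it ("raise ConnectionError(str(exc)) from exc", visible in the traceback) so callers see kombu.exceptions.OperationalError instead of the raw socket error. A minimal sketch of that wrapping, assuming an illustrative host/port with nothing listening; the stand-in exception class and helper below are hypothetical, not kombu's real code:

import socket
from contextlib import contextmanager

class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

@contextmanager
def reraise_as_library_errors():
    # Mirrors the pattern in the traceback: low-level socket/AMQP errors are
    # re-raised as the transport-agnostic library exception, chained with
    # "from exc" so the original ConnectionRefusedError stays attached.
    try:
        yield
    except OSError as exc:
        raise OperationalError(str(exc)) from exc

def establish_connection(host="127.0.0.1", port=5672, timeout=1.0):
    # 5672 is the conventional AMQP port; an assumption for illustration.
    with reraise_as_library_errors():
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()

try:
    establish_connection()
except OperationalError as err:
    # With no broker listening this prints "[Errno 111] Connection refused",
    # the same text oslo.messaging logs for every sample it fails to publish.
    print(err)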
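For reference, each failed notification above carries a telemetry.polling payload whose payload['samples'] list holds one counter per resource: a name, a type (cumulative for ever-increasing counters such as cpu nanoseconds, gauge for point-in-time readings such as memory.usage), a unit, a volume, and a resource_id. A short walk over that structure, with the dict abridged from the memory.usage and cpu samples logged above:

# Abridged from the Payload= dumps in this log; most metadata fields omitted.
payload = {
    "event_type": "telemetry.polling",
    "payload": {
        "samples": [
            {
                "counter_name": "memory.usage",
                "counter_type": "gauge",       # point-in-time reading
                "counter_unit": "MB",
                "counter_volume": 51.63671875,
                "resource_id": "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02",
            },
            {
                "counter_name": "cpu",
                "counter_type": "cumulative",  # monotonically increasing
                "counter_unit": "ns",
                "counter_volume": 13680000000,
                "resource_id": "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02",
            },
        ]
    },
}

for sample in payload["payload"]["samples"]:
    print(f"{sample['resource_id']}: {sample['counter_name']} = "
          f"{sample['counter_volume']} {sample['counter_unit']} "
          f"({sample['counter_type']})")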
Feb 1 04:49:06 localhost podman[303209]: 2026-02-01 09:49:06.717790833 +0000 UTC m=+0.078765852 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, version=9.7, release=1769056855, distribution-scope=public, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:49:06 localhost podman[303209]: 2026-02-01 09:49:06.734466746 +0000 UTC m=+0.095441775 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1769056855, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, version=9.7) Feb 1 04:49:06 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:49:08 localhost nova_compute[274651]: 2026-02-01 09:49:08.236 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:08 localhost nova_compute[274651]: 2026-02-01 09:49:08.238 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:08 localhost nova_compute[274651]: 2026-02-01 09:49:08.239 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:49:08 localhost nova_compute[274651]: 2026-02-01 09:49:08.239 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:08 localhost nova_compute[274651]: 2026-02-01 09:49:08.255 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:08 localhost nova_compute[274651]: 2026-02-01 09:49:08.256 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:49:12 localhost podman[303228]: 2026-02-01 09:49:12.732194306 +0000 UTC m=+0.090452492 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 1 04:49:12 localhost podman[303228]: 2026-02-01 09:49:12.768151922 +0000 UTC m=+0.126410158 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:49:12 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
Feb 1 04:49:13 localhost nova_compute[274651]: 2026-02-01 09:49:13.257 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:49:13 localhost nova_compute[274651]: 2026-02-01 09:49:13.259 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:49:13 localhost nova_compute[274651]: 2026-02-01 09:49:13.259 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:49:13 localhost nova_compute[274651]: 2026-02-01 09:49:13.259 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:49:13 localhost nova_compute[274651]: 2026-02-01 09:49:13.299 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:49:13 localhost nova_compute[274651]: 2026-02-01 09:49:13.300 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:49:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:49:18 localhost nova_compute[274651]: 2026-02-01 09:49:18.299 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.498024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359498061, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 940, "num_deletes": 251, "total_data_size": 906193, "memory_usage": 924232, "flush_reason": "Manual Compaction"}
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359505671, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 870764, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23035, "largest_seqno": 23974, "table_properties": {"data_size": 866270, "index_size": 2093, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10920, "raw_average_key_size": 21, "raw_value_size": 856975, "raw_average_value_size": 1657, "num_data_blocks": 88, "num_entries": 517, "num_filter_entries": 517, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939301, "oldest_key_time": 1769939301, "file_creation_time": 1769939359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 7718 microseconds, and 3262 cpu microseconds.
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.505738) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 870764 bytes OK Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.505767) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.507688) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.507709) EVENT_LOG_v1 {"time_micros": 1769939359507702, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.507732) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 901594, prev total WAL file size 901594, number of live WAL files 2. Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.508403) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end) Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(850KB)], [39(22MB)] Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359508468, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 24541555, "oldest_snapshot_seqno": -1} Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 11808 keys, 20524038 bytes, temperature: kUnknown Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359592567, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 20524038, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20457357, "index_size": 36052, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29573, "raw_key_size": 319449, "raw_average_key_size": 27, "raw_value_size": 20256663, "raw_average_value_size": 1715, "num_data_blocks": 1356, "num_entries": 11808, "num_filter_entries": 11808, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.592940) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 20524038 bytes Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.594863) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 291.4 rd, 243.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 22.6 +0.0 blob) out(19.6 +0.0 blob), read-write-amplify(51.8) write-amplify(23.6) OK, records in: 12334, records dropped: 526 output_compression: NoCompression Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.594915) EVENT_LOG_v1 {"time_micros": 1769939359594881, "job": 22, "event": "compaction_finished", "compaction_time_micros": 84229, "compaction_time_cpu_micros": 49608, "output_level": 6, "num_output_files": 1, "total_output_size": 20524038, "num_input_records": 12334, "num_output_records": 11808, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359595254, "job": 22, "event": "table_file_deletion", "file_number": 41} Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359598920, "job": 22, "event": "table_file_deletion", "file_number": 39} Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.508300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.599045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.599052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.599055) [db/db_impl/db_impl_compaction_flush.cc:1903] 
[default] Manual compaction starting Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.599059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:49:19 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:49:19.599062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:49:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0) Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e90 do_prune osdmap full prune enabled Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Activating manager daemon np0005604215.uhhqtv Feb 1 04:49:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 e91: 6 total, 6 up, 6 in Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e91: 6 total, 6 up, 6 in Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e53: np0005604215.uhhqtv(active, starting, since 0.0372602s), standbys: np0005604212.oynhpm, np0005604209.isqrps Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Manager daemon np0005604215.uhhqtv is now available Feb 1 04:49:21 localhost systemd[1]: session-74.scope: Deactivated successfully. Feb 1 04:49:21 localhost systemd[1]: session-74.scope: Consumed 8.679s CPU time. Feb 1 04:49:21 localhost systemd-logind[759]: Session 74 logged out. Waiting for processes to exit. Feb 1 04:49:21 localhost systemd-logind[759]: Removed session 74. Feb 1 04:49:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} v 0) Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} v 0) Feb 1 04:49:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: Activating manager daemon np0005604215.uhhqtv Feb 1 04:49:21 localhost ceph-mon[286721]: from='client.? 172.18.0.200:0/2103452742' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 1 04:49:21 localhost ceph-mon[286721]: Manager daemon np0005604215.uhhqtv is now available Feb 1 04:49:21 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch Feb 1 04:49:21 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch Feb 1 04:49:21 localhost sshd[303247]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:49:21 localhost systemd-logind[759]: New session 75 of user ceph-admin. Feb 1 04:49:21 localhost systemd[1]: Started Session 75 of User ceph-admin. Feb 1 04:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:49:21 localhost podman[303269]: 2026-02-01 09:49:21.930948901 +0000 UTC m=+0.098041516 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:49:21 localhost podman[303269]: 2026-02-01 09:49:21.940570167 +0000 UTC m=+0.107662812 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible) Feb 1 04:49:21 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:49:22 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e54: np0005604215.uhhqtv(active, since 1.10046s), standbys: np0005604212.oynhpm, np0005604209.isqrps Feb 1 04:49:22 localhost podman[303379]: 2026-02-01 09:49:22.757846137 +0000 UTC m=+0.100496571 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , distribution-scope=public) Feb 1 04:49:22 localhost podman[303379]: 2026-02-01 09:49:22.856017336 +0000 UTC m=+0.198667740 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not 
managed by cephadm) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(cluster) log [INF] : Cluster is now healthy Feb 1 04:49:23 localhost nova_compute[274651]: 2026-02-01 09:49:23.301 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:23 localhost nova_compute[274651]: 2026-02-01 09:49:23.303 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:23 localhost nova_compute[274651]: 2026-02-01 09:49:23.304 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:49:23 localhost nova_compute[274651]: 2026-02-01 09:49:23.304 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:23 localhost nova_compute[274651]: 2026-02-01 09:49:23.325 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:23 localhost nova_compute[274651]: 2026-02-01 09:49:23.326 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:23 localhost ceph-mon[286721]: [01/Feb/2026:09:49:22] ENGINE Bus STARTING Feb 1 04:49:23 localhost ceph-mon[286721]: [01/Feb/2026:09:49:22] ENGINE Serving on http://172.18.0.108:8765 Feb 1 04:49:23 localhost ceph-mon[286721]: [01/Feb/2026:09:49:23] ENGINE Serving on https://172.18.0.108:7150 Feb 1 04:49:23 localhost ceph-mon[286721]: [01/Feb/2026:09:49:23] ENGINE Bus STARTED Feb 1 04:49:23 localhost ceph-mon[286721]: [01/Feb/2026:09:49:23] ENGINE Client ('172.18.0.108', 45716) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e55: np0005604215.uhhqtv(active, since 2s), standbys: np0005604212.oynhpm, np0005604209.isqrps Feb 1 04:49:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:49:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:23 localhost ceph-mon[286721]: 
mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:49:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:23 localhost podman[236886]: time="2026-02-01T09:49:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:49:23 localhost podman[236886]: @ - - [01/Feb/2026:09:49:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:49:24 localhost podman[236886]: @ - - [01/Feb/2026:09:49:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18818 "" "Go-http-client/1.1" Feb 1 04:49:24 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Feb 1 04:49:24 localhost ceph-mon[286721]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Feb 1 04:49:24 localhost ceph-mon[286721]: Cluster is now healthy Feb 1 04:49:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, 
name=osd_memory_target}] v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 1 04:49:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:49:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 1 04:49:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:49:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:49:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 1 04:49:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 1 04:49:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' 
entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M Feb 1 04:49:25 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M Feb 1 04:49:25 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M Feb 1 04:49:25 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:49:25 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:49:25 localhost ceph-mon[286721]: Updating 
np0005604212.localdomain:/etc/ceph/ceph.conf Feb 1 04:49:25 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf Feb 1 04:49:25 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf Feb 1 04:49:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e56: np0005604215.uhhqtv(active, since 4s), standbys: np0005604212.oynhpm, np0005604209.isqrps Feb 1 04:49:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : Standby manager daemon np0005604213.caiaeh started Feb 1 04:49:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e57: np0005604215.uhhqtv(active, since 5s), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh Feb 1 04:49:27 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:49:27 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:49:27 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:49:27 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:27 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:49:28 localhost podman[304280]: 2026-02-01 09:49:28.091116895 +0000 UTC m=+0.084809718 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:49:28 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 1 04:49:28 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 1 04:49:28 localhost podman[304280]: 2026-02-01 09:49:28.122895493 +0000 UTC m=+0.116588236 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:49:28 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:49:28 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:49:28 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 1 04:49:28 localhost ceph-mon[286721]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:49:28 localhost ceph-mon[286721]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:49:28 localhost ceph-mon[286721]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:28 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
Feb 1 04:49:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:49:28 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:49:28 localhost nova_compute[274651]: 2026-02-01 09:49:28.327 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:49:28 localhost nova_compute[274651]: 2026-02-01 09:49:28.329 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:49:28 localhost nova_compute[274651]: 2026-02-01 09:49:28.329 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:49:28 localhost nova_compute[274651]: 2026-02-01 09:49:28.329 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:49:28 localhost nova_compute[274651]: 2026-02-01 09:49:28.365 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:49:28 localhost nova_compute[274651]: 2026-02-01 09:49:28.366 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:49:29 localhost ceph-mon[286721]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 1 04:49:29 localhost ceph-mon[286721]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 1 04:49:29 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:49:29 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:49:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:49:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:49:29 localhost podman[304318]: 2026-02-01 09:49:29.729956317 +0000 UTC m=+0.083214249 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS) Feb 1 04:49:29 localhost podman[304318]: 2026-02-01 09:49:29.801549239 +0000 UTC m=+0.154807181 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 1 04:49:29 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
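
Each "Started /usr/bin/podman healthcheck run <id>" record, the matching container health_status/exec_died events, and the closing "Deactivated successfully" are one pass of a timer-driven transient systemd unit running the container's configured healthcheck ('test': '/openstack/healthcheck' in the config_data above). A minimal sketch of triggering the same check by hand; the container name is taken from the events above, and the exit-status mapping mirrors the health_status=healthy field:

    # run_healthcheck.py (illustrative only): invoke the same
    # "podman healthcheck run" that the transient units in this log execute.
    import subprocess
    import sys

    def run_healthcheck(container):
        # Exit status 0 means healthy, non-zero unhealthy, matching the
        # health_status field podman attaches to the container event.
        rc = subprocess.run(["podman", "healthcheck", "run", container]).returncode
        return "healthy" if rc == 0 else "unhealthy"

    if __name__ == "__main__":
        name = sys.argv[1] if len(sys.argv) > 1 else "ovn_controller"
        print(name, run_healthcheck(name))
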
Feb 1 04:49:29 localhost podman[304317]: 2026-02-01 09:49:29.810316148 +0000 UTC m=+0.167396388 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:49:29 localhost podman[304317]: 2026-02-01 09:49:29.890853705 +0000 UTC m=+0.247933975 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:49:29 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
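
node_exporter's config_data shows 'net': 'host' and 'ports': ['9100:9100'], so its Prometheus endpoint should be reachable on the host at port 9100. A hedged sketch of scraping it and confirming that the systemd collector (enabled above with a unit include list of edpm_*, ovs*, openvswitch, virt* and rsyslog units) is emitting series; the localhost address is an assumption:

    # scrape_node_exporter.py (illustrative only): pull /metrics from the
    # 9100:9100 mapping shown in the container config above.
    import urllib.request

    def scrape(host="localhost", port=9100):  # host is an assumption
        url = f"http://{host}:{port}/metrics"
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.read().decode()

    if __name__ == "__main__":
        lines = scrape().splitlines()
        # node_systemd_unit_state is produced by the --collector.systemd
        # collector enabled on the command line above.
        print("systemd collector active:",
              any(l.startswith("node_systemd_unit_state") for l in lines))
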
Feb 1 04:49:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:49:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:31 localhost openstack_network_exporter[239441]: ERROR 09:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:49:31 localhost openstack_network_exporter[239441]: Feb 1 04:49:31 localhost openstack_network_exporter[239441]: ERROR 09:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:49:31 localhost openstack_network_exporter[239441]: Feb 1 04:49:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:49:33 localhost nova_compute[274651]: 2026-02-01 09:49:33.367 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:49:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2611289140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:49:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:49:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2611289140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:49:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
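
The openstack_network_exporter errors above recur throughout this log and are expected on this node: dpif-netdev/pmd-rxq-show and dpif-netdev/pmd-perf-show only answer for a userspace (netdev) datapath, and this host's bridges run on the kernel datapath, so ovs-vswitchd replies "please specify an existing datapath". A hedged sketch of guarding those calls, assuming ovs-appctl can reach the local ovs-vswitchd control socket:

    # guard_pmd_stats.py (illustrative only): skip the PMD queries unless a
    # userspace (netdev) datapath actually exists, which is what the repeated
    # "please specify an existing datapath" errors above are complaining about.
    import subprocess

    def has_netdev_datapath():
        # "ovs-appctl dpif/show" lists every datapath known to ovs-vswitchd,
        # e.g. "system@ovs-system:" for the kernel datapath or "netdev@..."
        # for the userspace one.
        out = subprocess.run(["ovs-appctl", "dpif/show"],
                             capture_output=True, text=True, check=True).stdout
        return any(line.lstrip().startswith("netdev@")
                   for line in out.splitlines())

    def pmd_rxq_show():
        if not has_netdev_datapath():
            return None  # kernel datapath only; the query cannot succeed
        return subprocess.run(["ovs-appctl", "dpif-netdev/pmd-rxq-show"],
                              capture_output=True, text=True, check=True).stdout

    if __name__ == "__main__":
        print(pmd_rxq_show() or "no netdev datapath; skipping pmd-rxq-show")
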
Feb 1 04:49:37 localhost podman[304364]: 2026-02-01 09:49:37.360012788 +0000 UTC m=+0.079925358 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 04:49:37 localhost podman[304364]: 2026-02-01 09:49:37.371652406 +0000 UTC m=+0.091564946 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Feb 1 04:49:37 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
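
The client.openstack audit entries at 04:49:34 above (the "df" and "osd pool get-quota" dispatches for the volumes pool) are capacity polling against the monitor. A minimal sketch of the same two queries; it assumes the client.openstack keyring and /etc/ceph/ceph.conf are available locally, and the JSON field names follow the standard ceph CLI output:

    # capacity_poll.py (hypothetical helper): reissue the two queries the
    # monitor audits above as mon_command({"prefix":"df",...}) and
    # mon_command({"prefix":"osd pool get-quota",...}) from client.openstack.
    import json
    import subprocess

    def mon_json(*args, ident="openstack", conf="/etc/ceph/ceph.conf"):
        cmd = ["ceph", *args, "--format=json", "--id", ident, "--conf", conf]
        out = subprocess.run(cmd, capture_output=True, text=True, check=True)
        return json.loads(out.stdout)

    if __name__ == "__main__":
        df = mon_json("df")
        quota = mon_json("osd", "pool", "get-quota", "volumes")
        print("cluster avail bytes:", df["stats"]["total_avail_bytes"])
        print("volumes pool quota:", quota)
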
Feb 1 04:49:38 localhost nova_compute[274651]: 2026-02-01 09:49:38.369 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:38 localhost nova_compute[274651]: 2026-02-01 09:49:38.370 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:38 localhost nova_compute[274651]: 2026-02-01 09:49:38.371 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:49:38 localhost nova_compute[274651]: 2026-02-01 09:49:38.371 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:38 localhost nova_compute[274651]: 2026-02-01 09:49:38.401 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:38 localhost nova_compute[274651]: 2026-02-01 09:49:38.402 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:49:41.713 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:49:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:49:41.713 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:49:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:49:41.714 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:49:43 localhost nova_compute[274651]: 2026-02-01 09:49:43.403 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:43 localhost nova_compute[274651]: 2026-02-01 09:49:43.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:43 localhost nova_compute[274651]: 2026-02-01 09:49:43.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:49:43 localhost nova_compute[274651]: 2026-02-01 09:49:43.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:43 localhost nova_compute[274651]: 2026-02-01 09:49:43.438 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:43 localhost nova_compute[274651]: 2026-02-01 09:49:43.439 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:49:43 localhost podman[304384]: 2026-02-01 09:49:43.723824485 +0000 UTC m=+0.087154681 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible) Feb 1 04:49:43 localhost podman[304384]: 2026-02-01 09:49:43.734859194 +0000 UTC m=+0.098189470 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2) Feb 1 04:49:43 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:49:45 localhost nova_compute[274651]: 2026-02-01 09:49:45.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:45 localhost nova_compute[274651]: 2026-02-01 09:49:45.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:49:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:46 localhost nova_compute[274651]: 2026-02-01 09:49:46.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:46 localhost nova_compute[274651]: 2026-02-01 09:49:46.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.291 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:49:47 
localhost nova_compute[274651]: 2026-02-01 09:49:47.292 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.292 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.293 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.293 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:49:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:49:47 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2378768146' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.765 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.845 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:49:47 localhost nova_compute[274651]: 2026-02-01 09:49:47.849 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.041 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.042 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11432MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.043 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.043 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.219 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.220 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.220 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.265 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.440 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.442 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:49:48 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4268203992' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.705 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.711 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.735 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.736 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:49:48 localhost nova_compute[274651]: 2026-02-01 09:49:48.737 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.737 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.896 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.896 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.897 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:49:50
localhost nova_compute[274651]: 2026-02-01 09:49:50.978 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.979 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.979 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:49:50 localhost nova_compute[274651]: 2026-02-01 09:49:50.979 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:49:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:51 localhost nova_compute[274651]: 2026-02-01 09:49:51.598 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:49:51 localhost nova_compute[274651]: 2026-02-01 09:49:51.617 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:49:51 localhost nova_compute[274651]: 2026-02-01 09:49:51.618 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:49:51 localhost nova_compute[274651]: 2026-02-01 09:49:51.618 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:49:52 localhost podman[304448]: 2026-02-01 09:49:52.702170004 +0000 UTC m=+0.062610607 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:49:52 localhost podman[304448]: 2026-02-01 09:49:52.734448486 +0000 UTC m=+0.094889119 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:49:52 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
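
The instance_info_cache payload nova logged just above has a regular shape: one VIF entry carrying the MAC address, the OVN-bound port details, and per-subnet fixed IPs, each with any attached floating IPs. A small sketch that walks that structure; cache.json is assumed to hold the JSON array exactly as logged:

    # list_instance_ips.py (illustrative only): walk the network_info array
    # nova logged for instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 above
    # and print each fixed IP with any floating IPs attached to it.
    import json

    def instance_ips(network_info):
        for vif in network_info:
            for subnet in vif["network"]["subnets"]:
                for ip in subnet["ips"]:
                    floats = [f["address"] for f in ip.get("floating_ips", [])]
                    yield vif["address"], ip["address"], floats

    if __name__ == "__main__":
        with open("cache.json", encoding="utf-8") as fh:
            for mac, fixed, floats in instance_ips(json.load(fh)):
                print(mac, fixed, "->", ", ".join(floats) or "-")
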
Feb 1 04:49:53 localhost sshd[304472]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.441 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.443 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.443 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.443 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.475 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:53 localhost nova_compute[274651]: 2026-02-01 09:49:53.476 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:49:53 localhost podman[236886]: time="2026-02-01T09:49:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:49:53 localhost podman[236886]: @ - - [01/Feb/2026:09:49:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:49:53 localhost podman[236886]: @ - - [01/Feb/2026:09:49:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18812 "" "Go-http-client/1.1" Feb 1 04:49:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:49:58 localhost nova_compute[274651]: 2026-02-01 09:49:58.477 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:58 localhost nova_compute[274651]: 2026-02-01 09:49:58.480 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:49:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
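
The podman[236886] records above are the podman system service answering libpod REST calls over its API socket; prometheus-podman-exporter is configured earlier in this log with CONTAINER_HOST=unix:///run/podman/podman.sock, and the GET /v4.9.3/libpod/containers/json line is the query behind its container list. A hedged sketch of issuing the same call over that socket:

    # libpod_list.py (minimal sketch, assuming the podman service socket at
    # /run/podman/podman.sock): reproduce the containers/json query logged
    # above using only the standard library.
    import http.client
    import json
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        def __init__(self, path):
            super().__init__("localhost")  # host header only; unused for AF_UNIX
            self.unix_path = path

        def connect(self):
            self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            self.sock.connect(self.unix_path)

    def list_containers(sock_path="/run/podman/podman.sock"):
        conn = UnixHTTPConnection(sock_path)
        conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
        return json.loads(conn.getresponse().read())

    if __name__ == "__main__":
        for c in list_containers():
            print(c["Names"][0], c["State"])
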
Feb 1 04:49:58 localhost podman[304474]: 2026-02-01 09:49:58.723459518 +0000 UTC m=+0.085942024 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Feb 1 04:49:58 localhost podman[304474]: 2026-02-01 09:49:58.726693398 +0000 UTC m=+0.089175864 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 04:49:58 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:50:00 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:50:00 localhost ceph-mon[286721]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 04:50:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:50:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:50:00 localhost podman[304492]: 2026-02-01 09:50:00.719077 +0000 UTC m=+0.077674069 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:50:00 localhost podman[304491]: 2026-02-01 09:50:00.780914802 +0000 UTC m=+0.139346626 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', 
'--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:50:00 localhost podman[304491]: 2026-02-01 09:50:00.790500186 +0000 UTC m=+0.148931990 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:50:00 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
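
The "overall HEALTH_WARN 1 stray daemon(s) ... 1 stray host(s)" summaries repeated above come from the CEPHADM_STRAY_DAEMON and CEPHADM_STRAY_HOST health checks first raised at 04:49:29. A minimal sketch for pulling their detail, assuming an admin-capable keyring on the node:

    # stray_daemons.py (hypothetical helper): surface the cephadm stray
    # checks behind the HEALTH_WARN lines above.
    import json
    import subprocess

    def health_checks(conf="/etc/ceph/ceph.conf"):
        out = subprocess.run(
            ["ceph", "health", "detail", "--format=json", "--conf", conf],
            capture_output=True, text=True, check=True)
        return json.loads(out.stdout).get("checks", {})

    if __name__ == "__main__":
        for name, check in sorted(health_checks().items()):
            if name.startswith("CEPHADM_STRAY"):
                print(name, "-", check["summary"]["message"])
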
Feb 1 04:50:00 localhost podman[304492]: 2026-02-01 09:50:00.802700491 +0000 UTC m=+0.161297600 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_id=ovn_controller) Feb 1 04:50:00 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:50:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:01 localhost openstack_network_exporter[239441]: ERROR 09:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:50:01 localhost openstack_network_exporter[239441]: Feb 1 04:50:01 localhost openstack_network_exporter[239441]: ERROR 09:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:50:01 localhost openstack_network_exporter[239441]: Feb 1 04:50:03 localhost nova_compute[274651]: 2026-02-01 09:50:03.480 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
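
The mon "_set_new_cache_sizes" records repeat every five seconds with raw byte counts; in round figures that is cache_size ≈ 972.8 MiB (0.95 GiB), inc_alloc = full_alloc = 332 MiB, and kv_alloc = 308 MiB. A trivial conversion helper with the logged values inlined:

    # mon_cache_units.py (illustrative only): the _set_new_cache_sizes
    # values repeated throughout this log are byte counts; print them
    # in MiB and GiB.
    SIZES = {  # byte values copied verbatim from the records above
        "cache_size": 1020054731,
        "inc_alloc": 348127232,
        "full_alloc": 348127232,
        "kv_alloc": 322961408,
    }

    for name, nbytes in SIZES.items():
        print(f"{name:>10}: {nbytes / 2**20:7.1f} MiB ({nbytes / 2**30:.2f} GiB)")
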
Feb 1 04:50:07 localhost podman[304539]: 2026-02-01 09:50:07.713480076 +0000 UTC m=+0.077654318 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, architecture=x86_64, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 1 04:50:07 localhost podman[304539]: 2026-02-01 09:50:07.751240067 +0000 UTC m=+0.115414269 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, distribution-scope=public, vendor=Red Hat, Inc., release=1769056855, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:50:07 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:50:08 localhost nova_compute[274651]: 2026-02-01 09:50:08.483 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:13 localhost nova_compute[274651]: 2026-02-01 09:50:13.485 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:50:14 localhost podman[304559]: 2026-02-01 09:50:14.72249585 +0000 UTC m=+0.085078087 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127) Feb 1 04:50:14 localhost podman[304559]: 2026-02-01 09:50:14.762434908 +0000 UTC m=+0.125017175 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Feb 1 04:50:14 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:50:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:18 localhost nova_compute[274651]: 2026-02-01 09:50:18.488 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:23 localhost nova_compute[274651]: 2026-02-01 09:50:23.491 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
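[Editor's note] Each "Started /usr/bin/podman healthcheck run <id>" record in this stream is a transient systemd unit firing: podman executes the container's configured healthcheck test (the 'test' key in config_data), emits a health_status event, and the unit deactivates once the exec dies. A sketch driving one cycle by hand and reading back the result; the container name is taken from the entries above, and the inspect field is .State.Health on current podman (.State.Healthcheck on some older releases):

import json
import subprocess

NAME = "ceilometer_agent_compute"  # container_name from the entries above

# Same command the transient systemd unit runs.
subprocess.run(["podman", "healthcheck", "run", NAME], check=False)

# Read back the recorded status from the container state.
state = subprocess.run(
    ["podman", "inspect", "--format", "{{json .State.Health}}", NAME],
    capture_output=True, text=True, check=True,
).stdout
print(json.loads(state)["Status"])  # "healthy" mirrors health_status above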
Feb 1 04:50:23 localhost podman[304578]: 2026-02-01 09:50:23.712458356 +0000 UTC m=+0.076759951 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:50:23 localhost podman[304578]: 2026-02-01 09:50:23.74541762 +0000 UTC m=+0.109719215 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:50:23 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
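[Editor's note] podman_exporter reaches the libpod REST API through the socket mounted in its config above (/run/podman/podman.sock); the GET requests it issues appear in the next entries. A sketch replaying the containers/json call over the UNIX socket — the "http://d" host part is a throwaway placeholder curl requires, and the API path mirrors the logged /v4.9.3/libpod/... URL:

import json
import subprocess

SOCK = "/run/podman/podman.sock"
URL = "http://d/v4.9.3/libpod/containers/json?all=true"  # host part ignored

out = subprocess.run(
    ["curl", "-s", "--unix-socket", SOCK, URL],
    capture_output=True, text=True, check=True,
).stdout
for ctr in json.loads(out):
    print(ctr["Names"][0], ctr["State"])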
Feb 1 04:50:23 localhost podman[236886]: time="2026-02-01T09:50:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:50:23 localhost podman[236886]: @ - - [01/Feb/2026:09:50:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:50:24 localhost podman[236886]: @ - - [01/Feb/2026:09:50:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18811 "" "Go-http-client/1.1" Feb 1 04:50:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:28 localhost nova_compute[274651]: 2026-02-01 09:50:28.493 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:50:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:50:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:50:29 localhost podman[304690]: 2026-02-01 09:50:29.564409414 +0000 UTC m=+0.091341469 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3) Feb 1 04:50:29 localhost podman[304690]: 2026-02-01 09:50:29.595813649 +0000 UTC m=+0.122745694 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 1 04:50:29 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:50:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:50:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:50:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:50:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:50:31 localhost openstack_network_exporter[239441]: ERROR 09:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:50:31 localhost openstack_network_exporter[239441]: Feb 1 04:50:31 localhost openstack_network_exporter[239441]: ERROR 09:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:50:31 localhost openstack_network_exporter[239441]: Feb 1 04:50:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:50:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
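[Editor's note] The audited mon_commands in this stream (config-key set, auth get, and the df / osd pool get-quota dispatches from client.openstack further below) are ordinary ceph CLI or librados calls hitting the leader mon; nova's resource audit later in the log runs exactly "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf". A sketch parsing that command's output for per-pool usage, assuming a reachable cluster and a client.openstack keyring:

import json
import subprocess

out = subprocess.run(
    ["ceph", "df", "--format=json", "--id", "openstack",
     "--conf", "/etc/ceph/ceph.conf"],
    capture_output=True, text=True, check=True,
).stdout
report = json.loads(out)
for pool in report["pools"]:
    # e.g. the "volumes" pool queried by the get-quota dispatch below
    print(pool["name"], pool["stats"]["bytes_used"], pool["stats"]["max_avail"])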
Feb 1 04:50:31 localhost podman[304709]: 2026-02-01 09:50:31.727465133 +0000 UTC m=+0.083546819 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:50:31 localhost podman[304708]: 2026-02-01 09:50:31.783515307 +0000 UTC m=+0.142280205 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:50:31 localhost podman[304709]: 2026-02-01 09:50:31.792137462 +0000 UTC m=+0.148219168 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:50:31 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:50:31 localhost podman[304708]: 2026-02-01 09:50:31.843938575 +0000 UTC m=+0.202703473 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:50:31 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:50:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:50:33 localhost nova_compute[274651]: 2026-02-01 09:50:33.496 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:50:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:50:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3877644669' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:50:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:50:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3877644669' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:50:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:38 localhost nova_compute[274651]: 2026-02-01 09:50:38.498 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:50:38 localhost podman[304756]: 2026-02-01 09:50:38.702497394 +0000 UTC m=+0.067357292 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 1 04:50:38 localhost podman[304756]: 2026-02-01 09:50:38.744747123 +0000 UTC m=+0.109607021 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, version=9.7, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z) Feb 1 04:50:38 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:50:38 localhost sshd[304777]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:50:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:50:41.713 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:50:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:50:41.714 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:50:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:50:41.715 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:50:43 localhost nova_compute[274651]: 2026-02-01 09:50:43.500 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:50:43 localhost nova_compute[274651]: 2026-02-01 09:50:43.502 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:43 localhost nova_compute[274651]: 2026-02-01 09:50:43.502 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 
ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:50:43 localhost nova_compute[274651]: 2026-02-01 09:50:43.502 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:50:43 localhost nova_compute[274651]: 2026-02-01 09:50:43.502 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:50:43 localhost nova_compute[274651]: 2026-02-01 09:50:43.504 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:50:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:50:45 localhost podman[304779]: 2026-02-01 09:50:45.707784084 +0000 UTC m=+0.071553201 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:50:45 localhost podman[304779]: 2026-02-01 09:50:45.718979099 +0000 UTC m=+0.082748226 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_managed=true, 
org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:50:45 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:50:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:50:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5840 writes, 25K keys, 5840 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5840 writes, 798 syncs, 7.32 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 34 writes, 123 keys, 34 commit groups, 1.0 writes per commit group, ingest: 0.24 MB, 0.00 MB/s#012Interval WAL: 34 writes, 17 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:50:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:47 localhost nova_compute[274651]: 2026-02-01 09:50:47.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:47 localhost nova_compute[274651]: 2026-02-01 09:50:47.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:50:48 localhost nova_compute[274651]: 2026-02-01 09:50:48.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:48 localhost nova_compute[274651]: 2026-02-01 09:50:48.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:48 localhost nova_compute[274651]: 2026-02-01 09:50:48.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:48 localhost nova_compute[274651]: 2026-02-01 09:50:48.503 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:48 localhost nova_compute[274651]: 2026-02-01 09:50:48.505 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.349 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.350 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.350 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.350 274655 DEBUG nova.objects.instance [None 
req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:50:49 localhost nova_compute[274651]: 2026-02-01 09:50:49.821 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:50:49.822 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:50:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:50:49.824 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:50:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 04:50:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5282 writes, 22K keys, 5282 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5282 writes, 870 syncs, 6.07 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 327 writes, 727 keys, 327 commit groups, 1.0 writes per commit group, ingest: 0.66 MB, 0.00 MB/s#012Interval WAL: 327 writes, 157 syncs, 2.08 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.474 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.490 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.491 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.492 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.492 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.511 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.512 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.512 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.512 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.513 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:50:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:50:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/161490111' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:50:50 localhost nova_compute[274651]: 2026-02-01 09:50:50.988 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.065 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.066 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.283 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.285 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11435MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", 
"vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.285 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.286 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:50:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.429 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.430 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.430 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.501 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.560 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.561 274655 DEBUG 
nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.574 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.597 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 1 04:50:51 localhost nova_compute[274651]: 2026-02-01 09:50:51.651 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:50:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:50:52 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3403118509' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:50:52 localhost nova_compute[274651]: 2026-02-01 09:50:52.102 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:50:52 localhost nova_compute[274651]: 2026-02-01 09:50:52.108 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:50:52 localhost nova_compute[274651]: 2026-02-01 09:50:52.120 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:50:52 localhost nova_compute[274651]: 2026-02-01 09:50:52.123 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:50:52 localhost nova_compute[274651]: 2026-02-01 09:50:52.124 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:50:52 localhost nova_compute[274651]: 2026-02-01 09:50:52.125 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:53 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e58: np0005604215.uhhqtv(active, since 92s), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh Feb 1 04:50:53 localhost nova_compute[274651]: 2026-02-01 09:50:53.506 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:53 localhost nova_compute[274651]: 2026-02-01 09:50:53.508 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:53 localhost podman[236886]: time="2026-02-01T09:50:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:50:53 localhost podman[236886]: @ - - [01/Feb/2026:09:50:53 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:50:53 localhost podman[236886]: @ - - [01/Feb/2026:09:50:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18820 "" "Go-http-client/1.1" Feb 1 04:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:50:54 localhost podman[304842]: 2026-02-01 09:50:54.690952352 +0000 UTC m=+0.054697633 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:50:54 localhost podman[304842]: 2026-02-01 09:50:54.6996769 +0000 UTC m=+0.063422161 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:50:54 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
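The resource-tracker audit earlier in this stream shells out to "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" (the oslo_concurrency.processutils records at 09:50:50.513 and 09:50:51.651) to size the RBD pool backing the hypervisor's disk. A minimal standalone sketch of that probe, not Nova's implementation, assuming the ceph CLI and the client.openstack credentials shown in the log are present on the host; exact JSON field names vary slightly across Ceph releases:

import json
import subprocess

def ceph_df(client="openstack", conf="/etc/ceph/ceph.conf"):
    # Same command the resource tracker runs via processutils.execute();
    # subprocess.run is used here to stay stdlib-only.
    out = subprocess.run(
        ["ceph", "df", "--format=json", "--id", client, "--conf", conf],
        check=True, capture_output=True, text=True,
    ).stdout
    return json.loads(out)

if __name__ == "__main__":
    df = ceph_df()
    stats = df["stats"]  # cluster-wide totals
    print("free: %.1f GiB of %.1f GiB"
          % (stats["total_avail_bytes"] / 2**30,
             stats["total_bytes"] / 2**30))
    for pool in df.get("pools", []):
        # 'bytes_used' on current releases; older ones report 'kb_used'.
        print(pool["name"], pool["stats"].get("bytes_used"))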
Feb 1 04:50:54 localhost nova_compute[274651]: 2026-02-01 09:50:54.911 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:54 localhost nova_compute[274651]: 2026-02-01 09:50:54.912 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:55 localhost nova_compute[274651]: 2026-02-01 09:50:55.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:55 localhost nova_compute[274651]: 2026-02-01 09:50:55.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 1 04:50:55 localhost nova_compute[274651]: 2026-02-01 09:50:55.290 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 1 04:50:55 localhost nova_compute[274651]: 2026-02-01 09:50:55.291 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:50:55 localhost nova_compute[274651]: 2026-02-01 09:50:55.291 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 1 04:50:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e91 do_prune osdmap full prune enabled Feb 1 04:50:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e92 e92: 6 total, 6 up, 6 in Feb 1 04:50:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in Feb 1 04:50:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:50:55.826 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:50:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:50:58 localhost nova_compute[274651]: 2026-02-01 09:50:58.508 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:58 localhost nova_compute[274651]: 2026-02-01 09:50:58.512 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:50:59 localhost systemd[1]: tmp-crun.d4Mlqd.mount: Deactivated successfully. Feb 1 04:50:59 localhost podman[304865]: 2026-02-01 09:50:59.699605568 +0000 UTC m=+0.060689022 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 1 04:50:59 localhost podman[304865]: 2026-02-01 09:50:59.730057857 +0000 UTC m=+0.091141341 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent) Feb 1 04:50:59 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:51:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e92 do_prune osdmap full prune enabled Feb 1 04:51:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 e93: 6 total, 6 up, 6 in Feb 1 04:51:00 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in Feb 1 04:51:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:01 localhost openstack_network_exporter[239441]: ERROR 09:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:51:01 localhost openstack_network_exporter[239441]: Feb 1 04:51:01 localhost openstack_network_exporter[239441]: ERROR 09:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:51:01 localhost openstack_network_exporter[239441]: Feb 1 04:51:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:51:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
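The systemd records above show transient units firing "/usr/bin/podman healthcheck run <container-id>", after which podman emits a health_status=healthy event and an exec_died for the finished exec session, and the transient unit deactivates. A minimal sketch of the same one-shot probe, assuming podman is installed; the container ID is copied from the ovn_metadata_agent record above:

import subprocess

CONTAINER = "728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0"

proc = subprocess.run(
    ["podman", "healthcheck", "run", CONTAINER],
    capture_output=True, text=True,
)
# Exit status 0 means the healthcheck command configured for the
# container (here '/openstack/healthcheck', per its config_data) passed.
print("healthy" if proc.returncode == 0
      else "unhealthy: " + proc.stderr.strip())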
Feb 1 04:51:02 localhost podman[304886]: 2026-02-01 09:51:02.720861977 +0000 UTC m=+0.077616815 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:51:02 localhost podman[304886]: 2026-02-01 09:51:02.732634699 +0000 UTC m=+0.089389537 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:51:02 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
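The node_exporter container probed above publishes metrics on port 9100 (per 'ports': ['9100:9100'] in its config_data), with the systemd collector restricted by the unit-include filter shown in its command line. A minimal scrape sketch, assuming the exporter is reachable on localhost; node_systemd_unit_state is the standard metric name from that collector:

from urllib.request import urlopen

with urlopen("http://localhost:9100/metrics", timeout=5) as resp:
    text = resp.read().decode()

# --collector.systemd with the unit-include filter above means unit
# state metrics exist only for edpm_*, ovs*, openvswitch, virt* and
# rsyslog services.
for line in text.splitlines():
    if line.startswith("node_systemd_unit_state") and 'state="active"' in line:
        print(line)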
Feb 1 04:51:02 localhost podman[304887]: 2026-02-01 09:51:02.784695795 +0000 UTC m=+0.134489558 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:51:02 localhost podman[304887]: 2026-02-01 09:51:02.8195534 +0000 UTC m=+0.169347163 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:51:02 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
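A note for readers of these container events: the config_data=... label embeds each EDPM-managed container's definition as a Python dict literal (single quotes, bare True/False), so it is not valid JSON. Once that substring is cut out of a line, ast.literal_eval parses it safely; the snippet below is abbreviated from the ovn_controller event above:

import ast

config_data = ast.literal_eval(
    "{'depends_on': ['openvswitch.service'], 'net': 'host', "
    "'privileged': True, 'restart': 'always', 'user': 'root', "
    "'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', "
    "'test': '/openstack/healthcheck'}}"
)
print(config_data["healthcheck"]["test"])  # /openstack/healthcheck
print(config_data["depends_on"])           # ['openvswitch.service']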
Feb 1 04:51:03 localhost nova_compute[274651]: 2026-02-01 09:51:03.510 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:03 localhost nova_compute[274651]: 2026-02-01 09:51:03.513 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.530 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.530 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.553 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.553 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ae98524c-8263-40c2-857e-2d0fcc3d4631', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.531337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '857ecbe0-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '4a6b54bd20a5251665bee9860adcfb135caa09c47d78db0f30c4a3ac5cf4f758'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.531337', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '857ed950-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': 'd25a5bded5f46f57898a99340bb6f0a6dcd6516e625cdf54ce91a1ebaccc8ca3'}]}, 'timestamp': '2026-02-01 09:51:03.553826', '_unique_id': '859bc8e5389a4119b52f3f2481987cfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.554 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.555 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7c09a392-b6e3-4e21-b480-826d33defe42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.555689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '85807b52-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.77514763, 'message_signature': '4d619eb0b0b91c9542eb2ce78cebc8c1cd9abe19336e6a6a6a7f12d215b1f3ce'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.555689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '858085e8-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.77514763, 'message_signature': '43df83f3de1e8f8812ef0f86afbbe1366fedc069a7c1d3bbfbfbbe535da2f7f9'}]}, 'timestamp': '2026-02-01 09:51:03.564785', '_unique_id': '2730fac35d2e4b0b9fde35b01251b843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.565 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.566 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.566 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '66e1b000-7e4a-4552-85e8-5779af56c162', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.565978', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8580c0f8-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': 'aa2feac0b89100f0dc2f84bdc866d8d637067c0b546b0736686abc8ba07aa86f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.565978', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8580cca6-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '8fdcfe2bb4eb10139f86e79e90296e21ac67343111fc42e03caccaf208f616d4'}]}, 'timestamp': '2026-02-01 09:51:03.566601', '_unique_id': 'a0263a0e1a444b61b97eb1eea1945a16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.567 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.568 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.568 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eb2f511e-a0bc-43cf-9ec8-e2844fc95bff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.568080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '85811210-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '4cbdb3f9a8fb795fdf0e067ed0f03810610bcf0a4a71c2ebff36d476d7025f2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.568080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '85811d46-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '71ff9a42ed295b7ae337b990655e49ad59ed7749b01e0bb903810261ae1db555'}]}, 'timestamp': '2026-02-01 09:51:03.568662', '_unique_id': '2f1c2e2cc7e24a2e86a2f1a7310f81c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.569 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.570 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 14320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '71f82ebf-cb2d-45df-bff2-677046b86001', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14320000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:51:03.570109', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8583962a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.803909767, 'message_signature': 'fca2f4fbf9240a525421e87bf278f3f8fcfcb7f12950326166d1d688028cba5f'}]}, 'timestamp': '2026-02-01 09:51:03.584874', '_unique_id': '24233be21e944ee78e693c65ac448be9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.586 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.586 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2182f323-b020-43b0-98f9-24481a736123', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.586023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8583cf14-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.77514763, 'message_signature': 'e622cc8dc68b900846a91a40ae753387fa78e9fdb5ed00874489c8b1b5fe4de3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.586023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8583d9dc-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.77514763, 'message_signature': '0e379dc12a6135dfca70cc29573e1e1af9b912e09363fd4977b0627c156157e1'}]}, 'timestamp': '2026-02-01 09:51:03.586598', '_unique_id': '039ebfd4b47a41a196f026a971bc9c36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.590 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '614afb88-26a2-4998-8b16-f30c5a40d554', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.588065', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '8584888c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': '6bb8c300d4dbfcce95129d811d8abdae6e4f24f4d9af3d948a19bb47d36da001'}]}, 'timestamp': '2026-02-01 09:51:03.591225', '_unique_id': '63f902078f764f5ab247af603cff952f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.591 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.592 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.592 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a4ec1c6-1d60-41a2-8294-3a9ed3776b92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.592411', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '8584c86a-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': 'f8f45ea0a5c6b87d2645819f14f66034b39a76d4e788853e2e7b1d1d45ada18d'}]}, 'timestamp': '2026-02-01 09:51:03.592720', '_unique_id': 'bf9a1caf39f342b99cac8fb953d4b4ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.593 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.594 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.594 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.594 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a4a8010-28e3-4abd-a67b-c34047876b36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.594115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '85850b36-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.77514763, 'message_signature': 'dc2111709624e7367bae3530bdc5ef1ff3947dfb610d89e759fcd57902acc780'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.594115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '85851662-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.77514763, 'message_signature': 'e96c7a1654fe3409f9f91e3d391ce242f4d9df5c144eb7055a84139cb2bb07e2'}]}, 'timestamp': '2026-02-01 09:51:03.594701', '_unique_id': '7f6f753aeff445a583132b0ebc4ea37d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.595 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.596 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.596 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.596 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e22a876-265d-47b5-9e89-a9fe18ed3809', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.596253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '85855e4c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '9cba31329a0e17119b24553a1e3193fda3aac6c890d6fc2a4611b5c489e77088'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.596253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8585696e-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '77349d4de3110163fb6926b670d806285a152a325f819beff0767207c55f32fb'}]}, 'timestamp': '2026-02-01 09:51:03.596828', '_unique_id': 'dc792d8458f44f939b60d1b6633cf3ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.597 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.598 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.598 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51bd4237-eb09-49aa-b3f7-a26802b839aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.598257', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '8585acf8-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': '2c489cc871a9460ff7a439ddd95adc3e712ac3594b9ca53a5644cef0fbf5d5a8'}]}, 'timestamp': '2026-02-01 09:51:03.598572', '_unique_id': '8338105bb962405796a7f795447dbefc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.599 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.600 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.600 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'beed5aef-fdea-4dae-9a75-0cb54f487538', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.599976', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8585f0e6-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '3c0bb90f1622e718da92daf1d2bd1dacfba009d97c990ac57d7d20e9d4ff59df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.599976', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8585fb7c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '5c46c397325c19b259c98bdc1208ebdcfc2bca5c6614e9cc952142493b5b4b83'}]}, 'timestamp': '2026-02-01 09:51:03.600564', '_unique_id': '068de2af499040c2a927d0d30c51d5ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01
09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f5b9d7d7-f35d-4bfc-9ffa-4b6acc9b01a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.601962', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '85863e52-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': 'c11a82f00db5fc35094a6c5df0eb1f9a40b34c07576226ca9e195751d05d01a9'}]}, 'timestamp': '2026-02-01 09:51:03.602290', '_unique_id': '41b7c01ebe9248e380a5f4be59a09c38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0f2c8f5f-d1e0-4203-aea1-619ab032d09b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.603679', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '85868088-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': '75ed05ded3a22e887f7bf0ca7e6e2475b6fec1e33a3032f04525f1c5600f944f'}]}, 'timestamp': '2026-02-01 09:51:03.604006', '_unique_id': '81921de3e2114ba99ad0fe225245af89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.605 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ed508d02-01c0-45a5-a9bc-279c52c6b421', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.605389', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '8586c368-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': 'd121000f16217a0451efd8adae9c3009592e0b981986bf7f21585250ddaa9e7b'}]}, 'timestamp': '2026-02-01 09:51:03.605698', '_unique_id': '63f16c89a2aa40449be849386ceabf40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '209bb0d6-2982-4e1f-b2ba-89a61fd4f22f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.607093', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '85870634-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': 'e47380e05198ad51f62e6b9df6b6d30da0182da1a19d5eab521a8fb688360b45'}]}, 'timestamp': '2026-02-01 09:51:03.607408', '_unique_id': 'ff50635f6d27440fa208ddc9d8795d54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.607 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
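The pair of stacks logged for every failed notification above is produced by Python's exception chaining: kombu catches the socket-level ConnectionRefusedError and re-raises it as kombu.exceptions.OperationalError via "raise ... from exc" inside _reraise_as_library_errors, which is why each record carries two tracebacks joined by "The above exception was the direct cause of the following exception". A minimal sketch of that pattern, using stand-in names rather than kombu's actual source:

    # Illustration of the re-raise pattern seen in the frames above;
    # not kombu's real code.
    import socket
    from contextlib import contextmanager

    class OperationalError(Exception):
        """Stand-in for kombu.exceptions.OperationalError."""

    @contextmanager
    def reraise_as_library_errors():
        try:
            yield
        except OSError as exc:  # ConnectionRefusedError subclasses OSError
            raise OperationalError(str(exc)) from exc

    def establish_connection(host="localhost", port=5672, timeout=1.0):
        # 5672 is the conventional AMQP port; the broker address in this
        # log is not shown, so these are placeholders.
        with reraise_as_library_errors():
            sock = socket.create_connection((host, port), timeout=timeout)
            sock.close()

    if __name__ == "__main__":
        try:
            establish_connection()
        except OperationalError as exc:
            print("broker unreachable:", exc)          # [Errno 111] Connection refused
            print("low-level cause:", exc.__cause__)   # the original ConnectionRefusedError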
Payload={'message_id': '0ff75a2e-30d2-4fd3-bff2-34eebb065d32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:51:03.609314', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '85875cc4-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '3db89fa22f66035aff1ec0d9d2465812f58404ff7867e1fd73bc6cc22d3f1234'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:51:03.609314', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '858767f0-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.750786919, 'message_signature': '706ce339c07c6a776ec96c89a6af810c65917645d921d405e7f587f2dfe6b7bc'}]}, 'timestamp': '2026-02-01 09:51:03.609896', '_unique_id': '434ad3f906fd4bdda7d5906abca3c047'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.610 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.611 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
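Several frames in each traceback pass through kombu.utils.functional.retry_over_time, the loop that keeps re-attempting the broker connection before the failure is surfaced to oslo.messaging. The real function also supports error callbacks and a growing retry interval; the following is only a simplified sketch of the idea:

    # Simplified retry-over-time loop, illustrative only.
    import time

    def retry_over_time(fun, catch, max_retries=3, interval=1.0):
        retries = 0
        while True:
            try:
                return fun()
            except catch:
                retries += 1
                if max_retries is not None and retries > max_retries:
                    raise  # give up and let the caller wrap/log the error
                time.sleep(interval)

    # Usage sketch:
    #   retry_over_time(connect_to_broker, ConnectionRefusedError,
    #                   max_retries=3, interval=2.0)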
Payload={'message_id': 'ac3b3b0d-13c2-46c1-83cb-9eb56f538790', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.611336', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '8587abd4-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': '8f561c997b55debffa99bc0ca575d473a29182add4dcb85b2964223ff50b40fc'}]}, 'timestamp': '2026-02-01 09:51:03.611649', '_unique_id': 'e02c2d462c9d4f2b8d126a7473d960a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.613 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.613 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
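Errno 111 means the TCP connection itself is being rejected: nothing is accepting connections on the broker endpoint that the agent's transport_url points at (the actual address is not shown in this log). The endpoint can be probed independently of ceilometer; host and port below are placeholders, 5672 being the conventional AMQP port:

    # Quick standalone probe of an AMQP endpoint (placeholder address).
    import socket

    def probe(host, port, timeout=3.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return "listening"
        except ConnectionRefusedError:
            return "refused (errno 111) - nothing bound on this port"
        except socket.timeout:
            return "timed out - likely filtered by a firewall"

    print(probe("controller.example.com", 5672))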
Payload={'message_id': 'd654ad5b-a9ef-4000-82e0-6a687e6671d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.613196', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '8587f440-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': '623cebf79189b6cc8c6f5142d6c7bf3d1d1bdb4918661c9a982f992ed6a6e092'}]}, 'timestamp': '2026-02-01 09:51:03.613503', '_unique_id': 'b3014e8117b44a4892e38e0860ca5e22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:51:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 ERROR oslo_messaging.notify.messaging Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
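The samples in this capture span all three ceilometer counter types: cumulative totals (network.outgoing.packets.error, disk.device.write.bytes), per-interval deltas (network.outgoing.bytes.delta), and instantaneous gauges (memory.usage, in MB). A delta or rate is derived from two cumulative readings and their monotonic timestamps; the numbers below are invented for illustration and are not taken from this log:

    # Deriving a delta and a rate from two cumulative readings
    # (illustrative values only).
    def delta_and_rate(prev_volume, prev_time, curr_volume, curr_time):
        delta = curr_volume - prev_volume        # bytes since the last poll
        rate = delta / (curr_time - prev_time)   # bytes per second
        return delta, rate

    d, r = delta_and_rate(389120, 11257.75, 389632, 11557.75)
    print(d, r)  # 512 bytes over 300 s -> ~1.7 B/s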
Payload={'message_id': '76ab4bae-e871-4049-811d-6ddb35a56336', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:51:03.614891', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8588373e-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.803909767, 'message_signature': '060f76d6557e9ef88f50ebd06e50a856c438f1024660315eacb126c74f1335d8'}]}, 'timestamp': '2026-02-01 09:51:03.615210', '_unique_id': '66889bf626a74a2c8b843a94a78d896f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:51:03.615 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.615 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.616 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.616 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7568ffc5-0486-4316-ad4f-bc17056a2222', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:51:03.616577', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '85887870-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11557.807482578, 'message_signature': '82c0ef663d688395f12d6a55edbdd4829fa6c06aff5a36ff10c0b007db633035'}]}, 'timestamp': '2026-02-01 09:51:03.616887', '_unique_id': '2d4dde46df204c989eb1eb5b1acdc4bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.617 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:51:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:51:03.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:51:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:08 localhost nova_compute[274651]: 2026-02-01 09:51:08.513 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:08 localhost nova_compute[274651]: 2026-02-01 09:51:08.517 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:51:09 localhost podman[304933]: 2026-02-01 09:51:09.72479233 +0000 UTC m=+0.086152788 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, vcs-type=git, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.) Feb 1 04:51:09 localhost podman[304933]: 2026-02-01 09:51:09.766478106 +0000 UTC m=+0.127838564 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, distribution-scope=public, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:51:09 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:51:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:13 localhost nova_compute[274651]: 2026-02-01 09:51:13.516 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:13 localhost nova_compute[274651]: 2026-02-01 09:51:13.518 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:51:16 localhost systemd[1]: tmp-crun.94Kybj.mount: Deactivated successfully. 
Feb 1 04:51:16 localhost podman[304953]: 2026-02-01 09:51:16.738093882 +0000 UTC m=+0.090992167 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute) Feb 1 04:51:16 localhost podman[304953]: 2026-02-01 09:51:16.748842233 +0000 UTC m=+0.101740578 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:51:16 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:51:18 localhost nova_compute[274651]: 2026-02-01 09:51:18.518 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:18 localhost nova_compute[274651]: 2026-02-01 09:51:18.521 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:22 localhost ovn_controller[152492]: 2026-02-01T09:51:22Z|00070|memory_trim|INFO|Detected inactivity (last active 30020 ms ago): trimming memory Feb 1 04:51:23 localhost nova_compute[274651]: 2026-02-01 09:51:23.520 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:23 localhost nova_compute[274651]: 2026-02-01 09:51:23.524 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:23 localhost podman[236886]: time="2026-02-01T09:51:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:51:23 localhost podman[236886]: @ - - [01/Feb/2026:09:51:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:51:24 localhost podman[236886]: @ - - [01/Feb/2026:09:51:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18813 "" "Go-http-client/1.1" Feb 1 04:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:51:25 localhost systemd[1]: tmp-crun.YS40D9.mount: Deactivated successfully. 
Feb 1 04:51:25 localhost podman[304972]: 2026-02-01 09:51:25.775848454 +0000 UTC m=+0.127530424 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:51:25 localhost podman[304972]: 2026-02-01 09:51:25.813583177 +0000 UTC m=+0.165265127 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:51:25 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:51:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:28 localhost nova_compute[274651]: 2026-02-01 09:51:28.522 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:28 localhost nova_compute[274651]: 2026-02-01 09:51:28.528 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:51:29 localhost podman[305029]: 2026-02-01 09:51:29.888096696 +0000 UTC m=+0.071654901 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent) Feb 1 04:51:29 localhost podman[305029]: 2026-02-01 09:51:29.921435584 +0000 UTC m=+0.104993779 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 1 04:51:29 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:51:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:51:30 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:51:31 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:51:31 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:51:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:51:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:51:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:51:31 localhost openstack_network_exporter[239441]: ERROR 09:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:51:31 localhost openstack_network_exporter[239441]:
Feb 1 04:51:31 localhost openstack_network_exporter[239441]: ERROR 09:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:51:31 localhost openstack_network_exporter[239441]:
Feb 1 04:51:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:51:33 localhost nova_compute[274651]: 2026-02-01 09:51:33.524 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:51:33 localhost nova_compute[274651]: 2026-02-01 09:51:33.531 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:51:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:51:33 localhost podman[305097]: 2026-02-01 09:51:33.718288971 +0000 UTC m=+0.074370005 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:51:33 localhost podman[305098]: 2026-02-01 09:51:33.777896619 +0000 UTC m=+0.131549828 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller) Feb 1 04:51:33 localhost podman[305097]: 2026-02-01 09:51:33.798702741 +0000 UTC m=+0.154783845 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 04:51:33 localhost podman[305098]: 2026-02-01 09:51:33.814172188 +0000 UTC m=+0.167825387 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller) Feb 1 04:51:33 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:51:33 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:51:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:38 localhost nova_compute[274651]: 2026-02-01 09:51:38.526 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:38 localhost nova_compute[274651]: 2026-02-01 09:51:38.535 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:51:40 localhost systemd[1]: tmp-crun.Gy1FrC.mount: Deactivated successfully. Feb 1 04:51:40 localhost podman[305145]: 2026-02-01 09:51:40.740929263 +0000 UTC m=+0.088568013 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, build-date=2026-01-22T05:09:47Z, release=1769056855, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 04:51:40 localhost podman[305145]: 2026-02-01 09:51:40.780191254 +0000 UTC m=+0.127830004 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 1 04:51:40 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 04:51:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:51:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:41.715 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:51:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:41.715 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:51:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:41.716 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:51:43 localhost nova_compute[274651]: 2026-02-01 09:51:43.529 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:51:43 localhost nova_compute[274651]: 2026-02-01 09:51:43.538 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:51:45 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:51:45.522 259320 INFO neutron.agent.linux.ip_lib [None req-fdfb20ca-758c-414a-b40d-3edf4ad3d6ea - - - - - -] Device tapd6b66c0b-0e cannot be used as it has no MAC address#033[00m
Feb 1 04:51:45 localhost nova_compute[274651]: 2026-02-01 09:51:45.575 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:51:45 localhost kernel: device tapd6b66c0b-0e entered promiscuous mode
Feb 1 04:51:45 localhost ovn_controller[152492]: 2026-02-01T09:51:45Z|00071|binding|INFO|Claiming lport d6b66c0b-0edb-42de-acf8-2a23e46df449 for this chassis.
Feb 1 04:51:45 localhost nova_compute[274651]: 2026-02-01 09:51:45.589 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:51:45 localhost NetworkManager[5964]: [1769939505.5904] manager: (tapd6b66c0b-0e): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 1 04:51:45 localhost ovn_controller[152492]: 2026-02-01T09:51:45Z|00072|binding|INFO|d6b66c0b-0edb-42de-acf8-2a23e46df449: Claiming unknown
Feb 1 04:51:45 localhost systemd-udevd[305177]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:51:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:45.601 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-01cb494b-1310-460f-acbe-602aefea39c6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01cb494b-1310-460f-acbe-602aefea39c6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae7d4c2f-1d19-4933-99fa-b8aa62feb38e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d6b66c0b-0edb-42de-acf8-2a23e46df449) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:51:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:45.602 158365 INFO neutron.agent.ovn.metadata.agent [-] Port d6b66c0b-0edb-42de-acf8-2a23e46df449 in datapath 01cb494b-1310-460f-acbe-602aefea39c6 bound to our chassis#033[00m Feb 1 04:51:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:45.603 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5c60457f-ab6d-46c5-9c04-2ef5f0a2722f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:51:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:45.604 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01cb494b-1310-460f-acbe-602aefea39c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:51:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:45.605 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[96dfa590-69ce-4110-bc06-337dd273fa32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:51:45 localhost journal[217584]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, ) Feb 1 04:51:45 localhost journal[217584]: hostname: np0005604212.localdomain Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost ovn_controller[152492]: 2026-02-01T09:51:45Z|00073|binding|INFO|Setting lport d6b66c0b-0edb-42de-acf8-2a23e46df449 ovn-installed in OVS Feb 1 04:51:45 localhost ovn_controller[152492]: 2026-02-01T09:51:45Z|00074|binding|INFO|Setting lport d6b66c0b-0edb-42de-acf8-2a23e46df449 up in Southbound Feb 1 04:51:45 localhost nova_compute[274651]: 2026-02-01 09:51:45.620 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 
04:51:45 localhost nova_compute[274651]: 2026-02-01 09:51:45.623 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost nova_compute[274651]: 2026-02-01 09:51:45.650 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:45 localhost journal[217584]: ethtool ioctl error on tapd6b66c0b-0e: No such device Feb 1 04:51:45 localhost nova_compute[274651]: 2026-02-01 09:51:45.675 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:46 localhost nova_compute[274651]: 2026-02-01 09:51:46.393 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:46 localhost podman[305248]: Feb 1 04:51:46 localhost podman[305248]: 2026-02-01 09:51:46.439846591 +0000 UTC m=+0.101234022 container create a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:51:46 localhost podman[305248]: 2026-02-01 09:51:46.390623884 +0000 UTC m=+0.052011355 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:51:46 localhost systemd[1]: Started libpod-conmon-a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32.scope. Feb 1 04:51:46 localhost systemd[1]: Started libcrun container. 
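[annotation] The burst of "ethtool ioctl error on tapd6b66c0b-0e: No such device" entries is libvirt issuing SIOCETHTOOL against a tap device that is still being plugged; the ioctl fails with ENODEV until the interface settles. A minimal sketch of the same ioctl from Python, assuming Linux and CPython (the 'P' pointer pack format); ETHTOOL_GLINK queries link state and raises the same "No such device" error when the interface is absent.

    import array
    import fcntl
    import socket
    import struct

    SIOCETHTOOL = 0x8946        # from <linux/sockios.h>
    ETHTOOL_GLINK = 0x0000000a  # from <linux/ethtool.h>: get link status

    def link_detected(ifname: str) -> bool:
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            # struct ethtool_value { __u32 cmd; __u32 data; }
            ecmd = array.array('B', struct.pack('II', ETHTOOL_GLINK, 0))
            addr, _ = ecmd.buffer_info()
            # struct ifreq: 16-byte interface name + pointer to ethtool buffer
            ifreq = struct.pack('16sP', ifname.encode(), addr)
            fcntl.ioctl(sock.fileno(), SIOCETHTOOL, ifreq)
            _, data = struct.unpack('II', ecmd.tobytes())
            return bool(data)

    try:
        print(link_detected('tapd6b66c0b-0e'))
    except OSError as err:
        print(err)  # e.g. [Errno 19] No such device, as journald records above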
Feb 1 04:51:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f3642e65e1a67f740d3f6658334c166a76187026a6295041dcedec21bc8a1d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:51:46 localhost podman[305248]: 2026-02-01 09:51:46.526747961 +0000 UTC m=+0.188135392 container init a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:51:46 localhost podman[305248]: 2026-02-01 09:51:46.535204182 +0000 UTC m=+0.196591583 container start a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:51:46 localhost dnsmasq[305266]: started, version 2.85 cachesize 150 Feb 1 04:51:46 localhost dnsmasq[305266]: DNS service limited to local subnets Feb 1 04:51:46 localhost dnsmasq[305266]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:51:46 localhost dnsmasq[305266]: warning: no upstream servers configured Feb 1 04:51:46 localhost dnsmasq-dhcp[305266]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:51:46 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 0 addresses Feb 1 04:51:46 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host Feb 1 04:51:46 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts Feb 1 04:51:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:51:46.697 259320 INFO neutron.agent.dhcp.agent [None req-f13a9c22-c6ca-47a9-a512-576525cdf6e8 - - - - - -] DHCP configuration for ports {'6efa26b8-94b4-4ffe-b212-c7bedef06410'} is completed#033[00m Feb 1 04:51:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
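[annotation] The dnsmasq startup above reads three per-network files (addn_hosts, host, opts) under /var/lib/neutron/dhcp/<network-id>/. When allocations change, the agent rewrites those files and signals the running dnsmasq so it re-reads them; in this containerized deployment that signal is delivered through podman, which is why each "container kill" entry below is followed by fresh "read .../addn_hosts - N addresses" lines. A minimal sketch of the underlying mechanism, assuming direct filesystem and signal access rather than Neutron's podman wrapper; the entry format and PID lookup are illustrative.

    import os
    import pathlib
    import signal

    NET = '01cb494b-1310-460f-acbe-602aefea39c6'
    CONF_DIR = pathlib.Path(f'/var/lib/neutron/dhcp/{NET}')

    def reload_allocations(entries, dnsmasq_pid: int) -> None:
        # addn_hosts: one "<ip> <fqdn> <hostname>" line per allocation
        lines = [f'{ip} {name}.openstacklocal {name}' for ip, name in entries]
        (CONF_DIR / 'addn_hosts').write_text('\n'.join(lines) + '\n')
        # SIGHUP makes dnsmasq re-read its hosts/opts files, producing the
        # "read .../addn_hosts - N addresses" journal lines seen here.
        os.kill(dnsmasq_pid, signal.SIGHUP)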
Feb 1 04:51:47 localhost podman[305267]: 2026-02-01 09:51:47.725178839 +0000 UTC m=+0.086477858 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 1 04:51:47 localhost podman[305267]: 2026-02-01 09:51:47.734943129 +0000 UTC m=+0.096242148 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 1 04:51:47 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:51:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:51:48.251 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:51:47Z, description=, device_id=51127ab4-235a-42b3-a3d8-db85ef96f91a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0ffb335b-5051-4e78-879e-d26b43d36de2, ip_allocation=immediate, mac_address=fa:16:3e:53:7f:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:51:41Z, description=, dns_domain=, id=01cb494b-1310-460f-acbe-602aefea39c6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1791362587-network, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12790, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=107, status=ACTIVE, subnets=['7fc35e97-13d9-40d5-bf5a-be1aeaf1c762'], tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:51:43Z, vlan_transparent=None, network_id=01cb494b-1310-460f-acbe-602aefea39c6, port_security_enabled=False, project_id=ebe5e345d591408fa955b2e811bfaffb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=149, status=DOWN, tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:51:47Z on network 01cb494b-1310-460f-acbe-602aefea39c6#033[00m Feb 1 04:51:48 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 1 addresses Feb 1 04:51:48 localhost podman[305304]: 2026-02-01 09:51:48.48242693 +0000 UTC m=+0.061561450 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:51:48 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host Feb 1 04:51:48 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts Feb 1 04:51:48 localhost nova_compute[274651]: 2026-02-01 09:51:48.530 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:48 localhost nova_compute[274651]: 2026-02-01 09:51:48.542 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:51:48.662 259320 INFO neutron.agent.dhcp.agent [None req-347c3f5c-8058-4892-a84b-3fa9b7faf233 - - - - - -] DHCP configuration for ports {'0ffb335b-5051-4e78-879e-d26b43d36de2'} is completed#033[00m Feb 1 04:51:49 localhost nova_compute[274651]: 2026-02-01 09:51:49.309 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:49 localhost nova_compute[274651]: 2026-02-01 09:51:49.309 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:49 localhost nova_compute[274651]: 2026-02-01 09:51:49.310 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:49 localhost nova_compute[274651]: 2026-02-01 09:51:49.310 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:51:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:49.868 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:51:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:49.870 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:51:49 localhost nova_compute[274651]: 2026-02-01 09:51:49.902 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:51:49.936 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:51:47Z, description=, device_id=51127ab4-235a-42b3-a3d8-db85ef96f91a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=0ffb335b-5051-4e78-879e-d26b43d36de2, ip_allocation=immediate, mac_address=fa:16:3e:53:7f:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:51:41Z, description=, dns_domain=, id=01cb494b-1310-460f-acbe-602aefea39c6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1791362587-network, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12790, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=107, status=ACTIVE, subnets=['7fc35e97-13d9-40d5-bf5a-be1aeaf1c762'], tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:51:43Z, vlan_transparent=None, network_id=01cb494b-1310-460f-acbe-602aefea39c6, port_security_enabled=False, project_id=ebe5e345d591408fa955b2e811bfaffb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=149, status=DOWN, tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:51:47Z on network 01cb494b-1310-460f-acbe-602aefea39c6#033[00m Feb 1 04:51:50 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 1 addresses Feb 1 04:51:50 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host Feb 1 04:51:50 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts Feb 1 04:51:50 localhost podman[305342]: 2026-02-01 09:51:50.165589595 +0000 UTC m=+0.055310457 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:51:50 localhost nova_compute[274651]: 2026-02-01 09:51:50.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:50 localhost nova_compute[274651]: 2026-02-01 09:51:50.267 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:51:50.447 259320 INFO neutron.agent.dhcp.agent [None req-9b1bb18f-0440-4abe-b08c-24fb0b020fd5 - - - - - -] DHCP configuration for ports {'0ffb335b-5051-4e78-879e-d26b43d36de2'} is completed#033[00m Feb 1 04:51:51 localhost nova_compute[274651]: 2026-02-01 09:51:51.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m 
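[annotation] The repeated "Running periodic task ComputeManager._*" entries are oslo.service's periodic task runner walking the manager's decorated methods and logging each before it fires; tasks then guard themselves, as in the "CONF.reclaim_instance_interval <= 0, skipping..." line above. A minimal sketch of that decorator pattern, assuming oslo.service and oslo.config; the manager and task names are hypothetical, not Nova's.

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=10, run_immediately=True)
        def _poll_something(self, context):
            # a real task would check its own config gate and maybe skip,
            # exactly as ComputeManager._reclaim_queued_deletes does above
            print('polling')

    mgr = Manager(cfg.CONF)
    # The service loop calls this repeatedly; each due task is logged as
    # "Running periodic task Manager._poll_something" and then invoked.
    mgr.run_periodic_tasks(context=None)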
Feb 1 04:51:51 localhost nova_compute[274651]: 2026-02-01 09:51:51.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:51:51 localhost nova_compute[274651]: 2026-02-01 09:51:51.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:51:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.235 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.236 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.236 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.236 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.778 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.804 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.804 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.805 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.806 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.823 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.824 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.824 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.825 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:51:52 localhost nova_compute[274651]: 2026-02-01 09:51:52.825 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:51:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:51:53 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2242716200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.280 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.378 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.379 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.546 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.549 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.629 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.630 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11412MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.630 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.631 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.697 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.697 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.697 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:51:53 localhost nova_compute[274651]: 2026-02-01 09:51:53.756 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:51:53 localhost podman[236886]: time="2026-02-01T09:51:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:51:53 localhost podman[236886]: @ - - [01/Feb/2026:09:51:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158361 "" "Go-http-client/1.1" Feb 1 04:51:54 localhost podman[236886]: @ - - [01/Feb/2026:09:51:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19297 "" "Go-http-client/1.1" Feb 1 04:51:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:51:54 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2886766219' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:51:54 localhost nova_compute[274651]: 2026-02-01 09:51:54.225 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:51:54 localhost nova_compute[274651]: 2026-02-01 09:51:54.230 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:51:54 localhost nova_compute[274651]: 2026-02-01 09:51:54.255 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:51:54 localhost nova_compute[274651]: 2026-02-01 09:51:54.258 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:51:54 localhost nova_compute[274651]: 2026-02-01 09:51:54.258 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:51:55 localhost nova_compute[274651]: 2026-02-01 09:51:55.723 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:55 localhost nova_compute[274651]: 2026-02-01 09:51:55.724 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:51:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:51:55.871 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:51:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 
322961408 Feb 1 04:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:51:56 localhost systemd[1]: tmp-crun.bFmJgs.mount: Deactivated successfully. Feb 1 04:51:56 localhost podman[305406]: 2026-02-01 09:51:56.736465546 +0000 UTC m=+0.093397672 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:51:56 localhost podman[305406]: 2026-02-01 09:51:56.746384601 +0000 UTC m=+0.103316657 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:51:56 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:51:57 localhost nova_compute[274651]: 2026-02-01 09:51:57.184 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:51:58 localhost nova_compute[274651]: 2026-02-01 09:51:58.586 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:52:00 localhost nova_compute[274651]: 2026-02-01 09:52:00.100 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:52:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
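[annotation] Twice in the resource audit above, nova shells out to "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" via oslo.concurrency's processutils and folds the result into the DISK_GB figures it reports to placement (phys_disk=41GB, free_disk≈41.84GB). A minimal sketch of that round trip, assuming the cluster-level stats keys ceph df normally emits (total_bytes, total_avail_bytes); error handling and pool-level accounting are omitted.

    import json
    from oslo_concurrency import processutils

    def ceph_disk_gb():
        # processutils logs the "Running cmd (subprocess)" / "returned: 0"
        # pairs visible in the nova_compute entries above.
        out, _err = processutils.execute(
            'ceph', 'df', '--format=json',
            '--id', 'openstack', '--conf', '/etc/ceph/ceph.conf')
        stats = json.loads(out)['stats']
        gib = 1024 ** 3
        return stats['total_bytes'] / gib, stats['total_avail_bytes'] / gib

    total_gb, free_gb = ceph_disk_gb()  # e.g. ~41 and ~41.8 as logged above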
Feb 1 04:52:00 localhost podman[305430]: 2026-02-01 09:52:00.736722985 +0000 UTC m=+0.088474980 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 1 04:52:00 localhost podman[305430]: 2026-02-01 09:52:00.769442634 +0000 UTC m=+0.121194679 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Feb 1 04:52:00 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:52:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:52:01 localhost openstack_network_exporter[239441]: ERROR 09:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:52:01 localhost openstack_network_exporter[239441]: Feb 1 04:52:01 localhost openstack_network_exporter[239441]: ERROR 09:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:52:01 localhost openstack_network_exporter[239441]: Feb 1 04:52:02 localhost neutron_sriov_agent[252126]: 2026-02-01 09:52:02.736 2 INFO neutron.agent.securitygroups_rpc [None req-3ef8af06-8ebd-433a-810c-d499d03d752f 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']#033[00m Feb 1 04:52:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:02.763 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:02Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa, ip_allocation=immediate, mac_address=fa:16:3e:6e:4d:83, name=tempest-parent-377096059, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:51:41Z, description=, dns_domain=, id=01cb494b-1310-460f-acbe-602aefea39c6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-1791362587-network, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12790, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=107, status=ACTIVE, subnets=['7fc35e97-13d9-40d5-bf5a-be1aeaf1c762'], tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:51:43Z, vlan_transparent=None, network_id=01cb494b-1310-460f-acbe-602aefea39c6, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f05aaf36-904c-44ae-a203-34e61744db7d'], standard_attr_id=270, status=DOWN, tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:52:02Z on network 01cb494b-1310-460f-acbe-602aefea39c6#033[00m Feb 1 04:52:02 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 2 addresses Feb 1 04:52:02 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host Feb 1 04:52:02 localhost podman[305466]: 2026-02-01 
09:52:02.978242799 +0000 UTC m=+0.062156948 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:52:02 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts Feb 1 04:52:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:03.170 259320 INFO neutron.agent.dhcp.agent [None req-b3089da8-5604-45ab-8570-0dc274f36d58 - - - - - -] DHCP configuration for ports {'96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa'} is completed#033[00m Feb 1 04:52:03 localhost nova_compute[274651]: 2026-02-01 09:52:03.589 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:52:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:52:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:52:04 localhost podman[305488]: 2026-02-01 09:52:04.704638116 +0000 UTC m=+0.066457851 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 1 04:52:04 localhost podman[305487]: 2026-02-01 09:52:04.792734052 +0000 UTC m=+0.157502218 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 04:52:04 localhost podman[305487]: 2026-02-01 09:52:04.803297778 +0000 UTC m=+0.168065964 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:52:04 localhost podman[305488]: 2026-02-01 09:52:04.814587596 +0000 UTC m=+0.176407351 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 1 04:52:04 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:52:04 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:52:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:52:04.972 2 INFO neutron.agent.securitygroups_rpc [None req-48e73cc9-05d5-4151-9bdf-df0a5d68c81e 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']
Feb 1 04:52:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.594 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.596 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.596 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.596 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.637 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.638 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:52:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:08.777 259320 INFO neutron.agent.linux.ip_lib [None req-2ce4b783-1f74-4d0b-b497-03e121f90983 - - - - - -] Device tapda717574-ce cannot be used as it has no MAC address
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.796 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:08 localhost kernel: device tapda717574-ce entered promiscuous mode
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.804 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:08 localhost NetworkManager[5964]: [1769939528.8054] manager: (tapda717574-ce): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Feb 1 04:52:08 localhost ovn_controller[152492]: 2026-02-01T09:52:08Z|00075|binding|INFO|Claiming lport da717574-ce5f-44e9-87aa-0cb3372649b5 for this chassis.
Feb 1 04:52:08 localhost ovn_controller[152492]: 2026-02-01T09:52:08Z|00076|binding|INFO|da717574-ce5f-44e9-87aa-0cb3372649b5: Claiming unknown
Feb 1 04:52:08 localhost systemd-udevd[305543]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:52:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:08.821 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ae9a15b0-c845-4791-8b57-ba60c2ee9be3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae9a15b0-c845-4791-8b57-ba60c2ee9be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '60219414a62b42ff9bcc97ac9a1e7265', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59a5d1ff-77ab-4776-a032-56ca10a3d066, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da717574-ce5f-44e9-87aa-0cb3372649b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:52:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:08.823 158365 INFO neutron.agent.ovn.metadata.agent [-] Port da717574-ce5f-44e9-87aa-0cb3372649b5 in datapath ae9a15b0-c845-4791-8b57-ba60c2ee9be3 bound to our chassis
Feb 1 04:52:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:08.827 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3df8dc6f-fbc1-48b3-95f1-bccaac2d0b0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:52:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:08.829 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:52:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:08.830 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[063602a5-8d9f-4400-b8f7-87b670aa3c64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost ovn_controller[152492]: 2026-02-01T09:52:08Z|00077|binding|INFO|Setting lport da717574-ce5f-44e9-87aa-0cb3372649b5 ovn-installed in OVS
Feb 1 04:52:08 localhost ovn_controller[152492]: 2026-02-01T09:52:08Z|00078|binding|INFO|Setting lport da717574-ce5f-44e9-87aa-0cb3372649b5 up in Southbound
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.837 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost journal[217584]: ethtool ioctl error on tapda717574-ce: No such device
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.869 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:08 localhost nova_compute[274651]: 2026-02-01 09:52:08.883 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:09 localhost podman[305615]:
Feb 1 04:52:09 localhost podman[305615]: 2026-02-01 09:52:09.675453604 +0000 UTC m=+0.098652363 container create f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:52:09 localhost systemd[1]: Started libpod-conmon-f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e.scope.
Feb 1 04:52:09 localhost podman[305615]: 2026-02-01 09:52:09.627468754 +0000 UTC m=+0.050667563 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:52:09 localhost systemd[1]: Started libcrun container.
Feb 1 04:52:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bec80e06f4a638726946aae44bead5594bbc39dab22a379d5f00c3c9d7aec7b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:52:09 localhost podman[305615]: 2026-02-01 09:52:09.744899976 +0000 UTC m=+0.168098735 container init f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:52:09 localhost podman[305615]: 2026-02-01 09:52:09.755456671 +0000 UTC m=+0.178655430 container start f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:52:09 localhost dnsmasq[305633]: started, version 2.85 cachesize 150
Feb 1 04:52:09 localhost dnsmasq[305633]: DNS service limited to local subnets
Feb 1 04:52:09 localhost dnsmasq[305633]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:52:09 localhost dnsmasq[305633]: warning: no upstream servers configured
Feb 1 04:52:09 localhost dnsmasq-dhcp[305633]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:52:09 localhost dnsmasq[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/addn_hosts - 0 addresses
Feb 1 04:52:09 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/host
Feb 1 04:52:09 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/opts
Feb 1 04:52:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:09.957 259320 INFO neutron.agent.dhcp.agent [None req-b1dd338c-0891-460f-9a0c-c9b177008188 - - - - - -] DHCP configuration for ports {'42aa6619-f5c7-4196-af64-00524778463d'} is completed
Feb 1 04:52:10 localhost nova_compute[274651]: 2026-02-01 09:52:10.264 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:10 localhost sshd[305634]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:52:10 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:10.731 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:10Z, description=, device_id=f7c12b4a-8bcd-4ee7-8448-2839852e8cba, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=429c7a26-b72e-4182-a47f-3975176902ed, ip_allocation=immediate, mac_address=fa:16:3e:a6:4f:16, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:06Z, description=, dns_domain=, id=ae9a15b0-c845-4791-8b57-ba60c2ee9be3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-661606518-network, port_security_enabled=True, project_id=60219414a62b42ff9bcc97ac9a1e7265, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3974, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=289, status=ACTIVE, subnets=['46e66c9d-66ac-4691-91f0-fbb8a2928ba4'], tags=[], tenant_id=60219414a62b42ff9bcc97ac9a1e7265, updated_at=2026-02-01T09:52:07Z, vlan_transparent=None, network_id=ae9a15b0-c845-4791-8b57-ba60c2ee9be3, port_security_enabled=False, project_id=60219414a62b42ff9bcc97ac9a1e7265, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=332, status=DOWN, tags=[], tenant_id=60219414a62b42ff9bcc97ac9a1e7265, updated_at=2026-02-01T09:52:10Z on network ae9a15b0-c845-4791-8b57-ba60c2ee9be3
Feb 1 04:52:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:52:10 localhost podman[305662]: 2026-02-01 09:52:10.966270679 +0000 UTC m=+0.073749414 container kill f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:52:10 localhost dnsmasq[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/addn_hosts - 1 addresses
Feb 1 04:52:10 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/host
Feb 1 04:52:10 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/opts
Feb 1 04:52:11 localhost podman[305649]: 2026-02-01 09:52:10.94940728 +0000 UTC m=+0.100144680 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, release=1769056855, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Feb 1 04:52:11 localhost podman[305649]: 2026-02-01 09:52:11.035368571 +0000 UTC m=+0.186105961 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, release=1769056855, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 1 04:52:11 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:52:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:11.123 259320 INFO neutron.agent.dhcp.agent [None req-bc6f3f90-9e9f-43cc-b1d4-3d1a666041fc - - - - - -] DHCP configuration for ports {'429c7a26-b72e-4182-a47f-3975176902ed'} is completed
Feb 1 04:52:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:11.517 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:10Z, description=, device_id=f7c12b4a-8bcd-4ee7-8448-2839852e8cba, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=429c7a26-b72e-4182-a47f-3975176902ed, ip_allocation=immediate, mac_address=fa:16:3e:a6:4f:16, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:06Z, description=, dns_domain=, id=ae9a15b0-c845-4791-8b57-ba60c2ee9be3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-661606518-network, port_security_enabled=True, project_id=60219414a62b42ff9bcc97ac9a1e7265, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3974, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=289, status=ACTIVE, subnets=['46e66c9d-66ac-4691-91f0-fbb8a2928ba4'], tags=[], tenant_id=60219414a62b42ff9bcc97ac9a1e7265, updated_at=2026-02-01T09:52:07Z, vlan_transparent=None, network_id=ae9a15b0-c845-4791-8b57-ba60c2ee9be3, port_security_enabled=False, project_id=60219414a62b42ff9bcc97ac9a1e7265, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=332, status=DOWN, tags=[], tenant_id=60219414a62b42ff9bcc97ac9a1e7265, updated_at=2026-02-01T09:52:10Z on network ae9a15b0-c845-4791-8b57-ba60c2ee9be3
Feb 1 04:52:11 localhost dnsmasq[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/addn_hosts - 1 addresses
Feb 1 04:52:11 localhost podman[305706]: 2026-02-01 09:52:11.734650604 +0000 UTC m=+0.049077604 container kill f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:52:11 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/host
Feb 1 04:52:11 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/opts
Feb 1 04:52:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:11.884 259320 INFO neutron.agent.dhcp.agent [None req-964c1ddb-26e6-4bcd-b78e-540e18aca9b5 - - - - - -] DHCP configuration for ports {'429c7a26-b72e-4182-a47f-3975176902ed'} is completed
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.433649) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533433748, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 2267, "num_deletes": 252, "total_data_size": 4416641, "memory_usage": 4685248, "flush_reason": "Manual Compaction"}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533454832, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 4263172, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23975, "largest_seqno": 26241, "table_properties": {"data_size": 4253747, "index_size": 5866, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 21052, "raw_average_key_size": 21, "raw_value_size": 4234297, "raw_average_value_size": 4307, "num_data_blocks": 251, "num_entries": 983, "num_filter_entries": 983, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939360, "oldest_key_time": 1769939360, "file_creation_time": 1769939533, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 21239 microseconds, and 9870 cpu microseconds.
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.454897) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 4263172 bytes OK
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.454923) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.456792) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.456813) EVENT_LOG_v1 {"time_micros": 1769939533456807, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.456835) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 4407000, prev total WAL file size 4407000, number of live WAL files 2.
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.457788) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(4163KB)], [42(19MB)]
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533457829, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 24787210, "oldest_snapshot_seqno": -1}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12257 keys, 21729282 bytes, temperature: kUnknown
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533548087, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 21729282, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21659171, "index_size": 38394, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 329825, "raw_average_key_size": 26, "raw_value_size": 21450038, "raw_average_value_size": 1750, "num_data_blocks": 1453, "num_entries": 12257, "num_filter_entries": 12257, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939533, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.548638) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 21729282 bytes
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.550311) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 273.8 rd, 240.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 19.6 +0.0 blob) out(20.7 +0.0 blob), read-write-amplify(10.9) write-amplify(5.1) OK, records in: 12791, records dropped: 534 output_compression: NoCompression
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.550343) EVENT_LOG_v1 {"time_micros": 1769939533550328, "job": 24, "event": "compaction_finished", "compaction_time_micros": 90515, "compaction_time_cpu_micros": 52523, "output_level": 6, "num_output_files": 1, "total_output_size": 21729282, "num_input_records": 12791, "num_output_records": 12257, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533551349, "job": 24, "event": "table_file_deletion", "file_number": 44}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533554850, "job": 24, "event": "table_file_deletion", "file_number": 42}
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.457700) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.554958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.554965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.554967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.554969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:52:13 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:52:13.554970) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:52:13 localhost nova_compute[274651]: 2026-02-01 09:52:13.640 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:13 localhost nova_compute[274651]: 2026-02-01 09:52:13.643 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:52:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:16.414 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005604213.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:02Z, description=, device_id=5aefea54-941a-48bf-ad9e-7f13fdfdb4ed, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=tempest-liveautoblockmigrationv225test-server-328365138, extra_dhcp_opts=[], fixed_ips=[], id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa, ip_allocation=immediate, mac_address=fa:16:3e:6e:4d:83, name=tempest-parent-377096059, network_id=01cb494b-1310-460f-acbe-602aefea39c6, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['f05aaf36-904c-44ae-a203-34e61744db7d'], standard_attr_id=270, status=DOWN, tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, trunk_details=sub_ports=[], trunk_id=f67e69ab-d9db-4114-bb34-820d1f8fcee5, updated_at=2026-02-01T09:52:15Z on network 01cb494b-1310-460f-acbe-602aefea39c6
Feb 1 04:52:16 localhost systemd[1]: tmp-crun.pg5J7z.mount: Deactivated successfully.
Feb 1 04:52:16 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 2 addresses
Feb 1 04:52:16 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host
Feb 1 04:52:16 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts
Feb 1 04:52:16 localhost podman[305745]: 2026-02-01 09:52:16.70249804 +0000 UTC m=+0.057710411 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:52:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:16.830 259320 INFO neutron.agent.dhcp.agent [None req-f3522a1b-116e-4744-974a-c4b254d99009 - - - - - -] DHCP configuration for ports {'96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa'} is completed
Feb 1 04:52:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:52:18 localhost nova_compute[274651]: 2026-02-01 09:52:18.642 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:52:18 localhost nova_compute[274651]: 2026-02-01 09:52:18.668 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:52:18 localhost nova_compute[274651]: 2026-02-01 09:52:18.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:52:18 localhost nova_compute[274651]: 2026-02-01 09:52:18.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:52:18 localhost nova_compute[274651]: 2026-02-01 09:52:18.670 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:18 localhost nova_compute[274651]: 2026-02-01 09:52:18.671 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:52:18 localhost podman[305768]: 2026-02-01 09:52:18.752256149 +0000 UTC m=+0.083137744 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 1 04:52:18 localhost podman[305768]: 2026-02-01 09:52:18.766366775 +0000 UTC m=+0.097248390 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:52:18 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:52:19 localhost dnsmasq[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/addn_hosts - 0 addresses
Feb 1 04:52:19 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/host
Feb 1 04:52:19 localhost dnsmasq-dhcp[305633]: read /var/lib/neutron/dhcp/ae9a15b0-c845-4791-8b57-ba60c2ee9be3/opts
Feb 1 04:52:19 localhost podman[305804]: 2026-02-01 09:52:19.574836206 +0000 UTC m=+0.067441311 container kill f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:52:19 localhost kernel: device tapda717574-ce left promiscuous mode
Feb 1 04:52:19 localhost nova_compute[274651]: 2026-02-01 09:52:19.745 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:19 localhost ovn_controller[152492]: 2026-02-01T09:52:19Z|00079|binding|INFO|Releasing lport da717574-ce5f-44e9-87aa-0cb3372649b5 from this chassis (sb_readonly=0)
Feb 1 04:52:19 localhost ovn_controller[152492]: 2026-02-01T09:52:19Z|00080|binding|INFO|Setting lport da717574-ce5f-44e9-87aa-0cb3372649b5 down in Southbound
Feb 1 04:52:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:19.769 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ae9a15b0-c845-4791-8b57-ba60c2ee9be3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae9a15b0-c845-4791-8b57-ba60c2ee9be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '60219414a62b42ff9bcc97ac9a1e7265', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59a5d1ff-77ab-4776-a032-56ca10a3d066, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=da717574-ce5f-44e9-87aa-0cb3372649b5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:52:19 localhost nova_compute[274651]: 2026-02-01 09:52:19.769 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:19.772 158365 INFO neutron.agent.ovn.metadata.agent [-] Port da717574-ce5f-44e9-87aa-0cb3372649b5 in datapath ae9a15b0-c845-4791-8b57-ba60c2ee9be3 unbound from our chassis
Feb 1 04:52:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:19.775 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:52:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:19.777 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0bd325-2c09-45a8-ab4a-08d911b25c4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:52:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:21 localhost ovn_controller[152492]: 2026-02-01T09:52:21Z|00081|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:52:21 localhost nova_compute[274651]: 2026-02-01 09:52:21.917 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:22 localhost dnsmasq[305633]: exiting on receipt of SIGTERM
Feb 1 04:52:22 localhost podman[305843]: 2026-02-01 09:52:22.347507189 +0000 UTC m=+0.062461027 container kill f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:52:22 localhost systemd[1]: libpod-f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e.scope: Deactivated successfully.
Feb 1 04:52:22 localhost podman[305858]: 2026-02-01 09:52:22.424147522 +0000 UTC m=+0.059924188 container died f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:52:22 localhost podman[305858]: 2026-02-01 09:52:22.457192841 +0000 UTC m=+0.092969467 container cleanup f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:52:22 localhost systemd[1]: libpod-conmon-f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e.scope: Deactivated successfully.
Feb 1 04:52:22 localhost podman[305859]: 2026-02-01 09:52:22.506600415 +0000 UTC m=+0.136289314 container remove f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae9a15b0-c845-4791-8b57-ba60c2ee9be3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:52:22 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:22.541 259320 INFO neutron.agent.dhcp.agent [None req-df99dda0-9c1f-48e5-bfd5-7d3c963d881a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:22 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:22.596 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:23 localhost systemd[1]: var-lib-containers-storage-overlay-bec80e06f4a638726946aae44bead5594bbc39dab22a379d5f00c3c9d7aec7b5-merged.mount: Deactivated successfully.
Feb 1 04:52:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f76adbfd056f50f53aaf737c056004b88f1da0091f10f1b3b21c426075da954e-userdata-shm.mount: Deactivated successfully.
Feb 1 04:52:23 localhost systemd[1]: run-netns-qdhcp\x2dae9a15b0\x2dc845\x2d4791\x2d8b57\x2dba60c2ee9be3.mount: Deactivated successfully.
Feb 1 04:52:23 localhost nova_compute[274651]: 2026-02-01 09:52:23.671 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:23 localhost nova_compute[274651]: 2026-02-01 09:52:23.673 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:23 localhost podman[236886]: time="2026-02-01T09:52:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:52:23 localhost podman[236886]: @ - - [01/Feb/2026:09:52:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158361 "" "Go-http-client/1.1"
Feb 1 04:52:24 localhost podman[236886]: @ - - [01/Feb/2026:09:52:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19306 "" "Go-http-client/1.1"
Feb 1 04:52:25 localhost nova_compute[274651]: 2026-02-01 09:52:25.578 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:26 localhost nova_compute[274651]: 2026-02-01 09:52:26.388 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:52:27 localhost podman[305889]: 2026-02-01 09:52:27.720064477 +0000 UTC m=+0.081520525 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:52:27 localhost podman[305889]: 2026-02-01 09:52:27.733299354 +0000 UTC m=+0.094755332 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:52:27 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:52:28 localhost nova_compute[274651]: 2026-02-01 09:52:28.727 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:30 localhost nova_compute[274651]: 2026-02-01 09:52:30.007 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:30 localhost nova_compute[274651]: 2026-02-01 09:52:30.583 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:52:31 localhost podman[305930]: 2026-02-01 09:52:31.029676507 +0000 UTC m=+0.086415456 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:52:31 localhost podman[305930]: 2026-02-01 09:52:31.066786331 +0000 UTC m=+0.123525340 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 1 04:52:31 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:52:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:31 localhost openstack_network_exporter[239441]: ERROR 09:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:52:31 localhost openstack_network_exporter[239441]:
Feb 1 04:52:31 localhost openstack_network_exporter[239441]: ERROR 09:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:52:31 localhost openstack_network_exporter[239441]:
Feb 1 04:52:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:52:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:52:32 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:52:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.732 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e93 do_prune osdmap full prune enabled
Feb 1 04:52:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e94 e94: 6 total, 6 up, 6 in
Feb 1 04:52:33 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in
Feb 1 04:52:33 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:33.852 259320 INFO neutron.agent.linux.ip_lib [None req-f8c94d92-64a3-46f3-9017-96b8576f7396 - - - - - -] Device tap377a7a10-80 cannot be used as it has no MAC address#033[00m
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.877 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost kernel: device tap377a7a10-80 entered promiscuous mode
Feb 1 04:52:33 localhost NetworkManager[5964]: [1769939553.8936] manager: (tap377a7a10-80): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.892 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost ovn_controller[152492]: 2026-02-01T09:52:33Z|00082|binding|INFO|Claiming lport 377a7a10-8059-48da-a5cf-af24631e9999 for this chassis.
Feb 1 04:52:33 localhost ovn_controller[152492]: 2026-02-01T09:52:33Z|00083|binding|INFO|377a7a10-8059-48da-a5cf-af24631e9999: Claiming unknown
Feb 1 04:52:33 localhost systemd-udevd[306027]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:52:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:33.908 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-f1c9407b-b174-4a64-ba99-055b15628d3f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1c9407b-b174-4a64-ba99-055b15628d3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef9394e0b21548a491d64bf76f5f6368', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2de141c9-8cc9-4b0e-ba0d-113cc86928d7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=377a7a10-8059-48da-a5cf-af24631e9999) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:52:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:33.910 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 377a7a10-8059-48da-a5cf-af24631e9999 in datapath f1c9407b-b174-4a64-ba99-055b15628d3f bound to our chassis#033[00m
Feb 1 04:52:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:33.911 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f1c9407b-b174-4a64-ba99-055b15628d3f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:52:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:33.912 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[b0caaf77-6824-4a16-ab3f-be7284d5bca7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost ovn_controller[152492]: 2026-02-01T09:52:33Z|00084|binding|INFO|Setting lport 377a7a10-8059-48da-a5cf-af24631e9999 ovn-installed in OVS
Feb 1 04:52:33 localhost ovn_controller[152492]: 2026-02-01T09:52:33Z|00085|binding|INFO|Setting lport 377a7a10-8059-48da-a5cf-af24631e9999 up in Southbound
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.928 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.931 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost journal[217584]: ethtool ioctl error on tap377a7a10-80: No such device
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:33 localhost nova_compute[274651]: 2026-02-01 09:52:33.993 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:52:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2572148168' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:52:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:52:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2572148168' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:52:34 localhost podman[306098]:
Feb 1 04:52:34 localhost podman[306098]: 2026-02-01 09:52:34.944540662 +0000 UTC m=+0.080224035 container create 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:52:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:52:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:52:34 localhost systemd[1]: Started libpod-conmon-431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c.scope.
Feb 1 04:52:35 localhost systemd[1]: Started libcrun container.
Feb 1 04:52:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6401b23b324e1ba0152b3e13ee96bde9897fa8619d4a073063f4820d9dc3d43e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:52:35 localhost podman[306098]: 2026-02-01 09:52:34.908718227 +0000 UTC m=+0.044401620 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:52:35 localhost podman[306098]: 2026-02-01 09:52:35.016718267 +0000 UTC m=+0.152401710 container init 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:52:35 localhost podman[306098]: 2026-02-01 09:52:35.03367261 +0000 UTC m=+0.169355993 container start 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:52:35 localhost dnsmasq[306140]: started, version 2.85 cachesize 150
Feb 1 04:52:35 localhost dnsmasq[306140]: DNS service limited to local subnets
Feb 1 04:52:35 localhost dnsmasq[306140]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:52:35 localhost dnsmasq[306140]: warning: no upstream servers configured
Feb 1 04:52:35 localhost dnsmasq-dhcp[306140]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:52:35 localhost dnsmasq[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/addn_hosts - 0 addresses
Feb 1 04:52:35 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/host
Feb 1 04:52:35 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/opts
Feb 1 04:52:35 localhost podman[306113]: 2026-02-01 09:52:35.099055157 +0000 UTC m=+0.109682274 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:52:35 localhost podman[306113]: 2026-02-01 09:52:35.175238466 +0000 UTC m=+0.185865593 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 1 04:52:35 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:52:35 localhost podman[306112]: 2026-02-01 09:52:35.185070308 +0000 UTC m=+0.193945451 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 04:52:35 localhost podman[306112]: 2026-02-01 09:52:35.194267243 +0000 UTC m=+0.203142386 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:52:35 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:52:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:35.319 259320 INFO neutron.agent.dhcp.agent [None req-1dfe6e79-266d-4c42-8bb7-9f54e8d20444 - - - - - -] DHCP configuration for ports {'8cddfd60-04be-4291-831e-91f305a85c49'} is completed#033[00m
Feb 1 04:52:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e94 do_prune osdmap full prune enabled
Feb 1 04:52:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e95 e95: 6 total, 6 up, 6 in
Feb 1 04:52:35 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in
Feb 1 04:52:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 1 04:52:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 3126 writes, 26K keys, 3126 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.10 MB/s#012Cumulative WAL: 3126 writes, 3126 syncs, 1.00 writes per sync, written: 0.06 GB, 0.10 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3126 writes, 26K keys, 3126 commit groups, 1.0 writes per commit group, ingest: 60.70 MB, 0.10 MB/s#012Interval WAL: 3126 writes, 3126 syncs, 1.00 writes per sync, written: 0.06 GB, 0.10 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 56.5 0.76 0.11 12 0.063 0 0 0.0 0.0#012 L6 1/0 20.72 MB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 5.1 197.9 179.6 1.21 0.55 11 0.110 127K 5539 0.0 0.0#012 Sum 1/0 20.72 MB 0.0 0.2 0.0 0.2 0.3 0.1 0.0 6.1 121.6 132.2 1.96 0.65 23 0.085 127K 5539 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.3 0.1 0.0 6.1 121.7 132.3 1.96 0.65 22 0.089 127K 5539 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 0.0 197.9 179.6 1.21 0.55 11 0.110 127K 5539 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 56.6 0.75 0.11 11 0.069 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.042, interval 0.042#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.43 MB/s write, 0.23 GB read, 0.40 MB/s read, 2.0 seconds#012Interval compaction: 0.25 GB write, 0.43 MB/s write, 0.23 GB read, 0.40 MB/s read, 2.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558a91fa9350#2 capacity: 308.00 MB usage: 22.76 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000149 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(951,21.89 MB,7.10655%) FilterBlock(23,380.17 KB,0.120539%) IndexBlock(23,509.53 KB,0.161555%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Feb 1 04:52:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:52:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:52:36 localhost nova_compute[274651]: 2026-02-01 09:52:36.853 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:52:37 localhost ovn_controller[152492]: 2026-02-01T09:52:37Z|00086|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:52:37 localhost nova_compute[274651]: 2026-02-01 09:52:37.337 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:37 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:37.853 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:37Z, description=, device_id=23b109e5-0b85-4fa5-97f6-89d2144354c5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6979e83a-da21-4dcc-b19a-a99fb1e67032, ip_allocation=immediate, mac_address=fa:16:3e:6b:e4:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:31Z, description=, dns_domain=, id=f1c9407b-b174-4a64-ba99-055b15628d3f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-933816886-network, port_security_enabled=True, project_id=ef9394e0b21548a491d64bf76f5f6368, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63793, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=497, status=ACTIVE, subnets=['63c9414d-6ef7-416f-b1b2-fe6269d8c02a'], tags=[], tenant_id=ef9394e0b21548a491d64bf76f5f6368, updated_at=2026-02-01T09:52:33Z, vlan_transparent=None, network_id=f1c9407b-b174-4a64-ba99-055b15628d3f, port_security_enabled=False, project_id=ef9394e0b21548a491d64bf76f5f6368, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=525, status=DOWN, tags=[], tenant_id=ef9394e0b21548a491d64bf76f5f6368, updated_at=2026-02-01T09:52:37Z on network f1c9407b-b174-4a64-ba99-055b15628d3f#033[00m
Feb 1 04:52:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e95 do_prune osdmap full prune enabled
Feb 1 04:52:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e96 e96: 6 total, 6 up, 6 in
Feb 1 04:52:37 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in
Feb 1 04:52:38 localhost podman[306182]: 2026-02-01 09:52:38.14484667 +0000 UTC m=+0.063303963 container kill 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 1 04:52:38 localhost dnsmasq[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/addn_hosts - 1 addresses
Feb 1 04:52:38 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/host
Feb 1 04:52:38 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/opts
Feb 1 04:52:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:38.367 259320 INFO neutron.agent.dhcp.agent [None req-a154039c-5ba6-477d-ad5d-bc98ce45f3ac - - - - - -] DHCP configuration for ports {'6979e83a-da21-4dcc-b19a-a99fb1e67032'} is completed#033[00m
Feb 1 04:52:38 localhost nova_compute[274651]: 2026-02-01 09:52:38.767 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e96 do_prune osdmap full prune enabled
Feb 1 04:52:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e97 e97: 6 total, 6 up, 6 in
Feb 1 04:52:38 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in
Feb 1 04:52:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:38.961 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:37Z, description=, device_id=23b109e5-0b85-4fa5-97f6-89d2144354c5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6979e83a-da21-4dcc-b19a-a99fb1e67032, ip_allocation=immediate, mac_address=fa:16:3e:6b:e4:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:31Z, description=, dns_domain=, id=f1c9407b-b174-4a64-ba99-055b15628d3f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-933816886-network, port_security_enabled=True, project_id=ef9394e0b21548a491d64bf76f5f6368, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63793, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=497, status=ACTIVE, subnets=['63c9414d-6ef7-416f-b1b2-fe6269d8c02a'], tags=[], tenant_id=ef9394e0b21548a491d64bf76f5f6368, updated_at=2026-02-01T09:52:33Z, vlan_transparent=None, network_id=f1c9407b-b174-4a64-ba99-055b15628d3f, port_security_enabled=False, project_id=ef9394e0b21548a491d64bf76f5f6368, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=525, status=DOWN, tags=[], tenant_id=ef9394e0b21548a491d64bf76f5f6368, updated_at=2026-02-01T09:52:37Z on network f1c9407b-b174-4a64-ba99-055b15628d3f#033[00m
Feb 1 04:52:39 localhost podman[306218]: 2026-02-01 09:52:39.167690442 +0000 UTC m=+0.062074755 container kill 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 1 04:52:39 localhost dnsmasq[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/addn_hosts - 1 addresses
Feb 1 04:52:39 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/host
Feb 1 04:52:39 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/opts
Feb 1 04:52:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:39.427 259320 INFO neutron.agent.dhcp.agent [None req-f2b42d6a-7fca-4c3d-868b-3688a9bd0768 - - - - - -] DHCP configuration for ports {'6979e83a-da21-4dcc-b19a-a99fb1e67032'} is completed#033[00m
Feb 1 04:52:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e97 do_prune osdmap full prune enabled
Feb 1 04:52:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e98 e98: 6 total, 6 up, 6 in
Feb 1 04:52:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in
Feb 1 04:52:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:52:41 localhost podman[306239]: 2026-02-01 09:52:41.707129611 +0000 UTC m=+0.066363308 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, version=9.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.) 
Feb 1 04:52:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:41.715 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:52:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:41.716 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:52:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:41.716 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:52:41 localhost podman[306239]: 2026-02-01 09:52:41.747581528 +0000 UTC m=+0.106815235 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:52:41 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:52:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:41.843 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:02Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa, ip_allocation=immediate, mac_address=fa:16:3e:6e:4d:83, name=tempest-parent-377096059, network_id=01cb494b-1310-460f-acbe-602aefea39c6, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=14, security_groups=['f05aaf36-904c-44ae-a203-34e61744db7d'], standard_attr_id=270, status=DOWN, tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, trunk_details=sub_ports=[], trunk_id=f67e69ab-d9db-4114-bb34-820d1f8fcee5, updated_at=2026-02-01T09:52:41Z on network 01cb494b-1310-460f-acbe-602aefea39c6#033[00m
Feb 1 04:52:42 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 2 addresses
Feb 1 04:52:42 localhost podman[306277]: 2026-02-01 09:52:42.094136486 +0000 UTC m=+0.048582620 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 1 04:52:42 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host
Feb 1 04:52:42 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts
Feb 1 04:52:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:42.263 259320 INFO neutron.agent.dhcp.agent [None req-4f12d18e-eb9a-4656-b8b9-48b2b6f5cfaf - - - - - -] DHCP configuration for ports {'96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa'} is completed#033[00m
Feb 1 04:52:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:52:42.618 2 INFO neutron.agent.securitygroups_rpc [None req-c376665f-557d-4fc0-a2ce-3b9dbb425e99 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']#033[00m
Feb 1 04:52:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:43.559 259320 INFO neutron.agent.linux.ip_lib [None req-ee92d0b2-6a42-4772-8c6e-31e61e0c7859 - - - - - -] Device tape817747d-24 cannot be used as it has no MAC address#033[00m
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.583 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:43 localhost kernel: device tape817747d-24 entered promiscuous mode
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.593 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:43 localhost ovn_controller[152492]: 2026-02-01T09:52:43Z|00087|binding|INFO|Claiming lport e817747d-24b7-49b2-85e1-ef82dbfc6eed for this chassis.
Feb 1 04:52:43 localhost ovn_controller[152492]: 2026-02-01T09:52:43Z|00088|binding|INFO|e817747d-24b7-49b2-85e1-ef82dbfc6eed: Claiming unknown
Feb 1 04:52:43 localhost NetworkManager[5964]: [1769939563.5969] manager: (tape817747d-24): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Feb 1 04:52:43 localhost systemd-udevd[306309]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:52:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:43.606 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-86002929-7586-4966-8ae7-9d2dade18982', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86002929-7586-4966-8ae7-9d2dade18982', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ae7782b6c19412abff1de327bcab6bf', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa27c594-ab1a-4757-b206-f23ee7c3767c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e817747d-24b7-49b2-85e1-ef82dbfc6eed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:52:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:43.610 158365 INFO neutron.agent.ovn.metadata.agent [-] Port e817747d-24b7-49b2-85e1-ef82dbfc6eed in datapath 86002929-7586-4966-8ae7-9d2dade18982 bound to our chassis#033[00m
Feb 1 04:52:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:43.614 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4f60c72d-31b4-4794-86d3-586ce8857f1b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 1 04:52:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:43.615 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 86002929-7586-4966-8ae7-9d2dade18982, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:52:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:43.616 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[7f8def28-11cc-441f-b689-c923beaadadf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.641 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:43 localhost ovn_controller[152492]: 2026-02-01T09:52:43Z|00089|binding|INFO|Setting lport e817747d-24b7-49b2-85e1-ef82dbfc6eed ovn-installed in OVS
Feb 1 04:52:43 localhost ovn_controller[152492]: 2026-02-01T09:52:43Z|00090|binding|INFO|Setting lport e817747d-24b7-49b2-85e1-ef82dbfc6eed up in Southbound
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.642 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.647 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.673 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:43 localhost nova_compute[274651]: 2026-02-01 09:52:43.772 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:44 localhost podman[306362]:
Feb 1 04:52:44 localhost podman[306362]: 2026-02-01 09:52:44.503406072 +0000 UTC m=+0.092625328 container create ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:52:44 localhost systemd[1]: Started libpod-conmon-ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104.scope.
Feb 1 04:52:44 localhost podman[306362]: 2026-02-01 09:52:44.45795173 +0000 UTC m=+0.047171026 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:52:44 localhost systemd[1]: Started libcrun container.
Feb 1 04:52:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73bd4b69fa62e7385f2f2d34bf53f8c24bd423a6199c9208614b953ec4148b92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:52:44 localhost podman[306362]: 2026-02-01 09:52:44.574359339 +0000 UTC m=+0.163578565 container init ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:52:44 localhost podman[306362]: 2026-02-01 09:52:44.584989117 +0000 UTC m=+0.174208373 container start ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:52:44 localhost dnsmasq[306380]: started, version 2.85 cachesize 150
Feb 1 04:52:44 localhost dnsmasq[306380]: DNS service limited to local subnets
Feb 1 04:52:44 localhost dnsmasq[306380]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:52:44 localhost dnsmasq[306380]: warning: no upstream servers configured
Feb 1 04:52:44 localhost dnsmasq-dhcp[306380]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:52:44 localhost dnsmasq[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/addn_hosts - 0 addresses
Feb 1 04:52:44 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/host
Feb 1 04:52:44 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/opts
Feb 1 04:52:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:44.757 259320 INFO neutron.agent.dhcp.agent [None req-83d622a7-4f72-4d10-a1c7-0a10fe647e55 - - - - - -] DHCP configuration for ports {'9cff9a2d-2166-42c0-878c-42c799a53c16'} is completed#033[00m
Feb 1 04:52:44 localhost neutron_sriov_agent[252126]: 2026-02-01 09:52:44.870 2 INFO neutron.agent.securitygroups_rpc [None req-651e6fa7-c546-4929-9315-764ba9e33bc3 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']#033[00m
Feb 1 04:52:45 localhost nova_compute[274651]: 2026-02-01 09:52:45.079 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:46.204 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:45Z, description=, device_id=bbfbf3e9-aa2c-46e8-8fba-b8a5cad0c801, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=492ff0e7-d03b-43dd-a8fb-4a9a7e1fd6dd, ip_allocation=immediate, mac_address=fa:16:3e:59:48:fe, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:40Z, description=, dns_domain=, id=86002929-7586-4966-8ae7-9d2dade18982, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-574697347-network, port_security_enabled=True, project_id=7ae7782b6c19412abff1de327bcab6bf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30651, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=541, status=ACTIVE, subnets=['3747f4bd-5591-4718-8fac-342845066dfe'], tags=[], tenant_id=7ae7782b6c19412abff1de327bcab6bf, updated_at=2026-02-01T09:52:41Z, vlan_transparent=None, network_id=86002929-7586-4966-8ae7-9d2dade18982, port_security_enabled=False, project_id=7ae7782b6c19412abff1de327bcab6bf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=580, status=DOWN, tags=[], tenant_id=7ae7782b6c19412abff1de327bcab6bf, updated_at=2026-02-01T09:52:45Z on network 86002929-7586-4966-8ae7-9d2dade18982#033[00m
Feb 1 04:52:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e98 do_prune osdmap full prune enabled
Feb 1 04:52:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e99 e99: 6 total, 6 up, 6 in
Feb 1 04:52:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in
Feb 1 04:52:46 localhost dnsmasq[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/addn_hosts - 1 addresses
Feb 1 04:52:46 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/host
Feb 1 04:52:46 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/opts
Feb 1 04:52:46 localhost podman[306398]: 2026-02-01 09:52:46.440161157 +0000 UTC m=+0.055478532 container kill ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:52:46 localhost systemd[1]: tmp-crun.mjnN0T.mount: Deactivated successfully.
Feb 1 04:52:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:46.635 259320 INFO neutron.agent.dhcp.agent [None req-57993243-63b8-42c7-9441-38a6ca4b6be5 - - - - - -] DHCP configuration for ports {'492ff0e7-d03b-43dd-a8fb-4a9a7e1fd6dd'} is completed#033[00m
Feb 1 04:52:46 localhost neutron_sriov_agent[252126]: 2026-02-01 09:52:46.678 2 INFO neutron.agent.securitygroups_rpc [None req-2d42c471-7b15-4400-a993-fcf3849484f7 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']#033[00m
Feb 1 04:52:47 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:47.147 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:45Z, description=, device_id=bbfbf3e9-aa2c-46e8-8fba-b8a5cad0c801, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=492ff0e7-d03b-43dd-a8fb-4a9a7e1fd6dd, ip_allocation=immediate, mac_address=fa:16:3e:59:48:fe, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:40Z, description=, dns_domain=, id=86002929-7586-4966-8ae7-9d2dade18982, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-574697347-network, port_security_enabled=True, project_id=7ae7782b6c19412abff1de327bcab6bf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30651, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=541, status=ACTIVE, subnets=['3747f4bd-5591-4718-8fac-342845066dfe'], tags=[], tenant_id=7ae7782b6c19412abff1de327bcab6bf, updated_at=2026-02-01T09:52:41Z, vlan_transparent=None, network_id=86002929-7586-4966-8ae7-9d2dade18982, port_security_enabled=False, project_id=7ae7782b6c19412abff1de327bcab6bf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=580, status=DOWN, tags=[], tenant_id=7ae7782b6c19412abff1de327bcab6bf, updated_at=2026-02-01T09:52:45Z on network 86002929-7586-4966-8ae7-9d2dade18982#033[00m
Feb 1 04:52:47 localhost dnsmasq[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/addn_hosts - 1 addresses
Feb 1 04:52:47 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/host
Feb 1 04:52:47 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/opts
Feb 1 04:52:47 localhost podman[306438]: 2026-02-01 09:52:47.374719746 +0000 UTC m=+0.061468486 container kill ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:52:47 localhost neutron_sriov_agent[252126]: 2026-02-01 09:52:47.415 2 INFO neutron.agent.securitygroups_rpc [None req-0fc1e61c-d2d8-4451-b527-5803b4fad28d 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']#033[00m
Feb 1 04:52:47 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:47.590 259320 INFO neutron.agent.dhcp.agent [None req-e1f19a8c-8712-46fb-8661-c7fe1565fa88 - - - - - -] DHCP configuration for ports {'492ff0e7-d03b-43dd-a8fb-4a9a7e1fd6dd'} is completed#033[00m
Feb 1 04:52:47 localhost podman[306475]: 2026-02-01 09:52:47.636506199 +0000 UTC m=+0.055558395 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:52:47 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 1 addresses
Feb 1 04:52:47 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host
Feb 1 04:52:47 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts
Feb 1 04:52:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e99 do_prune osdmap full prune enabled
Feb 1 04:52:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e100 e100: 6 total, 6 up, 6 in
Feb 1 04:52:48 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in
Feb 1 04:52:48 localhost nova_compute[274651]: 2026-02-01 09:52:48.818 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:52:48 localhost podman[306514]: 2026-02-01 09:52:48.86198723 +0000 UTC m=+0.093400162 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:52:48 localhost dnsmasq[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/addn_hosts - 0 addresses
Feb 1 04:52:48 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/host
Feb 1 04:52:48 localhost dnsmasq-dhcp[305266]: read /var/lib/neutron/dhcp/01cb494b-1310-460f-acbe-602aefea39c6/opts
Feb 1 04:52:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:52:48 localhost systemd[1]: tmp-crun.ieaJSZ.mount: Deactivated successfully.
Feb 1 04:52:48 localhost podman[306529]: 2026-02-01 09:52:48.970574418 +0000 UTC m=+0.084087574 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:52:48 localhost podman[306529]: 2026-02-01 09:52:48.983343912 +0000 UTC m=+0.096857068 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 1 04:52:48 localhost kernel: device tapd6b66c0b-0e left promiscuous mode
Feb 1 04:52:48 localhost nova_compute[274651]: 2026-02-01 09:52:48.984 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:48 localhost ovn_controller[152492]: 2026-02-01T09:52:48Z|00091|binding|INFO|Releasing lport d6b66c0b-0edb-42de-acf8-2a23e46df449 from this chassis (sb_readonly=0)
Feb 1 04:52:48 localhost ovn_controller[152492]: 2026-02-01T09:52:48Z|00092|binding|INFO|Setting lport d6b66c0b-0edb-42de-acf8-2a23e46df449 down in Southbound
Feb 1 04:52:48 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:52:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:49.000 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-01cb494b-1310-460f-acbe-602aefea39c6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01cb494b-1310-460f-acbe-602aefea39c6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae7d4c2f-1d19-4933-99fa-b8aa62feb38e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d6b66c0b-0edb-42de-acf8-2a23e46df449) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:52:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:49.003 158365 INFO neutron.agent.ovn.metadata.agent [-] Port d6b66c0b-0edb-42de-acf8-2a23e46df449 in datapath 01cb494b-1310-460f-acbe-602aefea39c6 unbound from our chassis
Feb 1 04:52:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:49.006 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01cb494b-1310-460f-acbe-602aefea39c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:52:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:49.007 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[97327284-bedb-4324-a10c-6ed2ae100b79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:52:49 localhost nova_compute[274651]: 2026-02-01 09:52:49.009 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:49 localhost nova_compute[274651]: 2026-02-01 09:52:49.010 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:49 localhost nova_compute[274651]: 2026-02-01 09:52:49.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:50 localhost nova_compute[274651]: 2026-02-01 09:52:50.265 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:50 localhost nova_compute[274651]: 2026-02-01 09:52:50.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:50 localhost nova_compute[274651]: 2026-02-01 09:52:50.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 04:52:50 localhost dnsmasq[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/addn_hosts - 0 addresses
Feb 1 04:52:50 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/host
Feb 1 04:52:50 localhost podman[306573]: 2026-02-01 09:52:50.387746841 +0000 UTC m=+0.059279449 container kill ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:52:50 localhost dnsmasq-dhcp[306380]: read /var/lib/neutron/dhcp/86002929-7586-4966-8ae7-9d2dade18982/opts
Feb 1 04:52:50 localhost nova_compute[274651]: 2026-02-01 09:52:50.505 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:50.505 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:52:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:50.507 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 1 04:52:50 localhost nova_compute[274651]: 2026-02-01 09:52:50.565 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:50 localhost ovn_controller[152492]: 2026-02-01T09:52:50Z|00093|binding|INFO|Releasing lport e817747d-24b7-49b2-85e1-ef82dbfc6eed from this chassis (sb_readonly=0)
Feb 1 04:52:50 localhost kernel: device tape817747d-24 left promiscuous mode
Feb 1 04:52:50 localhost ovn_controller[152492]: 2026-02-01T09:52:50Z|00094|binding|INFO|Setting lport e817747d-24b7-49b2-85e1-ef82dbfc6eed down in Southbound
Feb 1 04:52:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:50.577 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-86002929-7586-4966-8ae7-9d2dade18982', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86002929-7586-4966-8ae7-9d2dade18982', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ae7782b6c19412abff1de327bcab6bf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa27c594-ab1a-4757-b206-f23ee7c3767c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e817747d-24b7-49b2-85e1-ef82dbfc6eed) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:52:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:50.580 158365 INFO neutron.agent.ovn.metadata.agent [-] Port e817747d-24b7-49b2-85e1-ef82dbfc6eed in datapath 86002929-7586-4966-8ae7-9d2dade18982 unbound from our chassis
Feb 1 04:52:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:50.583 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 86002929-7586-4966-8ae7-9d2dade18982, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:52:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:50.584 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[e1caedcd-3a4e-46bf-ab23-b121b689dbec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:52:50 localhost nova_compute[274651]: 2026-02-01 09:52:50.592 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:51 localhost ovn_controller[152492]: 2026-02-01T09:52:51Z|00095|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.085 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.272 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.362 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.363 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.363 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.364 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:52:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:51 localhost dnsmasq[305266]: exiting on receipt of SIGTERM
Feb 1 04:52:51 localhost podman[306611]: 2026-02-01 09:52:51.660065567 +0000 UTC m=+0.057489953 container kill a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:52:51 localhost systemd[1]: libpod-a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32.scope: Deactivated successfully.
Feb 1 04:52:51 localhost podman[306631]: 2026-02-01 09:52:51.724271757 +0000 UTC m=+0.040797009 container died a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:52:51 localhost systemd[1]: tmp-crun.2q7CGE.mount: Deactivated successfully.
Feb 1 04:52:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32-userdata-shm.mount: Deactivated successfully.
Feb 1 04:52:51 localhost systemd[1]: var-lib-containers-storage-overlay-3f3642e65e1a67f740d3f6658334c166a76187026a6295041dcedec21bc8a1d9-merged.mount: Deactivated successfully.
Feb 1 04:52:51 localhost podman[306631]: 2026-02-01 09:52:51.768080658 +0000 UTC m=+0.084605800 container remove a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:52:51 localhost systemd[1]: libpod-conmon-a9aef9b100bed7037f8da9e2263353420780082e90a9d50925594cc8ec316c32.scope: Deactivated successfully.
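When a Port_Binding row flips up=[True] to up=[False] and loses its chassis, the metadata agent logs the "Port ... unbound from our chassis" INFO lines seen above and tears down the network namespace if no VIF ports remain. A purely log-scraping sketch for pulling those events out of this journal; the regex targets the exact message format above:

    import re

    # Matches the ovn_metadata_agent INFO lines logged above.
    UNBOUND = re.compile(
        r"Port (?P<port>[0-9a-f-]{36}) in datapath (?P<dp>[0-9a-f-]{36}) "
        r"unbound from our chassis"
    )

    def unbound_ports(journal_lines):
        """Yield (logical_port, datapath) pairs from journal text."""
        for line in journal_lines:
            m = UNBOUND.search(line)
            if m:
                yield m.group("port"), m.group("dp")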
Feb 1 04:52:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:51.788 259320 INFO neutron.agent.dhcp.agent [None req-ec192300-4eb5-48cd-8097-2eba31cc8b27 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.913 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.931 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.931 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.932 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:51 localhost nova_compute[274651]: 2026-02-01 09:52:51.932 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:51.947 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:52 localhost systemd[1]: run-netns-qdhcp\x2d01cb494b\x2d1310\x2d460f\x2dacbe\x2d602aefea39c6.mount: Deactivated successfully.
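The "Updating instance_info_cache with network_info" entry above carries a JSON list of VIFs; each subnet's "ips" holds the fixed address and any attached "floating_ips". A reading aid for that structure (taken from that log entry, not nova's own parser):

    import json

    def addresses(network_info_json: str):
        """Extract fixed and floating IPs from a network_info blob shaped
        like the one logged above."""
        fixed, floating = [], []
        for vif in json.loads(network_info_json):
            for subnet in vif["network"]["subnets"]:
                for ip in subnet["ips"]:
                    fixed.append(ip["address"])
                    floating.extend(f["address"] for f in ip.get("floating_ips", []))
        return fixed, floating

    # For the entry above: (['192.168.0.12'], ['192.168.122.20'])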
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.295 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.295 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.296 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:52:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:52:53 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2238660147' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.747 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.807 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.808 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:52:53 localhost nova_compute[274651]: 2026-02-01 09:52:53.819 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:53 localhost podman[236886]: time="2026-02-01T09:52:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:52:53 localhost ovn_controller[152492]: 2026-02-01T09:52:53Z|00096|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:52:53 localhost podman[236886]: @ - - [01/Feb/2026:09:52:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160185 "" "Go-http-client/1.1"
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.018 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:54 localhost podman[236886]: @ - - [01/Feb/2026:09:52:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19770 "" "Go-http-client/1.1"
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.070 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.071 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11347MB free_disk=41.70050811767578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.072 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.072 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.179 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.180 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.180 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.238 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:52:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:52:54 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3534695093' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.707 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.714 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.743 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:52:54 localhost ovn_controller[152492]: 2026-02-01T09:52:54Z|00097|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.779 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.779 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:52:54 localhost nova_compute[274651]: 2026-02-01 09:52:54.807 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:54 localhost systemd[1]: tmp-crun.gALnDy.mount: Deactivated successfully.
Feb 1 04:52:54 localhost dnsmasq[306380]: exiting on receipt of SIGTERM
Feb 1 04:52:54 localhost podman[306713]: 2026-02-01 09:52:54.991190212 +0000 UTC m=+0.066898564 container kill ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:52:54 localhost systemd[1]: libpod-ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104.scope: Deactivated successfully.
Feb 1 04:52:55 localhost podman[306728]: 2026-02-01 09:52:55.068797525 +0000 UTC m=+0.055656138 container died ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:52:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104-userdata-shm.mount: Deactivated successfully.
Feb 1 04:52:55 localhost podman[306728]: 2026-02-01 09:52:55.163375321 +0000 UTC m=+0.150233884 container remove ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86002929-7586-4966-8ae7-9d2dade18982, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:52:55 localhost systemd[1]: libpod-conmon-ac40006731eac9c55259fcd7c4ccc99ff54fdb1247c0dc92375acd12f595f104.scope: Deactivated successfully.
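During update_available_resource the resource tracker shells out to ceph df twice above (0.451 s and 0.469 s per call, command string copied verbatim from the log). A sketch that re-runs the same command and reports cluster-wide free space; the "stats"/"total_avail_bytes" JSON keys are standard ceph df output but worth verifying on your release:

    import json
    import subprocess

    # Exact command nova_compute logged via oslo_concurrency.processutils.
    CMD = ["ceph", "df", "--format=json", "--id", "openstack",
           "--conf", "/etc/ceph/ceph.conf"]

    def cluster_free_gb() -> float:
        out = subprocess.run(CMD, check=True, capture_output=True, text=True).stdout
        stats = json.loads(out)["stats"]  # assumption: standard ceph df JSON layout
        return stats["total_avail_bytes"] / 1024 ** 3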
Feb 1 04:52:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:55.386 259320 INFO neutron.agent.dhcp.agent [None req-5ff5ee6a-efd3-4349-b61d-8572567a9cec - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:55.445 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:52:55.694 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:52:55 localhost systemd[1]: var-lib-containers-storage-overlay-73bd4b69fa62e7385f2f2d34bf53f8c24bd423a6199c9208614b953ec4148b92-merged.mount: Deactivated successfully.
Feb 1 04:52:55 localhost systemd[1]: run-netns-qdhcp\x2d86002929\x2d7586\x2d4966\x2d8ae7\x2d9d2dade18982.mount: Deactivated successfully.
Feb 1 04:52:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:52:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e100 do_prune osdmap full prune enabled
Feb 1 04:52:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e101 e101: 6 total, 6 up, 6 in
Feb 1 04:52:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in
Feb 1 04:52:56 localhost nova_compute[274651]: 2026-02-01 09:52:56.596 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:56 localhost nova_compute[274651]: 2026-02-01 09:52:56.780 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:56 localhost nova_compute[274651]: 2026-02-01 09:52:56.781 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
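The monitor keeps logging "osdmap eNNN: 6 total, 6 up, 6 in" as it prunes and republishes the map. The same counters can be fetched directly from the cluster; the JSON key names below are an assumption based on current ceph releases and should be checked against yours:

    import json
    import subprocess

    def osd_counts():
        """Return (total, up, in) OSD counts, matching the osdmap log lines."""
        out = subprocess.run(["ceph", "osd", "stat", "--format=json"],
                             check=True, capture_output=True, text=True).stdout
        s = json.loads(out)
        return s["num_osds"], s["num_up_osds"], s["num_in_osds"]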
Feb 1 04:52:58 localhost podman[306754]: 2026-02-01 09:52:58.7297787 +0000 UTC m=+0.086176928 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:52:58 localhost podman[306754]: 2026-02-01 09:52:58.738165779 +0000 UTC m=+0.094563977 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:52:58 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
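Each transient "/usr/bin/podman healthcheck run <id>" unit above executes the container's configured test (visible in config_data's 'healthcheck' entry) and the exec_died event records the outcome, here health_status=healthy. podman exits 0 when the check passes, so a wrapper can key off the return code:

    import subprocess

    def is_healthy(container: str) -> bool:
        """One-shot healthcheck, same invocation the systemd units log above;
        relies on podman's documented 0 = healthy exit code."""
        return subprocess.run(
            ["podman", "healthcheck", "run", container],
            capture_output=True,
        ).returncode == 0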
Feb 1 04:52:58 localhost nova_compute[274651]: 2026-02-01 09:52:58.865 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:59 localhost nova_compute[274651]: 2026-02-01 09:52:59.410 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:52:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:52:59.509 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 1 04:53:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:01 localhost openstack_network_exporter[239441]: ERROR 09:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:53:01 localhost openstack_network_exporter[239441]:
Feb 1 04:53:01 localhost openstack_network_exporter[239441]: ERROR 09:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:53:01 localhost openstack_network_exporter[239441]:
Feb 1 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:53:01 localhost podman[306778]: 2026-02-01 09:53:01.724629386 +0000 UTC m=+0.085602690 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:53:01 localhost podman[306778]: 2026-02-01 09:53:01.759495622 +0000 UTC m=+0.120468946 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:53:01 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
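The ovn_metadata_agent transaction above is ovsdbapp's DbSetCommand writing neutron:ovn-metadata-sb-cfg into the agent's Chassis_Private row, acknowledging the SB_Global nb_cfg bump it observed at 09:52:50 after the configured 9-second delay. A hedged sketch of issuing the same write through ovsdbapp; the southbound endpoint is a placeholder (not from the log), and the class/method names should be verified against your ovsdbapp version:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.ovn_southbound import impl_idl

    SB = "tcp:127.0.0.1:6642"  # placeholder endpoint, not from the log

    idl = connection.OvsdbIdl.from_server(SB, "OVN_Southbound")
    api = impl_idl.OvnSbApiIdlImpl(connection.Connection(idl, timeout=10))

    # Same shape as the logged DbSetCommand: one column/value pair on the
    # Chassis_Private record named in the log (which also passed if_exists=True).
    api.db_set(
        "Chassis_Private", "e1d14e36-ae9d-43b6-8933-f137b54529ff",
        ("external_ids", {"neutron:ovn-metadata-sb-cfg": "9"}),
    ).execute(check_error=True)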
Feb 1 04:53:03 localhost ovn_controller[152492]: 2026-02-01T09:53:03Z|00098|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:53:03 localhost nova_compute[274651]: 2026-02-01 09:53:03.334 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e101 do_prune osdmap full prune enabled Feb 1 04:53:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e102 e102: 6 total, 6 up, 6 in Feb 1 04:53:03 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.529 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.535 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '00753e40-e287-4b46-a588-e0e6a1cf0c86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.530265', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd02ba44-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '68ac290d61879d0cbd3b1bef93fcc46b082fb64ba774bb5ad94b5398b6410677'}]}, 'timestamp': '2026-02-01 09:53:03.536524', '_unique_id': '19a5758920ad4ebba9959333f4b3aca5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.538 12 ERROR oslo_messaging.notify.messaging
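The inner traceback above bottoms out in amqp's transport layer: a plain TCP connect that the broker host refuses. A minimal sketch of the same reachability check, useful for confirming from this compute node whether anything is listening on the broker port; the host and port below are assumptions, since the real values come from the transport_url that oslo.messaging hands to kombu/amqp:

```python
import socket

# Hypothetical broker endpoint; substitute the host/port from the
# deployment's transport_url (ceilometer.conf / oslo.messaging config).
HOST, PORT = "controller.localdomain", 5672

try:
    # amqp.transport._connect() ultimately performs a TCP connect like
    # this one; "[Errno 111] Connection refused" means no listener.
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is accepting connections")
except OSError as exc:
    print(f"{HOST}:{PORT} refused/unreachable: {exc}")
```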
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.541 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.542 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.544 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '6ff8e873-e2ef-4f1b-8a45-581769c6fed0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.542028', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd03afe4-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': 'd3539c1e66b426179b4bb5cb9dcc2c199df064a117b47a225367f96a114e369b'}]}, 'timestamp': '2026-02-01 09:53:03.542902', '_unique_id': '62a33deed5e64227814b64c4d05f07f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
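Every failed sample in this stretch takes the same path shown in the first traceback: impl_rabbit's ensure_connection() into kombu's Connection.ensure_connection() and retry_over_time(), which re-raises the socket error as kombu.exceptions.OperationalError. A minimal sketch of that same retry path, with a hypothetical transport URL (the real one is derived from the service's oslo.messaging configuration):

```python
from kombu import Connection
from kombu.exceptions import OperationalError

# Hypothetical transport URL; oslo.messaging builds the real one from
# the deployment's transport_url setting.
url = "amqp://guest:guest@controller.localdomain:5672//"

try:
    with Connection(url) as conn:
        # Same call seen in the log's traceback; with the broker down it
        # raises OperationalError once the retries are exhausted.
        conn.ensure_connection(max_retries=3, interval_start=1, interval_step=2)
        print("broker reachable")
except OperationalError as exc:
    print(f"broker unreachable: {exc}")
```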
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.547 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.578 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.579 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'd0544ae3-ad90-4157-b6f5-771ca7a3e76e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.547512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd0942f6-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '4ea95f979fce5baf4ca2cf54f73d2b2ca9d47eefbe614e219cb8e9a3fc3250a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.547512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd09505c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': 'a4452607e7d1ed181c9b7607c8109cd82fbee1414c4dda58209d4ba4b6cc2d4d'}]}, 'timestamp': '2026-02-01 09:53:03.579499', '_unique_id': '5a1ae49dfaef45b38663abe66fbebfd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '225c2359-32fd-42c0-b42f-01914ca75c5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.581509', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd09ab2e-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '0921c065f03f60ca9cb9a7b0aa17e090a824b946fe6c4bd4a4d87eb43d04e70b'}]}, 'timestamp': '2026-02-01 09:53:03.581828', '_unique_id': '08b3cf3cb3874922be933ca519f62040'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
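The samples above carry counter_type 'delta' (already per-interval, like network.outgoing.bytes.delta) or 'cumulative' (monotonically growing totals, like network.incoming.bytes). A consumer turns two cumulative readings into a per-interval delta roughly as sketched below; the helper and the second reading (9120) are hypothetical, only the 6874 figure comes from the log:

```python
def cumulative_to_delta(prev, curr):
    """Per-interval delta from two cumulative (timestamp, volume) readings.

    If the counter decreased, assume it was reset (e.g. instance reboot)
    and take the new absolute value. Hypothetical helper, not ceilometer code.
    """
    (t0, v0), (t1, v1) = prev, curr
    delta = v1 - v0 if v1 >= v0 else v1
    return t1 - t0, delta

# Two network.incoming.bytes readings, 600 s apart (second value invented):
print(cumulative_to_delta((0, 6874), (600, 9120)))  # -> (600, 2246)
```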
Payload={'message_id': '50a356ba-8df0-4561-b81d-093392c53327', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.583234', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd09eda0-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '90682682115ceb339b832e1cc2c0abb20b1b09df1528fc36148c597f79ac45d2'}]}, 'timestamp': '2026-02-01 09:53:03.583524', '_unique_id': 'fe63b4f383874036b63a4d42d683756e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 ERROR oslo_messaging.notify.messaging
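The stack above is the complete failure chain for one dropped notification: the pollster gathers its sample normally, the AMQP socket connect is refused, and kombu wraps the low-level ConnectionRefusedError in kombu.exceptions.OperationalError before oslo.messaging gives up on the send. A minimal sketch that reproduces the same two-exception chain outside the agent, assuming (as in this log) that no broker is listening on the configured endpoint; the URL below is a placeholder, not this host's actual transport_url:

    import kombu

    # Placeholder URL: substitute the transport_url from the agent's config.
    conn = kombu.Connection("amqp://guest:guest@localhost:5672//")
    try:
        # Drives the same retry_over_time()/_establish_connection() path
        # that appears in the kombu frames of the traceback above.
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        print(exc)  # [Errno 111] Connection refused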
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.601 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e1df0dc-bcee-4bbc-9061-af0cbd343bd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:53:03.584920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'cd0cc980-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.821222042, 'message_signature': 'aa82cefa62ba71d93d8b1d9ac2698d1a48a255471a840fb54c63292313c2d92f'}]}, 'timestamp': '2026-02-01 09:53:03.602269', '_unique_id': '49bac668a5f44507aac94da5fb0ac515'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.603 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1e2fed7-93bc-4eb8-b82e-117cf7635d4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.603628', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd0d0a58-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '70bfcd03a8c4510b0063c3b57b1f29371e374b83bd9b725d21a8068ff15af271'}]}, 'timestamp': '2026-02-01 09:53:03.603921', '_unique_id': '138869dc541f43f0809f69f1e812623c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
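The same "Could not send notification ... Payload={...}" record, followed by a traceback identical to the one shown in full above, recurs below for every sample the agent polls in this cycle. To tally which meters are affected, a minimal sketch that scans a saved copy of this log (the file path argument is an assumption; it expects one record per line, as reflowed here):

    import re
    import sys
    from collections import Counter

    # Count dropped notifications per meter; pass the saved log file as argv[1].
    pattern = re.compile(r"Could not send notification .*?'counter_name': '([^']+)'")
    counts = Counter()
    with open(sys.argv[1], errors="replace") as log:
        for line in log:
            counts.update(pattern.findall(line))
    print(counts)  # e.g. Counter({'memory.usage': 1, ...})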
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.605 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.605 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c056368-5452-4b65-848f-90faecd44407', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.605727', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd0d5c24-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '834f94f960a28c9e73989277b1413cb5decd43c685bdd6d049b437d82cb308bd'}]}, 'timestamp': '2026-02-01 09:53:03.606038', '_unique_id': '4c80b7bec5fd487c8f130bf74953ec8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.616 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '721b379d-98c6-4570-bccf-0d6e6e7e1cfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.607336', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd0f0cc2-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.826749733, 'message_signature': 'a8d09a01decf3b630e55e49f6096ed0c8b95be0f0bea02bb223324df04b9838f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.607336', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd0f1866-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.826749733, 'message_signature': 'c88b84f3202ef277aad465ebcc8db83e08f1e6398d5d9ec313081bf8436432be'}]}, 'timestamp': '2026-02-01 09:53:03.617377', '_unique_id': 'b426bcf0ed2c4111ab5dc932cea3c890'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8e6f050-b825-44d2-8e88-ecc7d72fc5fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.618860', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd0f5e02-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '0da5a7a89dee431bd4bddf6250ee9fab09cb74eb49da6879d61a542999e01026'}]}, 'timestamp': '2026-02-01 09:53:03.619170', '_unique_id': '97e99067fba44eed8305da7778585e22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost
ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
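The traceback above captures the complete failure chain for these polling cycles: ceilometer hands each batch of samples to the oslo.messaging notifier, the rabbit driver asks kombu for a broker connection, and the socket-level ConnectionRefusedError (errno 111) is re-raised by kombu as kombu.exceptions.OperationalError. A minimal Python sketch that reproduces the same chain, assuming kombu is installed and nothing is listening on the broker port (the URL below is a placeholder, not this host's configured transport_url):

    import kombu

    # Placeholder broker URL; the real transport_url lives in ceilometer's config.
    conn = kombu.Connection("amqp://guest:guest@localhost:5672//")
    try:
        # ensure_connection() is the frame visible in the traceback
        # (kombu/connection.py -> _ensure_connection -> retry_over_time).
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        # kombu re-raises the underlying ConnectionRefusedError here.
        print(f"broker unreachable: {exc}")  # [Errno 111] Connection refused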
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.620 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '243bf3a4-e05c-49a3-9984-dcf33fed15aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.620561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd0f9fa2-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '380e0e3e45893e6436df150ba227c01b3cd7f4d25746c5a2b8f3b9e223acda40'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.620561', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd0fa9de-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': 'dab0f41a75415b2e214e4620335d113b20d308c230111c1b71fa0ee5492e8078'}]}, 'timestamp': '2026-02-01 09:53:03.621121', '_unique_id': 'c096728fe90b4e5ebd7894f5a96b4354'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.622 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
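Each "Could not send notification" record corresponds to one failed notifier call: the agent publishes a telemetry.polling event with priority SAMPLE under publisher_id ceilometer.polling. A minimal sketch of that emit path, assuming oslo.messaging is installed (the transport URL and empty payload below are placeholders):

    from oslo_config import cfg
    import oslo_messaging

    # Placeholder URL; with no broker listening, the messaging driver logs
    # the same "Could not send notification" error seen in these records.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@localhost:5672/")
    notifier = oslo_messaging.Notifier(transport,
                                       publisher_id="ceilometer.polling",
                                       driver="messaging",
                                       topics=["notifications"])
    # Priority 'SAMPLE' in the log corresponds to the sample() method.
    notifier.sample({}, event_type="telemetry.polling", payload={"samples": []})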
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1c64cf3-fb09-49ed-8515-14d2e593bc41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.622569', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd0fee6c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '6b89876dc6fe7241d6c30bcb250b42de576ed366bd929f40f3865a612a39879e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.622569', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd0ff86c-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': 'a5e2e6fe94c354cbb130e11f38d43e8e8baf342be92b76779980c421dec61374'}]}, 'timestamp': '2026-02-01 09:53:03.623147', '_unique_id': 'd24324094f704012b557decadc73b8ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.624 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 14960000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
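Every sample in these payloads also carries a 'message_signature': ceilometer computes an HMAC-SHA256 over the sample's sorted fields, keyed with the telemetry secret, so downstream consumers can verify integrity. A simplified sketch of that scheme (the field formatting is approximated and the secret below is invented):

    import hashlib
    import hmac

    def compute_signature(message, secret):
        # HMAC-SHA256 over sorted name=value pairs, skipping the
        # signature field itself (approximation of ceilometer's scheme).
        digest = hmac.new(secret, digestmod=hashlib.sha256)
        for name, value in sorted(message.items()):
            if name == 'message_signature':
                continue
            digest.update(('%s=%s' % (name, value)).encode('utf-8'))
        return digest.hexdigest()

    print(compute_signature({'counter_name': 'cpu', 'counter_volume': 14960000000},
                            b'invented-telemetry-secret'))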
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab80dfd7-0e95-4861-8825-95f0db86e519', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14960000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:53:03.624719', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'cd1041fa-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.821222042, 'message_signature': '30d86a66373003b874ce4d7a9cb933d26e03acd77950ef4ce6864156b4f7b849'}]}, 'timestamp': '2026-02-01 09:53:03.625020', '_unique_id': 'd1eea0828aa141a3820af47de687e2c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
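Note the counter types in these payloads: disk.device.usage is a 'gauge' (a point-in-time reading in bytes), while cpu and the disk read/write meters are 'cumulative' (monotonically growing totals). A consumer derives rates from cumulative meters by differencing consecutive polls; a toy example with invented values:

    def rate(prev, cur):
        # prev/cur: (unix_timestamp, counter_volume) from two polls of the
        # same cumulative meter, e.g. disk.device.write.bytes.
        dt = cur[0] - prev[0]
        return (cur[1] - prev[1]) / dt if dt > 0 else 0.0

    # Invented numbers: 61440 additional bytes over a 300-second interval.
    print(rate((1000.0, 389120), (1300.0, 450560)))  # -> 204.8 B/s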
Payload={'message_id': 'cf17222a-f423-493d-aca8-f325e2362664', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.626330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd1080e8-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.826749733, 'message_signature': '0f46cb3a7a222dab4585edd26c803f21dd54d405c4c46c50f8ba9da1abc9063b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.626330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd108b92-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.826749733, 'message_signature': '1fdbbbba4ca6ded9fd331726eaf7cff049e8584a5415bd540106362a9b6719fa'}]}, 'timestamp': '2026-02-01 09:53:03.626870', '_unique_id': '3c38d3f2f6124604974dfe29cbf0c255'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.627 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.628 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
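The failure above bottoms out in a plain TCP connect: amqp's transport hands self.sock.connect(sa) a broker address nothing is listening on, the kernel answers with ECONNREFUSED (errno 111), and kombu re-raises it as its own OperationalError via "raise ... from exc", which is exactly why the log shows the "direct cause" chain. A minimal, stdlib-only sketch of that errno and chaining pattern, assuming the broker's default AMQP port 5672 with no listener on it:

    import socket

    class OperationalError(Exception):
        """Stand-in for kombu.exceptions.OperationalError (illustration only)."""

    def connect_or_reraise(host="127.0.0.1", port=5672, timeout=1.0):
        # amqp's Transport._connect ultimately does the same thing: resolve
        # the address, create a socket, and call sock.connect().
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as exc:  # ConnectionRefusedError subclasses OSError
            # Same shape as kombu's _reraise_as_library_errors: the original
            # error becomes __cause__, producing the chained traceback above.
            raise OperationalError(str(exc)) from exc

    try:
        connect_or_reraise()
    except OperationalError as err:
        print(err)                           # [Errno 111] Connection refused
        print(type(err.__cause__).__name__)  # ConnectionRefusedError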
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d872c98-db7e-4947-97fa-4d8c38218ec0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.628274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd10cce2-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': 'd779578a0092116ebe72a2d1b702f612b0a075a340a6ad7c71e87e78d095c4b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.628274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd10d700-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '13cdcedd68d71f0a0e952c7fb008983899c43426f085c6eed3b6ca53451396fb'}]}, 'timestamp': '2026-02-01 09:53:03.628799', '_unique_id': '4132d84fac9f4307bf12f8187d90d134'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
    [traceback omitted: identical to the ConnectionRefusedError / kombu.exceptions.OperationalError traceback above]
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.630 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
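The samples in the payload above are cumulative counters: disk.device.read.requests only ever grows, so a consumer derives a rate by differencing successive polls of the same resource_id. A small sketch under that assumption; the earlier sample value and the 30-second polling interval are invented for illustration, while the field names match the logged sample dicts:

    from datetime import datetime

    def per_second_rate(prev, curr):
        """Rate between two cumulative samples of the same resource."""
        elapsed = (datetime.fromisoformat(curr["timestamp"])
                   - datetime.fromisoformat(prev["timestamp"])).total_seconds()
        return (curr["counter_volume"] - prev["counter_volume"]) / elapsed

    prev = {"counter_volume": 1150, "timestamp": "2026-02-01T09:52:33.628274"}  # invented
    curr = {"counter_volume": 1272, "timestamp": "2026-02-01T09:53:03.628274"}  # from the log
    print(per_second_rate(prev, curr))  # (1272 - 1150) / 30 s ~= 4.07 requests/s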
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8316035-0e27-4fc4-bc92-44f3b87d3f6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.630345', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd111dbe-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '65cec2318f375013c4e706e7aa253549466ee2ee0461c2d9c6d2d56b51be3ac4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.630345', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd1127e6-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': 'ed6eed437684844fca1b08e654d804d7de114635abec01031f4b8dc120d1c55f'}]}, 'timestamp': '2026-02-01 09:53:03.630870', '_unique_id': '51b198e6d7fb47208fad772533a70f5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
    [traceback omitted: identical to the ConnectionRefusedError / kombu.exceptions.OperationalError traceback above]
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.633 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '087ac5a2-5bb7-48e7-a131-f7a501dcca44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.632210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd1166a2-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.826749733, 'message_signature': 'fe472153222d79f762faf0c046ce717b5a2c6b4900218cac0bcdf1ca2fe0fcc4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.632210', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd1170ac-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.826749733, 'message_signature': 'b4545180ce176c1e2da3c2eea1525111ae6d6855baff659d982ca702e9ec750f'}]}, 'timestamp': '2026-02-01 09:53:03.632736', '_unique_id': 'bbcd586713eb4ff88aefa1a2bf0e1945'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
    [traceback omitted: identical to the ConnectionRefusedError / kombu.exceptions.OperationalError traceback above]
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.634 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
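Each failed attempt also passes through kombu's retry_over_time before giving up, which is why the agent keeps polling and logging rather than crashing outright. The rough shape of such a helper, as an illustrative analogue rather than kombu's actual implementation:

    import time

    def retry_over_time(fun, catch, max_retries=3,
                        interval_start=1.0, interval_step=2.0, interval_max=30.0):
        """Call fun(), sleeping progressively longer after each caught failure."""
        interval = interval_start
        for attempt in range(max_retries + 1):
            try:
                return fun()
            except catch:
                if attempt == max_retries:
                    raise  # exhausted: caller translates or propagates the error
                time.sleep(interval)
                interval = min(interval + interval_step, interval_max)

For instance, retry_over_time(connect_or_reraise, OperationalError) would retry the connect sketch shown earlier before letting the final error escape.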
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '944ca526-6851-4beb-9a34-e2c994582f12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.634089', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd11b030-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '75c6ea69e4a2f6bec622f564934f50da591f3fcbeec07d201468ba086a09f9e0'}]}, 'timestamp': '2026-02-01 09:53:03.634382', '_unique_id': '3b0883f86e64471eaae3b6f8ae7be849'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
    [traceback omitted: identical to the ConnectionRefusedError / kombu.exceptions.OperationalError traceback above]
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
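The disk.device.write.latency counters polled here pair with the disk.device.write.requests counters from the earlier cycle: both are cumulative per device, so dividing total latency (ns) by total writes gives a rough mean latency per write. Using the values actually logged for this instance:

    # Values from the polls above: cumulative ns of write latency and
    # cumulative write requests, per device.
    vda_latency_ns, vda_writes = 1_100_747_130, 47
    vdb_latency_ns, vdb_writes = 22_673_432, 1

    print(vda_latency_ns / vda_writes / 1e6)  # ~23.4 ms mean per write on vda
    print(vdb_latency_ns / vdb_writes / 1e6)  # ~22.7 ms on vdb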
Payload={'message_id': 'eff0002b-c708-4265-82c7-9e4a905639d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:53:03.635738', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'cd11f090-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '5ae875fde30c7feeaa148fe39692826bf866f6c467e85733fa78df7edbf2957c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:53:03.635738', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'cd11fe14-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.76698693, 'message_signature': '4e895d7955b1cfe1d8c46a19c930ca51345675b1dccc390742c6ea7aaebd6748'}]}, 'timestamp': '2026-02-01 09:53:03.636506', '_unique_id': 'f5f7f2438b5b46d197136955303dd405'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:53:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.637 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
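The paired tracebacks above (ConnectionRefusedError, then "The above exception was the direct cause of the following exception:", then kombu.exceptions.OperationalError) come from kombu's _reraise_as_library_errors context manager re-raising the low-level socket failure as a library-level error with explicit chaining. A minimal sketch of the same pattern; the class and function names here are illustrative stand-ins, not kombu's real internals beyond the raise ... from form shown in the log:

    # Sketch of the exception-chaining pattern visible in the log above.
    # OperationalError stands in for kombu.exceptions.OperationalError.
    class OperationalError(Exception):
        pass

    def establish_connection():
        # Mimics amqp.transport._connect failing with errno 111.
        raise ConnectionRefusedError(111, "Connection refused")

    try:
        establish_connection()
    except ConnectionRefusedError as exc:
        # 'raise ... from exc' sets __cause__, which is why Python prints
        # both tracebacks joined by "The above exception was the direct
        # cause of the following exception:" exactly as in the log.
        raise OperationalError(str(exc)) from exc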
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69bc6ac6-cd36-438b-9e77-2e2572683b3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.637880', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd124540-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': '5bd882c78010f4689e495104d85ec0f2ef35fa13235e4602aeea88f285627fd9'}]}, 'timestamp': '2026-02-01 09:53:03.638198', '_unique_id': '11add7c1802e4589a7db508fb15007ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.638 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29e02aa5-ea0c-423d-913b-e37a548b3b3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:53:03.639467', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'cd1281fe-ff53-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11677.750283445, 'message_signature': 'f5cd76d4ec39aeb1a4ee7724e1e64837fb031616073b0c7c1843a1e5c992e94c'}]}, 'timestamp': '2026-02-01 09:53:03.639749', '_unique_id': '6f61092faa964c1ba276046b44f4fdf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.640 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:53:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:53:03.641 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:53:03 localhost nova_compute[274651]: 2026-02-01 09:53:03.905 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e102 do_prune osdmap full prune enabled
Feb 1 04:53:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e103 e103: 6 total, 6 up, 6 in
Feb 1 04:53:05 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in
Feb 1 04:53:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:53:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:53:05 localhost systemd[1]: tmp-crun.9nvj7a.mount: Deactivated successfully.
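Every notification in this polling cycle fails identically: the innermost frame is self.sock.connect(sa) in amqp/transport.py raising errno 111, meaning the TCP connection to the RabbitMQ transport is refused before any AMQP handshake begins. A bare socket probe distinguishes a refusal (nothing listening) from a firewall drop (timeout). The host and port below are assumptions, since the agent's transport_url is not visible in this excerpt; 5672 is only RabbitMQ's default:

    # Minimal reachability probe for the AMQP endpoint that the agent's
    # oslo.messaging driver cannot reach. BROKER is hypothetical; take the
    # real value from transport_url in ceilometer.conf.
    import socket

    BROKER = ("controller.example.com", 5672)

    try:
        with socket.create_connection(BROKER, timeout=5):
            print(f"TCP connect to {BROKER[0]}:{BROKER[1]} succeeded")
    except ConnectionRefusedError:
        # Errno 111: the host is routable but nothing listens on the port,
        # matching the self.sock.connect(sa) failure in the log.
        print("connection refused -- broker not listening (errno 111)")
    except socket.timeout:
        print("timed out -- likely a firewall drop rather than a refusal")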
Feb 1 04:53:05 localhost podman[306797]: 2026-02-01 09:53:05.73643533 +0000 UTC m=+0.076645624 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:53:05 localhost podman[306798]: 2026-02-01 09:53:05.707064635 +0000 UTC m=+0.045956679 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:53:05 localhost podman[306797]: 2026-02-01 09:53:05.772326967 +0000 UTC m=+0.112537221 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:53:05 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:53:05 localhost podman[306798]: 2026-02-01 09:53:05.786132123 +0000 UTC m=+0.125024247 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:53:05 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
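The systemd "Started /usr/bin/podman healthcheck run <id>" / "<id>.service: Deactivated successfully" pairs above are transient units that execute each container's configured healthcheck (the 'test' key in config_data); "exec_died" marks the end of the exec session, not a failure. The same check can be run by hand. The container name below is taken from the log, and the exit-code meaning follows podman's convention (0 = healthy):

    # Re-run a container healthcheck manually, as the transient systemd
    # units in this log do. Assumes the container is still running here.
    import subprocess

    result = subprocess.run(
        ["podman", "healthcheck", "run", "ovn_controller"],
        capture_output=True, text=True,
    )
    print("healthy" if result.returncode == 0
          else f"unhealthy: {result.stdout or result.stderr}")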
Feb 1 04:53:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e103 do_prune osdmap full prune enabled
Feb 1 04:53:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e104 e104: 6 total, 6 up, 6 in
Feb 1 04:53:06 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in
Feb 1 04:53:08 localhost nova_compute[274651]: 2026-02-01 09:53:08.906 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:08 localhost nova_compute[274651]: 2026-02-01 09:53:08.908 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:08 localhost nova_compute[274651]: 2026-02-01 09:53:08.909 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:53:08 localhost nova_compute[274651]: 2026-02-01 09:53:08.909 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:08 localhost nova_compute[274651]: 2026-02-01 09:53:08.943 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:08 localhost nova_compute[274651]: 2026-02-01 09:53:08.944 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e104 do_prune osdmap full prune enabled
Feb 1 04:53:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e105 e105: 6 total, 6 up, 6 in
Feb 1 04:53:11 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in
Feb 1 04:53:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:53:12 localhost podman[306846]: 2026-02-01 09:53:12.705802075 +0000 UTC m=+0.065735288 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=)
Feb 1 04:53:12 localhost podman[306846]: 2026-02-01 09:53:12.742644331 +0000 UTC m=+0.102577504 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 1 04:53:12 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
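The recurring nova_compute/ovsdbapp lines in this stretch ("idle 500x ms, sending inactivity probe", then IDLE -> ACTIVE) are the OVSDB client's keepalive against ovsdb-server on tcp:127.0.0.1:6640: after roughly five seconds of silence it sends a JSON-RPC echo and returns to ACTIVE once the reply arrives. A raw one-shot sketch of that probe; it ignores message framing and partial reads, and assumes ovsdb-server is listening on the address the log shows:

    # Hand-rolled OVSDB inactivity probe: a JSON-RPC "echo" (RFC 7047)
    # against the endpoint from the log. Real clients handle framing and
    # partial reads; this sketch does not.
    import json, socket

    with socket.create_connection(("127.0.0.1", 6640), timeout=5) as sock:
        sock.sendall(json.dumps(
            {"method": "echo", "params": [], "id": "probe"}).encode())
        reply = json.loads(sock.recv(4096))
        print(reply)  # expect an echo reply carrying the same "id"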
Feb 1 04:53:13 localhost nova_compute[274651]: 2026-02-01 09:53:13.945 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:13 localhost nova_compute[274651]: 2026-02-01 09:53:13.946 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:13 localhost nova_compute[274651]: 2026-02-01 09:53:13.946 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:53:13 localhost nova_compute[274651]: 2026-02-01 09:53:13.946 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:13 localhost nova_compute[274651]: 2026-02-01 09:53:13.991 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:13 localhost nova_compute[274651]: 2026-02-01 09:53:13.992 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:16 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:16.049 2 INFO neutron.agent.securitygroups_rpc [None req-fe72f4fd-5cc1-4afa-94a0-35085a503c7b 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']#033[00m
Feb 1 04:53:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e105 do_prune osdmap full prune enabled
Feb 1 04:53:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e106 e106: 6 total, 6 up, 6 in
Feb 1 04:53:16 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e106: 6 total, 6 up, 6 in
Feb 1 04:53:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:17.162 2 INFO neutron.agent.securitygroups_rpc [None req-847588ff-1f30-46d7-9f2d-cc2e866fd5e9 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']#033[00m
Feb 1 04:53:18 localhost nova_compute[274651]: 2026-02-01 09:53:18.993 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:18 localhost nova_compute[274651]: 2026-02-01 09:53:18.995 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:18 localhost nova_compute[274651]: 2026-02-01 09:53:18.995 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:53:18 localhost nova_compute[274651]: 2026-02-01 09:53:18.995 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
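Throughout this window, ceph-mon is advancing and pruning the osdmap (epochs e102 through e107, consistently "6 total, 6 up, 6 in"), which reads as routine map churn rather than an error. The same counts can be cross-checked from any host with a ceph CLI; the availability of the CLI and an admin keyring here is an assumption, and the JSON field names can vary between Ceph releases:

    # Cross-check the osdmap counts that ceph-mon logs ("6 total, 6 up,
    # 6 in"). Assumes the ceph CLI and a usable keyring on this host.
    import json, subprocess

    raw = subprocess.run(["ceph", "osd", "stat", "--format", "json"],
                         capture_output=True, text=True, check=True).stdout
    stat = json.loads(raw)
    print(f"{stat.get('num_osds')} total, {stat.get('num_up_osds')} up, "
          f"{stat.get('num_in_osds')} in")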
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:19 localhost nova_compute[274651]: 2026-02-01 09:53:19.028 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:53:19 localhost ovn_controller[152492]: 2026-02-01T09:53:19Z|00099|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:53:19 localhost nova_compute[274651]: 2026-02-01 09:53:19.620 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:53:19 localhost podman[306866]: 2026-02-01 09:53:19.722554494 +0000 UTC m=+0.083739513 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 1 04:53:19 localhost podman[306866]: 2026-02-01 09:53:19.737402632 +0000 UTC m=+0.098587631 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute) Feb 1 04:53:19 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:53:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:53:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e106 do_prune osdmap full prune enabled Feb 1 04:53:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e107 e107: 6 total, 6 up, 6 in Feb 1 04:53:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in Feb 1 04:53:21 localhost dnsmasq[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/addn_hosts - 0 addresses Feb 1 04:53:21 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/host Feb 1 04:53:21 localhost dnsmasq-dhcp[306140]: read /var/lib/neutron/dhcp/f1c9407b-b174-4a64-ba99-055b15628d3f/opts Feb 1 04:53:21 localhost podman[306904]: 2026-02-01 09:53:21.637151645 +0000 UTC m=+0.067190223 container kill 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:53:21 localhost nova_compute[274651]: 2026-02-01 09:53:21.791 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:21 localhost kernel: device tap377a7a10-80 left promiscuous mode Feb 1 04:53:21 localhost ovn_controller[152492]: 2026-02-01T09:53:21Z|00100|binding|INFO|Releasing lport 377a7a10-8059-48da-a5cf-af24631e9999 from this chassis (sb_readonly=0) Feb 1 04:53:21 localhost ovn_controller[152492]: 2026-02-01T09:53:21Z|00101|binding|INFO|Setting lport 
377a7a10-8059-48da-a5cf-af24631e9999 down in Southbound Feb 1 04:53:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:21.812 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-f1c9407b-b174-4a64-ba99-055b15628d3f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1c9407b-b174-4a64-ba99-055b15628d3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ef9394e0b21548a491d64bf76f5f6368', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2de141c9-8cc9-4b0e-ba0d-113cc86928d7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=377a7a10-8059-48da-a5cf-af24631e9999) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:53:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:21.814 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 377a7a10-8059-48da-a5cf-af24631e9999 in datapath f1c9407b-b174-4a64-ba99-055b15628d3f unbound from our chassis#033[00m Feb 1 04:53:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:21.816 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1c9407b-b174-4a64-ba99-055b15628d3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:53:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:21.845 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf6276b-6227-49d6-b7dc-db2fd812aa92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:53:21 localhost nova_compute[274651]: 2026-02-01 09:53:21.846 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:22 localhost ovn_controller[152492]: 2026-02-01T09:53:22Z|00102|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:53:22 localhost nova_compute[274651]: 2026-02-01 09:53:22.919 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:23 localhost dnsmasq[306140]: exiting on receipt of SIGTERM Feb 1 04:53:23 localhost podman[306947]: 2026-02-01 09:53:23.332916279 +0000 UTC m=+0.049215639 container kill 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:53:23 localhost systemd[1]: libpod-431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c.scope: Deactivated successfully. Feb 1 04:53:23 localhost podman[306963]: 2026-02-01 09:53:23.39619745 +0000 UTC m=+0.044839194 container died 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:53:23 localhost systemd[1]: tmp-crun.sqoYjp.mount: Deactivated successfully. Feb 1 04:53:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c-userdata-shm.mount: Deactivated successfully. Feb 1 04:53:23 localhost podman[306963]: 2026-02-01 09:53:23.446538023 +0000 UTC m=+0.095179767 container remove 431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f1c9407b-b174-4a64-ba99-055b15628d3f, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:53:23 localhost systemd[1]: libpod-conmon-431293f899a58de7fa4b3116642076c42b3cd30d6e3eee5e702723db27723d6c.scope: Deactivated successfully. Feb 1 04:53:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:23.469 259320 INFO neutron.agent.dhcp.agent [None req-ddbf872d-0a75-4c0d-b676-c6ef420d9d0e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:53:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:23.713 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:53:23 localhost podman[236886]: time="2026-02-01T09:53:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:53:23 localhost podman[236886]: @ - - [01/Feb/2026:09:53:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:53:24 localhost podman[236886]: @ - - [01/Feb/2026:09:53:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18815 "" "Go-http-client/1.1" Feb 1 04:53:24 localhost nova_compute[274651]: 2026-02-01 09:53:24.075 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:53:24 localhost systemd[1]: var-lib-containers-storage-overlay-6401b23b324e1ba0152b3e13ee96bde9897fa8619d4a073063f4820d9dc3d43e-merged.mount: Deactivated successfully. 
Feb 1 04:53:24 localhost systemd[1]: run-netns-qdhcp\x2df1c9407b\x2db174\x2d4a64\x2dba99\x2d055b15628d3f.mount: Deactivated successfully.
Feb 1 04:53:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:27.812 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:53:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:27.814 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb 1 04:53:27 localhost nova_compute[274651]: 2026-02-01 09:53:27.849 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:28.193 259320 INFO neutron.agent.linux.ip_lib [None req-7e72c658-e6f4-436e-b676-2d902c69b97f - - - - - -] Device tap3b3ba607-44 cannot be used as it has no MAC address#033[00m
Feb 1 04:53:28 localhost nova_compute[274651]: 2026-02-01 09:53:28.216 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:28 localhost kernel: device tap3b3ba607-44 entered promiscuous mode
Feb 1 04:53:28 localhost NetworkManager[5964]: [1769939608.2241] manager: (tap3b3ba607-44): new Generic device (/org/freedesktop/NetworkManager/Devices/23)
Feb 1 04:53:28 localhost nova_compute[274651]: 2026-02-01 09:53:28.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:28 localhost ovn_controller[152492]: 2026-02-01T09:53:28Z|00103|binding|INFO|Claiming lport 3b3ba607-4476-4cd0-86fb-5280cf884987 for this chassis.
Feb 1 04:53:28 localhost ovn_controller[152492]: 2026-02-01T09:53:28Z|00104|binding|INFO|3b3ba607-4476-4cd0-86fb-5280cf884987: Claiming unknown
Feb 1 04:53:28 localhost systemd-udevd[306998]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:53:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:28.241 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-0f1d21e4-c85f-4885-86ad-e1a93176eb36', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f1d21e4-c85f-4885-86ad-e1a93176eb36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e44a50a3d96541748629cacff5ef78b0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8afb06df-f932-4c73-800b-945adb3b0b6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3b3ba607-4476-4cd0-86fb-5280cf884987) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:53:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:28.245 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 3b3ba607-4476-4cd0-86fb-5280cf884987 in datapath 0f1d21e4-c85f-4885-86ad-e1a93176eb36 bound to our chassis#033[00m
Feb 1 04:53:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:28.247 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 91a6d9f4-764b-42c7-a64f-51a3c0c451e9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 1 04:53:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:28.248 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f1d21e4-c85f-4885-86ad-e1a93176eb36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:53:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:28.249 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[870eb5e1-6a0e-440a-a2cd-57df79121200]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost ovn_controller[152492]: 2026-02-01T09:53:28Z|00105|binding|INFO|Setting lport 3b3ba607-4476-4cd0-86fb-5280cf884987 ovn-installed in OVS
Feb 1 04:53:28 localhost ovn_controller[152492]: 2026-02-01T09:53:28Z|00106|binding|INFO|Setting lport 3b3ba607-4476-4cd0-86fb-5280cf884987 up in Southbound
Feb 1 04:53:28 localhost nova_compute[274651]: 2026-02-01 09:53:28.261 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost journal[217584]: ethtool ioctl error on tap3b3ba607-44: No such device
Feb 1 04:53:28 localhost nova_compute[274651]: 2026-02-01 09:53:28.296 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:28 localhost nova_compute[274651]: 2026-02-01 09:53:28.322 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:28 localhost nova_compute[274651]: 2026-02-01 09:53:28.949 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:29 localhost podman[307070]:
Feb 1 04:53:29 localhost podman[307070]: 2026-02-01 09:53:29.056191101 +0000 UTC m=+0.072740004 container create 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:53:29 localhost nova_compute[274651]: 2026-02-01 09:53:29.078 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:53:29 localhost systemd[1]: Started libpod-conmon-86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22.scope.
Feb 1 04:53:29 localhost systemd[1]: Started libcrun container.
Feb 1 04:53:29 localhost podman[307070]: 2026-02-01 09:53:29.01661805 +0000 UTC m=+0.033167033 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:53:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe75a22cad3fe552e973adc0face95782e7e1a5046c3b96376534b090df139df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:53:29 localhost podman[307070]: 2026-02-01 09:53:29.132129343 +0000 UTC m=+0.148678256 container init 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 1 04:53:29 localhost podman[307070]: 2026-02-01 09:53:29.143763311 +0000 UTC m=+0.160312214 container start 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:53:29 localhost dnsmasq[307100]: started, version 2.85 cachesize 150
Feb 1 04:53:29 localhost dnsmasq[307100]: DNS service limited to local subnets
Feb 1 04:53:29 localhost dnsmasq[307100]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:53:29 localhost dnsmasq[307100]: warning: no upstream servers configured
Feb 1 04:53:29 localhost dnsmasq-dhcp[307100]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:53:29 localhost dnsmasq[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/addn_hosts - 0 addresses
Feb 1 04:53:29 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/host
Feb 1 04:53:29 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/opts
Feb 1 04:53:29 localhost podman[307085]: 2026-02-01 09:53:29.185743936 +0000 UTC m=+0.081485254 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:53:29 localhost podman[307085]: 2026-02-01 09:53:29.22642378 +0000 UTC m=+0.122165048 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 1 04:53:29 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:53:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:29.370 259320 INFO neutron.agent.dhcp.agent [None req-b9dd964f-4e65-4dce-80c1-d6d78380c7e2 - - - - - -] DHCP configuration for ports {'b4361a18-966c-4c27-a849-280c6a4cba5f'} is completed#033[00m
Feb 1 04:53:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:29.858 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:53:29Z, description=, device_id=80ec8b40-dd28-494e-b967-9cf1f3ca7f06, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=72fe8d00-4921-47d0-a06b-17d45e8c7b5d, ip_allocation=immediate, mac_address=fa:16:3e:e4:3d:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:53:26Z, description=, dns_domain=, id=0f1d21e4-c85f-4885-86ad-e1a93176eb36, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-1833787910-network, port_security_enabled=True, project_id=e44a50a3d96541748629cacff5ef78b0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=731, status=ACTIVE, subnets=['861f5560-d250-4a0d-aa95-1bffc1906f0c'], tags=[], tenant_id=e44a50a3d96541748629cacff5ef78b0, updated_at=2026-02-01T09:53:26Z, vlan_transparent=None, network_id=0f1d21e4-c85f-4885-86ad-e1a93176eb36, port_security_enabled=False, project_id=e44a50a3d96541748629cacff5ef78b0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=739, status=DOWN, tags=[], tenant_id=e44a50a3d96541748629cacff5ef78b0, updated_at=2026-02-01T09:53:29Z on network 0f1d21e4-c85f-4885-86ad-e1a93176eb36#033[00m
Feb 1 04:53:30 localhost systemd[1]: tmp-crun.hXeqwV.mount: Deactivated successfully.
Feb 1 04:53:30 localhost dnsmasq[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/addn_hosts - 1 addresses
Feb 1 04:53:30 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/host
Feb 1 04:53:30 localhost podman[307127]: 2026-02-01 09:53:30.234222978 +0000 UTC m=+0.070682140 container kill 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:53:30 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/opts
Feb 1 04:53:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:30.507 259320 INFO neutron.agent.dhcp.agent [None req-1047a3df-27df-4e07-9956-2e4c099d0f6c - - - - - -] DHCP configuration for ports {'72fe8d00-4921-47d0-a06b-17d45e8c7b5d'} is completed#033[00m
Feb 1 04:53:31 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:31.259 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:53:29Z, description=, device_id=80ec8b40-dd28-494e-b967-9cf1f3ca7f06, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=72fe8d00-4921-47d0-a06b-17d45e8c7b5d, ip_allocation=immediate, mac_address=fa:16:3e:e4:3d:24, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:53:26Z, description=, dns_domain=, id=0f1d21e4-c85f-4885-86ad-e1a93176eb36, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-1833787910-network, port_security_enabled=True, project_id=e44a50a3d96541748629cacff5ef78b0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=33204, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=731, status=ACTIVE, subnets=['861f5560-d250-4a0d-aa95-1bffc1906f0c'], tags=[], tenant_id=e44a50a3d96541748629cacff5ef78b0, updated_at=2026-02-01T09:53:26Z, vlan_transparent=None, network_id=0f1d21e4-c85f-4885-86ad-e1a93176eb36, port_security_enabled=False, project_id=e44a50a3d96541748629cacff5ef78b0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=739, status=DOWN, tags=[], tenant_id=e44a50a3d96541748629cacff5ef78b0, updated_at=2026-02-01T09:53:29Z on network 0f1d21e4-c85f-4885-86ad-e1a93176eb36#033[00m
Feb 1 04:53:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e107 do_prune osdmap full prune enabled
Feb 1 04:53:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e108 e108: 6 total, 6 up, 6 in
Feb 1 04:53:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in
Feb 1 04:53:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:31 localhost openstack_network_exporter[239441]: ERROR 09:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:53:31 localhost openstack_network_exporter[239441]:
Feb 1 04:53:31 localhost openstack_network_exporter[239441]: ERROR 09:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:53:31 localhost openstack_network_exporter[239441]:
Feb 1 04:53:31 localhost podman[307164]: 2026-02-01 09:53:31.521251136 +0000 UTC m=+0.068659808 container kill 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:53:31 localhost dnsmasq[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/addn_hosts - 1 addresses
Feb 1 04:53:31 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/host
Feb 1 04:53:31 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/opts
Feb 1 04:53:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:31.816 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 1 04:53:31 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:31.967 259320 INFO neutron.agent.dhcp.agent [None req-4166d453-f013-4aff-b9e7-16febc72289b - - - - - -] DHCP configuration for ports {'72fe8d00-4921-47d0-a06b-17d45e8c7b5d'} is completed#033[00m
Feb 1 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:53:32 localhost systemd[1]: tmp-crun.Yp2YBS.mount: Deactivated successfully.
Feb 1 04:53:32 localhost podman[307201]: 2026-02-01 09:53:32.283524674 +0000 UTC m=+0.093336870 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:53:32 localhost podman[307201]: 2026-02-01 09:53:32.314324943 +0000 UTC m=+0.124137139 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Feb 1 04:53:32 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:53:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:53:33 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:53:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e108 do_prune osdmap full prune enabled
Feb 1 04:53:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:53:33 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:53:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e109 e109: 6 total, 6 up, 6 in
Feb 1 04:53:33 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in
Feb 1 04:53:34 localhost nova_compute[274651]: 2026-02-01 09:53:34.080 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:34 localhost nova_compute[274651]: 2026-02-01 09:53:34.082 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:34 localhost nova_compute[274651]: 2026-02-01 09:53:34.082 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:53:34 localhost nova_compute[274651]: 2026-02-01 09:53:34.082 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:34 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:34.092 2 INFO neutron.agent.securitygroups_rpc [req-75bc0aa1-37ef-492b-a96a-ae9080ee75e0 req-edda2748-5bcc-42c3-8a62-8fe3b52553b6 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['f0c61cda-1998-487f-b5b2-ae9c4848f56a']#033[00m
Feb 1 04:53:34 localhost nova_compute[274651]: 2026-02-01 09:53:34.101 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:34 localhost nova_compute[274651]: 2026-02-01 09:53:34.103 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:53:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2427119556' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:53:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:53:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2427119556' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:53:34 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:34.830 2 INFO neutron.agent.securitygroups_rpc [req-82289f6f-42d9-438b-b261-05589eed2efe req-98f019aa-b49d-4aef-94dc-ee96ed3719e9 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['ada4c3f2-cdfe-4dd3-85f7-4e743664f11d']#033[00m
Feb 1 04:53:35 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:35.782 2 INFO neutron.agent.securitygroups_rpc [req-91571c20-84f3-4df1-9546-4b115d3d0f93 req-0f489c88-5b53-4bd7-842f-0fffa7ebc222 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['879f68ae-8832-4697-b764-9db0f8c3108c']#033[00m
Feb 1 04:53:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:53:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:53:36 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:36.618 2 INFO neutron.agent.securitygroups_rpc [req-876e83f6-6b17-43eb-b040-065589623e5f req-6a5cb928-de96-4098-a70d-23e5abe4d6ce dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['98cb19e2-acc2-4297-8b83-10025f09d04b']#033[00m
Feb 1 04:53:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:53:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:53:36 localhost podman[307288]: 2026-02-01 09:53:36.746259514 +0000 UTC m=+0.088348096 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:53:36 localhost podman[307288]: 2026-02-01 09:53:36.760819513 +0000 UTC m=+0.102908105 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:53:36 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:53:36 localhost podman[307289]: 2026-02-01 09:53:36.839114358 +0000 UTC m=+0.176199915 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 1 04:53:36 localhost podman[307289]: 2026-02-01 09:53:36.873356593 +0000 UTC m=+0.210442150 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Feb 1 04:53:36 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:53:37 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:53:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:37.589 2 INFO neutron.agent.securitygroups_rpc [req-92fefcaa-7c6d-4e1a-a20a-67097b188e7e req-0e7bfb69-2aec-4c7e-b452-953e89b3814f dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['374381c7-702b-4257-92ff-7af171862681']#033[00m
Feb 1 04:53:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:37.762 2 INFO neutron.agent.securitygroups_rpc [req-2c86665d-931d-4ec0-bcba-f7d3083dc82f req-7b6541b5-97df-453e-9e47-7bd01fb85ab0 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['374381c7-702b-4257-92ff-7af171862681']#033[00m
Feb 1 04:53:38 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:38.105 2 INFO neutron.agent.securitygroups_rpc [req-eb564804-d725-4326-adb0-58732ff445a4 req-615b10cf-445f-485b-a611-28146c91f72c dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['374381c7-702b-4257-92ff-7af171862681']#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.103 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.105 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.106 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.106 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.142 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.143 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:53:39 localhost nova_compute[274651]: 2026-02-01 09:53:39.145 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:41 localhost dnsmasq[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/addn_hosts - 0 addresses
Feb 1 04:53:41 localhost podman[307351]: 2026-02-01 09:53:41.103645485 +0000 UTC m=+0.067740639 container kill 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:53:41 localhost systemd[1]: tmp-crun.HpDXgh.mount: Deactivated successfully.
Feb 1 04:53:41 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/host
Feb 1 04:53:41 localhost dnsmasq-dhcp[307100]: read /var/lib/neutron/dhcp/0f1d21e4-c85f-4885-86ad-e1a93176eb36/opts
Feb 1 04:53:41 localhost nova_compute[274651]: 2026-02-01 09:53:41.281 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:41 localhost ovn_controller[152492]: 2026-02-01T09:53:41Z|00107|binding|INFO|Releasing lport 3b3ba607-4476-4cd0-86fb-5280cf884987 from this chassis (sb_readonly=0)
Feb 1 04:53:41 localhost ovn_controller[152492]: 2026-02-01T09:53:41Z|00108|binding|INFO|Setting lport 3b3ba607-4476-4cd0-86fb-5280cf884987 down in Southbound
Feb 1 04:53:41 localhost kernel: device tap3b3ba607-44 left promiscuous mode
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.298 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-0f1d21e4-c85f-4885-86ad-e1a93176eb36', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f1d21e4-c85f-4885-86ad-e1a93176eb36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e44a50a3d96541748629cacff5ef78b0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8afb06df-f932-4c73-800b-945adb3b0b6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3b3ba607-4476-4cd0-86fb-5280cf884987) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.300 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 3b3ba607-4476-4cd0-86fb-5280cf884987 in datapath 0f1d21e4-c85f-4885-86ad-e1a93176eb36 unbound from our chassis#033[00m
Feb 1 04:53:41 localhost nova_compute[274651]: 2026-02-01 09:53:41.302 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.303 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f1d21e4-c85f-4885-86ad-e1a93176eb36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.304 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f4e8ef-b3ff-41ec-bd88-8c034ede4f97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:53:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e109 do_prune osdmap full prune enabled
Feb 1 04:53:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e110 e110: 6 total, 6 up, 6 in
Feb 1 04:53:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.716 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.716 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:53:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:53:41.717 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:53:42 localhost nova_compute[274651]: 2026-02-01 09:53:42.139 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:42 localhost ovn_controller[152492]: 2026-02-01T09:53:42Z|00109|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:53:42 localhost nova_compute[274651]: 2026-02-01 09:53:42.786 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:53:43 localhost podman[307390]: 2026-02-01 09:53:43.640735264 +0000 UTC m=+0.064357646 container kill 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:53:43 localhost dnsmasq[307100]: exiting on receipt of SIGTERM
Feb 1 04:53:43 localhost systemd[1]: libpod-86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22.scope: Deactivated successfully.
Feb 1 04:53:43 localhost podman[307411]: 2026-02-01 09:53:43.742424959 +0000 UTC m=+0.076076646 container died 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:53:43 localhost podman[307403]: 2026-02-01 09:53:43.754067958 +0000 UTC m=+0.102206622 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.tags=minimal rhel9, vcs-type=git, release=1769056855, name=ubi9/ubi-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 1 04:53:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22-userdata-shm.mount: Deactivated successfully.
Feb 1 04:53:43 localhost podman[307403]: 2026-02-01 09:53:43.765381137 +0000 UTC m=+0.113519831 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public)
Feb 1 04:53:43 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:53:43 localhost podman[307411]: 2026-02-01 09:53:43.832357673 +0000 UTC m=+0.166009330 container remove 86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f1d21e4-c85f-4885-86ad-e1a93176eb36, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:53:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:43.861 259320 INFO neutron.agent.dhcp.agent [None req-83b0617e-0b21-46ef-910e-7f8d8368d788 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:53:43 localhost systemd[1]: libpod-conmon-86a033b665f937a2d1036024d72b3ec9a91c7448b7edb417986d86310bd3cb22.scope: Deactivated successfully.
Feb 1 04:53:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:44.185 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:53:44 localhost nova_compute[274651]: 2026-02-01 09:53:44.187 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:44 localhost systemd[1]: var-lib-containers-storage-overlay-fe75a22cad3fe552e973adc0face95782e7e1a5046c3b96376534b090df139df-merged.mount: Deactivated successfully.
Feb 1 04:53:44 localhost systemd[1]: run-netns-qdhcp\x2d0f1d21e4\x2dc85f\x2d4885\x2d86ad\x2de1a93176eb36.mount: Deactivated successfully.
Feb 1 04:53:45 localhost nova_compute[274651]: 2026-02-01 09:53:45.446 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:48 localhost ovn_controller[152492]: 2026-02-01T09:53:48Z|00110|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:53:48 localhost nova_compute[274651]: 2026-02-01 09:53:48.294 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:49 localhost nova_compute[274651]: 2026-02-01 09:53:49.220 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:50 localhost nova_compute[274651]: 2026-02-01 09:53:50.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:50 localhost nova_compute[274651]: 2026-02-01 09:53:50.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 04:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:53:50 localhost podman[307446]: 2026-02-01 09:53:50.727337687 +0000 UTC m=+0.084336861 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 1 04:53:50 localhost podman[307446]: 2026-02-01 09:53:50.741397391 +0000 UTC m=+0.098396565 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:53:50 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:53:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:50.797 2 INFO neutron.agent.securitygroups_rpc [None req-200cf6df-4bba-4fb6-b3b8-7b487bc0871d 3ef0026b934441b28e0635d7a99bc592 d1284af7476748758a037c2a7d34b7a2 - - default default] Security group member updated ['02728618-05ed-4a37-93a2-59fcc09c3239']#033[00m
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 04:53:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.473 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.474 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.474 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 04:53:51 localhost nova_compute[274651]: 2026-02-01 09:53:51.475 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 04:53:51 localhost neutron_sriov_agent[252126]: 2026-02-01 09:53:51.493 2 INFO neutron.agent.securitygroups_rpc [None req-07f1805c-f1e5-49eb-9bf2-554d43f01479 3ef0026b934441b28e0635d7a99bc592 d1284af7476748758a037c2a7d34b7a2 - - default default] Security group member updated ['02728618-05ed-4a37-93a2-59fcc09c3239']#033[00m
Feb 1 04:53:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:51.510 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:53:52 localhost nova_compute[274651]: 2026-02-01 09:53:52.431 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 04:53:52 localhost nova_compute[274651]: 2026-02-01 09:53:52.454 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 04:53:52 localhost nova_compute[274651]: 2026-02-01 09:53:52.455 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 04:53:52 localhost nova_compute[274651]: 2026-02-01 09:53:52.456 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:52 localhost nova_compute[274651]: 2026-02-01 09:53:52.457 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:52 localhost nova_compute[274651]: 2026-02-01 09:53:52.457 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:53 localhost nova_compute[274651]: 2026-02-01 09:53:53.451 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:53 localhost nova_compute[274651]: 2026-02-01 09:53:53.454 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e110 do_prune osdmap full prune enabled
Feb 1 04:53:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e111 e111: 6 total, 6 up, 6 in
Feb 1 04:53:53 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in
Feb 1 04:53:53 localhost podman[236886]: time="2026-02-01T09:53:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:53:53 localhost podman[236886]: @ - - [01/Feb/2026:09:53:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:53:54 localhost podman[236886]: @ - - [01/Feb/2026:09:53:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18823 "" "Go-http-client/1.1"
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.265 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.276 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.293 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.311 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.312 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.312 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.312 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.313 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:53:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:53:54 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3002027950' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.754 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.809 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.809 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.975 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.976 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11332MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.976 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 04:53:54 localhost nova_compute[274651]: 2026-02-01 09:53:54.977 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 04:53:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:54.994 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.063 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.064 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.064 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.104 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 04:53:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:53:55 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3808040281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.548 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.556 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.578 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.625 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 1 04:53:55 localhost nova_compute[274651]: 2026-02-01 09:53:55.626 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 04:53:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:53:55.860 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:53:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:53:56 localhost nova_compute[274651]: 2026-02-01 09:53:56.604 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:56 localhost nova_compute[274651]: 2026-02-01 09:53:56.605 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 04:53:57 localhost nova_compute[274651]: 2026-02-01 09:53:57.285 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e111 do_prune osdmap full prune enabled
Feb 1 04:53:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e112 e112: 6 total, 6 up, 6 in
Feb 1 04:53:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in
Feb 1 04:53:59 localhost nova_compute[274651]: 2026-02-01 09:53:59.307 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:53:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:53:59 localhost podman[307511]: 2026-02-01 09:53:59.735880728 +0000 UTC m=+0.092836384 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:53:59 localhost podman[307511]: 2026-02-01 09:53:59.74567367 +0000 UTC m=+0.102629306 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:53:59 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:54:00 localhost ovn_controller[152492]: 2026-02-01T09:54:00Z|00111|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:00 localhost nova_compute[274651]: 2026-02-01 09:54:00.058 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:01 localhost nova_compute[274651]: 2026-02-01 09:54:01.219 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:01 localhost openstack_network_exporter[239441]: ERROR 09:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:54:01 localhost openstack_network_exporter[239441]:
Feb 1 04:54:01 localhost openstack_network_exporter[239441]: ERROR 09:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:54:01 localhost openstack_network_exporter[239441]:
Feb 1 04:54:01 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:01.659 259320 INFO neutron.agent.linux.ip_lib [None req-cdc153a6-485e-48e0-b82f-26c10d160418 - - - - - -] Device tap821a8db1-f4 cannot be used as it has no MAC address#033[00m
Feb 1 04:54:01 localhost nova_compute[274651]: 2026-02-01 09:54:01.680 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:01 localhost kernel: device tap821a8db1-f4 entered promiscuous mode
Feb 1 04:54:01 localhost NetworkManager[5964]: [1769939641.6901] manager: (tap821a8db1-f4): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Feb 1 04:54:01 localhost nova_compute[274651]: 2026-02-01 09:54:01.690 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:01 localhost ovn_controller[152492]: 2026-02-01T09:54:01Z|00112|binding|INFO|Claiming lport 821a8db1-f47d-477c-9de9-2bfb1d305a52 for this chassis.
Feb 1 04:54:01 localhost ovn_controller[152492]: 2026-02-01T09:54:01Z|00113|binding|INFO|821a8db1-f47d-477c-9de9-2bfb1d305a52: Claiming unknown
Feb 1 04:54:01 localhost systemd-udevd[307544]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:54:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:01.706 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-f09ff8e4-7935-4b27-a064-f09df44d21eb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f09ff8e4-7935-4b27-a064-f09df44d21eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0194caf1b6343f4859fdcc75c872cf3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9d89f4f-557d-47b9-b228-682369fce5c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=821a8db1-f47d-477c-9de9-2bfb1d305a52) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:54:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:01.708 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 821a8db1-f47d-477c-9de9-2bfb1d305a52 in datapath f09ff8e4-7935-4b27-a064-f09df44d21eb bound to our chassis#033[00m
Feb 1 04:54:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:01.710 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f09ff8e4-7935-4b27-a064-f09df44d21eb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:54:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:01.714 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d68e46ae-c0e1-481c-b792-487762c00817]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost ovn_controller[152492]: 2026-02-01T09:54:01Z|00114|binding|INFO|Setting lport 821a8db1-f47d-477c-9de9-2bfb1d305a52 ovn-installed in OVS
Feb 1 04:54:01 localhost ovn_controller[152492]: 2026-02-01T09:54:01Z|00115|binding|INFO|Setting lport 821a8db1-f47d-477c-9de9-2bfb1d305a52 up in Southbound
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost nova_compute[274651]: 2026-02-01 09:54:01.725 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost journal[217584]: ethtool ioctl error on tap821a8db1-f4: No such device
Feb 1 04:54:01 localhost nova_compute[274651]: 2026-02-01 09:54:01.757 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:01 localhost nova_compute[274651]: 2026-02-01 09:54:01.781 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:54:02 localhost podman[307615]:
Feb 1 04:54:02 localhost podman[307615]: 2026-02-01 09:54:02.663902051 +0000 UTC m=+0.081464933 container create 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 1 04:54:02 localhost podman[307615]: 2026-02-01 09:54:02.625381704 +0000 UTC m=+0.042944516 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:54:02 localhost systemd[1]: Started libpod-conmon-26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206.scope.
Feb 1 04:54:02 localhost podman[307628]: 2026-02-01 09:54:02.74395677 +0000 UTC m=+0.102448641 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 1 04:54:02 localhost systemd[1]: Started libcrun container.
Feb 1 04:54:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e02ff2903735248fe447cf43dddd9e6d237884824be2e4ad83381a8f0055c086/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:54:02 localhost podman[307628]: 2026-02-01 09:54:02.773917414 +0000 UTC m=+0.132409285 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 1 04:54:02 localhost podman[307615]: 2026-02-01 09:54:02.781706724 +0000 UTC m=+0.199269566 container init 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:54:02 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:54:02 localhost podman[307615]: 2026-02-01 09:54:02.790748573 +0000 UTC m=+0.208311465 container start 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:54:02 localhost dnsmasq[307652]: started, version 2.85 cachesize 150
Feb 1 04:54:02 localhost dnsmasq[307652]: DNS service limited to local subnets
Feb 1 04:54:02 localhost dnsmasq[307652]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:54:02 localhost dnsmasq[307652]: warning: no upstream servers configured
Feb 1 04:54:02 localhost dnsmasq-dhcp[307652]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:54:02 localhost dnsmasq[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/addn_hosts - 0 addresses
Feb 1 04:54:02 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/host
Feb 1 04:54:02 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/opts
Feb 1 04:54:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:02.931 259320 INFO neutron.agent.dhcp.agent [None req-3b64fe1c-f603-48c4-b178-0c014a0dd713 - - - - - -] DHCP configuration for ports {'bf6baadf-b37b-4f9e-98c9-99ef62996289'} is completed#033[00m
Feb 1 04:54:03 localhost systemd[1]: tmp-crun.sD6Zp1.mount: Deactivated successfully.
Feb 1 04:54:04 localhost nova_compute[274651]: 2026-02-01 09:54:04.354 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:04 localhost nova_compute[274651]: 2026-02-01 09:54:04.542 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:06 localhost ovn_controller[152492]: 2026-02-01T09:54:06Z|00116|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:06 localhost nova_compute[274651]: 2026-02-01 09:54:06.476 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e112 do_prune osdmap full prune enabled
Feb 1 04:54:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e113 e113: 6 total, 6 up, 6 in
Feb 1 04:54:06 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in
Feb 1 04:54:06 localhost nova_compute[274651]: 2026-02-01 09:54:06.827 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:54:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:54:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:54:07 localhost podman[307653]: 2026-02-01 09:54:07.35555741 +0000 UTC m=+0.087731696 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:54:07 localhost podman[307653]: 2026-02-01 09:54:07.385837124 +0000 UTC m=+0.118011360 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors)
Feb 1 04:54:07 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
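Both podman records above embed the container's full config_data, including the node_exporter argument list. A hedged sketch reconstructing the effective command line from that dict (the flag list is copied from the log but abbreviated; the binary name node_exporter is an assumption, since the log only records the arguments):

import shlex

# Subset of the config_data logged for the node_exporter container above.
config_data = {
    "command": [
        "--web.disable-exporter-metrics",
        "--collector.systemd",
        "--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service",
        "--no-collector.dmi",
        # ... remaining --no-collector.* flags elided; see the log record above
    ],
    "ports": ["9100:9100"],
}

# Render the argv the container would run ("node_exporter" is hypothetical).
print(shlex.join(["node_exporter", *config_data["command"]]))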
Feb 1 04:54:07 localhost podman[307654]: 2026-02-01 09:54:07.459406503 +0000 UTC m=+0.186351567 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 1 04:54:07 localhost nova_compute[274651]: 2026-02-01 09:54:07.538 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:07 localhost podman[307654]: 2026-02-01 09:54:07.540236306 +0000 UTC m=+0.267181370 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:54:07 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
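podman stamps every event with a monotonic offset (m=+...) measured from its own process start, so the gap between a container's health_status and exec_died events approximates the healthcheck run time. A worked check for the ovn_controller check above, using the two offsets as logged:

# Offsets copied from the two podman[307654] events above.
health_status = 0.186351567   # m=+ offset of the health_status event
exec_died     = 0.267181370   # m=+ offset of the exec_died event

print(f"ovn_controller healthcheck took ~{(exec_died - health_status) * 1000:.0f} ms")
# -> ~81 ms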
Feb 1 04:54:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:07.856 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:07Z, description=, device_id=47e6d06a-f907-42da-b63d-1bdd83d811d3, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=36870a86-a17a-49b4-b37b-8762456453c1, ip_allocation=immediate, mac_address=fa:16:3e:dc:e0:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:00Z, description=, dns_domain=, id=f09ff8e4-7935-4b27-a064-f09df44d21eb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-2087970871-network, port_security_enabled=True, project_id=d0194caf1b6343f4859fdcc75c872cf3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4266, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=990, status=ACTIVE, subnets=['4da32f84-dd1b-4592-a77c-98256725b0da'], tags=[], tenant_id=d0194caf1b6343f4859fdcc75c872cf3, updated_at=2026-02-01T09:54:00Z, vlan_transparent=None, network_id=f09ff8e4-7935-4b27-a064-f09df44d21eb, port_security_enabled=False, project_id=d0194caf1b6343f4859fdcc75c872cf3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1050, status=DOWN, tags=[], tenant_id=d0194caf1b6343f4859fdcc75c872cf3, updated_at=2026-02-01T09:54:07Z on network f09ff8e4-7935-4b27-a064-f09df44d21eb
Feb 1 04:54:07 localhost ovn_controller[152492]: 2026-02-01T09:54:07Z|00117|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:07 localhost nova_compute[274651]: 2026-02-01 09:54:07.898 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:07.937 259320 INFO neutron.agent.linux.ip_lib [None req-267d7208-a899-4e79-8881-a0119e5f9c6f - - - - - -] Device tapac0ff582-77 cannot be used as it has no MAC address
Feb 1 04:54:07 localhost nova_compute[274651]: 2026-02-01 09:54:07.959 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:07 localhost kernel: device tapac0ff582-77 entered promiscuous mode
Feb 1 04:54:07 localhost NetworkManager[5964]: <info> [1769939647.9652] manager: (tapac0ff582-77): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Feb 1 04:54:07 localhost nova_compute[274651]: 2026-02-01 09:54:07.966 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:07 localhost ovn_controller[152492]: 2026-02-01T09:54:07Z|00118|binding|INFO|Claiming lport ac0ff582-773f-444a-b9f8-7ad2e4c9a959 for this chassis.
Feb 1 04:54:07 localhost ovn_controller[152492]: 2026-02-01T09:54:07Z|00119|binding|INFO|ac0ff582-773f-444a-b9f8-7ad2e4c9a959: Claiming unknown
Feb 1 04:54:07 localhost systemd-udevd[307726]: Network interface NamePolicy= disabled on kernel command line.
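The Trigger reload_allocations record above flattens an entire port object into key=value pairs, with the nested network= object inlined in the same syntax, so duplicate keys (created_at, project_id, revision_number, ...) are expected: the first occurrence belongs to the port, the second to the embedded network. A best-effort splitter, assuming only that fields are separated by comma-plus-space as shown:

def split_attrs(blob):
    """Best-effort split of Neutron's flattened 'k=v, k=v, ...' port dump.

    Because the nested 'network=' object is inlined in the same syntax,
    keys legitimately repeat; callers should treat the result as an
    ordered list of (key, value) pairs, not a dict.
    """
    attrs = []
    for chunk in blob.split(", "):
        if "=" in chunk:
            k, _, v = chunk.partition("=")
            attrs.append((k, v))
    return attrs

# e.g. split_attrs("admin_state_up=True, allowed_address_pairs=[], id=36870a86-...")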
Feb 1 04:54:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:07.979 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bbefd3c06294b7fa7720ba6ca48fa4b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3953cc01-ee9b-4241-8aee-2a63e36d4fe2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ac0ff582-773f-444a-b9f8-7ad2e4c9a959) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:07.981 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ac0ff582-773f-444a-b9f8-7ad2e4c9a959 in datapath 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da bound to our chassis
Feb 1 04:54:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:07.982 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:54:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:07.984 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf48814-c66f-421c-a49e-0e45c2f7e92d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:07 localhost ovn_controller[152492]: 2026-02-01T09:54:07Z|00120|binding|INFO|Setting lport ac0ff582-773f-444a-b9f8-7ad2e4c9a959 ovn-installed in OVS
Feb 1 04:54:07 localhost ovn_controller[152492]: 2026-02-01T09:54:07Z|00121|binding|INFO|Setting lport ac0ff582-773f-444a-b9f8-7ad2e4c9a959 up in Southbound
Feb 1 04:54:07 localhost nova_compute[274651]: 2026-02-01 09:54:07.989 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:08 localhost nova_compute[274651]: 2026-02-01 09:54:08.008 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:08 localhost nova_compute[274651]: 2026-02-01 09:54:08.041 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:08 localhost dnsmasq[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/addn_hosts - 1 addresses
Feb 1 04:54:08 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/host
Feb 1 04:54:08 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/opts
Feb 1 04:54:08 localhost podman[307730]: 2026-02-01 09:54:08.063151201 +0000 UTC m=+0.055632546 container kill 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:54:08 localhost nova_compute[274651]: 2026-02-01 09:54:08.064 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:08.306 259320 INFO neutron.agent.dhcp.agent [None req-5ea1279b-eb62-4aaf-8320-e3ac8beecf19 - - - - - -] DHCP configuration for ports {'36870a86-a17a-49b4-b37b-8762456453c1'} is completed
Feb 1 04:54:08 localhost systemd[1]: tmp-crun.R47mim.mount: Deactivated successfully.
Feb 1 04:54:08 localhost podman[307805]:
Feb 1 04:54:08 localhost podman[307805]: 2026-02-01 09:54:08.782107892 +0000 UTC m=+0.083843747 container create 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:08 localhost systemd[1]: Started libpod-conmon-7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c.scope.
Feb 1 04:54:08 localhost systemd[1]: Started libcrun container.
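Note an ordering quirk visible in the podman records for container 7b0b970f...: the image pull event is written to the journal after create/init/start (see the records that follow) yet carries the earliest monotonic offset (m=+0.0457 vs m=+0.0838/0.1382/0.1436), so sorting by m=+ recovers the real sequence. A sketch, grounded in the event shapes shown here:

import re

EVENT = re.compile(r"m=\+(?P<m>\d+\.\d+) (?P<event>container \w+|image pull)")

def in_monotonic_order(lines):
    """Re-order podman events for one process by their m=+ offset.

    Journal order can lag: here 'image pull' is logged last but has the
    smallest offset, so sorting yields pull -> create -> init -> start.
    """
    tagged = []
    for line in lines:
        m = EVENT.search(line)
        if m:
            tagged.append((float(m.group("m")), m.group("event")))
    return sorted(tagged)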
Feb 1 04:54:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7616f1935656f59b7721f88f7d77097d4a6e73c9a76d7c99371f9fc319b6e6c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:54:08 localhost podman[307805]: 2026-02-01 09:54:08.836419227 +0000 UTC m=+0.138155072 container init 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:54:08 localhost podman[307805]: 2026-02-01 09:54:08.841817864 +0000 UTC m=+0.143553709 container start 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:54:08 localhost podman[307805]: 2026-02-01 09:54:08.744001957 +0000 UTC m=+0.045737822 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:54:08 localhost dnsmasq[307823]: started, version 2.85 cachesize 150
Feb 1 04:54:08 localhost dnsmasq[307823]: DNS service limited to local subnets
Feb 1 04:54:08 localhost dnsmasq[307823]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:54:08 localhost dnsmasq[307823]: warning: no upstream servers configured
Feb 1 04:54:08 localhost dnsmasq-dhcp[307823]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:54:08 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 0 addresses
Feb 1 04:54:08 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host
Feb 1 04:54:08 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts
Feb 1 04:54:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:08.956 259320 INFO neutron.agent.dhcp.agent [None req-39f00a51-f059-4146-9153-766611b8d1f4 - - - - - -] DHCP configuration for ports {'313ee5c0-0154-463e-ab35-954149cd83d2'} is completed
Feb 1 04:54:09 localhost nova_compute[274651]: 2026-02-01 09:54:09.397 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:09.920 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:07Z, description=, device_id=47e6d06a-f907-42da-b63d-1bdd83d811d3, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=36870a86-a17a-49b4-b37b-8762456453c1, ip_allocation=immediate, mac_address=fa:16:3e:dc:e0:8a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:00Z, description=, dns_domain=, id=f09ff8e4-7935-4b27-a064-f09df44d21eb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-2087970871-network, port_security_enabled=True, project_id=d0194caf1b6343f4859fdcc75c872cf3, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4266, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=990, status=ACTIVE, subnets=['4da32f84-dd1b-4592-a77c-98256725b0da'], tags=[], tenant_id=d0194caf1b6343f4859fdcc75c872cf3, updated_at=2026-02-01T09:54:00Z, vlan_transparent=None, network_id=f09ff8e4-7935-4b27-a064-f09df44d21eb, port_security_enabled=False, project_id=d0194caf1b6343f4859fdcc75c872cf3, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1050, status=DOWN, tags=[], tenant_id=d0194caf1b6343f4859fdcc75c872cf3, updated_at=2026-02-01T09:54:07Z on network f09ff8e4-7935-4b27-a064-f09df44d21eb
Feb 1 04:54:10 localhost dnsmasq[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/addn_hosts - 1 addresses
Feb 1 04:54:10 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/host
Feb 1 04:54:10 localhost podman[307841]: 2026-02-01 09:54:10.151434579 +0000 UTC m=+0.067120431 container kill 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 1 04:54:10 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/opts
Feb 1 04:54:10 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:10.401 259320 INFO neutron.agent.dhcp.agent [None req-3389440e-c893-45c1-81ae-88a6cf2d34aa - - - - - -] DHCP configuration for ports {'36870a86-a17a-49b4-b37b-8762456453c1'} is completed
Feb 1 04:54:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:11 localhost nova_compute[274651]: 2026-02-01 09:54:11.772 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:12.653 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:12Z, description=, device_id=0744a5ee-a858-4493-919a-c17db5919a8b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7168ef4e-f888-44f3-8bb5-f4cfcd1b6907, ip_allocation=immediate, mac_address=fa:16:3e:be:84:99, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:05Z, description=, dns_domain=, id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-676799361-network, port_security_enabled=True, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63331, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1040, status=ACTIVE, subnets=['a01ac7ec-edd9-46ed-82e6-8213d1b3e830'], tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:06Z, vlan_transparent=None, network_id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, port_security_enabled=False, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1087, status=DOWN, tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:12Z on network 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da
Feb 1 04:54:12 localhost systemd[1]: tmp-crun.v9By0I.mount: Deactivated successfully.
Feb 1 04:54:12 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 1 addresses
Feb 1 04:54:12 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host
Feb 1 04:54:12 localhost podman[307878]: 2026-02-01 09:54:12.894548788 +0000 UTC m=+0.082645679 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 1 04:54:12 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts
Feb 1 04:54:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:13.148 259320 INFO neutron.agent.dhcp.agent [None req-71fcbb72-98a4-4e70-926a-f43ef59cf754 - - - - - -] DHCP configuration for ports {'7168ef4e-f888-44f3-8bb5-f4cfcd1b6907'} is completed
Feb 1 04:54:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:13.756 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:12Z, description=, device_id=0744a5ee-a858-4493-919a-c17db5919a8b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7168ef4e-f888-44f3-8bb5-f4cfcd1b6907, ip_allocation=immediate, mac_address=fa:16:3e:be:84:99, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:05Z, description=, dns_domain=, id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-676799361-network, port_security_enabled=True, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63331, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1040, status=ACTIVE, subnets=['a01ac7ec-edd9-46ed-82e6-8213d1b3e830'], tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:06Z, vlan_transparent=None, network_id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, port_security_enabled=False, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1087, status=DOWN, tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:12Z on network 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da
Feb 1 04:54:13 localhost systemd[1]: tmp-crun.4DLPhm.mount: Deactivated successfully.
Feb 1 04:54:13 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 1 addresses
Feb 1 04:54:13 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host
Feb 1 04:54:13 localhost podman[307916]: 2026-02-01 09:54:13.967151085 +0000 UTC m=+0.063536341 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:13 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts
Feb 1 04:54:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:54:14 localhost podman[307929]: 2026-02-01 09:54:14.073659259 +0000 UTC m=+0.081151774 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 04:54:14 localhost podman[307929]: 2026-02-01 09:54:14.087222397 +0000 UTC m=+0.094714902 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-type=git, managed_by=edpm_ansible, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 1 04:54:14 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:54:14 localhost nova_compute[274651]: 2026-02-01 09:54:14.150 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:14.218 259320 INFO neutron.agent.dhcp.agent [None req-e230772e-5083-4d0e-884d-f9d1edaeb7f3 - - - - - -] DHCP configuration for ports {'7168ef4e-f888-44f3-8bb5-f4cfcd1b6907'} is completed
Feb 1 04:54:14 localhost nova_compute[274651]: 2026-02-01 09:54:14.402 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:15.124 2 INFO neutron.agent.securitygroups_rpc [req-a317a60f-1d94-4e94-8ad3-4c22c8825b6a req-40e6bba5-b2d9-4d66-aeb2-e562a81ad61e aacab7e8f6444706a62ff16c6574833f d0194caf1b6343f4859fdcc75c872cf3 - - default default] Security group rule updated ['639fab50-7eda-41c7-96b9-ca352e9a9f06']
Feb 1 04:54:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e113 do_prune osdmap full prune enabled
Feb 1 04:54:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e114 e114: 6 total, 6 up, 6 in
Feb 1 04:54:15 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in
Feb 1 04:54:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:15.722 2 INFO neutron.agent.securitygroups_rpc [req-4b9050d5-0e1f-4517-a597-752dbe7a20e4 req-3b9daee1-df3b-4626-857b-13f8996518fb aacab7e8f6444706a62ff16c6574833f d0194caf1b6343f4859fdcc75c872cf3 - - default default] Security group rule updated ['639fab50-7eda-41c7-96b9-ca352e9a9f06']
Feb 1 04:54:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e114 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.498656) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656498702, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1799, "num_deletes": 261, "total_data_size": 1846979, "memory_usage": 1878752, "flush_reason": "Manual Compaction"}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656510505, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1775229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26242, "largest_seqno": 28040, "table_properties": {"data_size": 1767717, "index_size": 4405, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16439, "raw_average_key_size": 20, "raw_value_size": 1752217, "raw_average_value_size": 2187, "num_data_blocks": 193, "num_entries": 801, "num_filter_entries": 801, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939534, "oldest_key_time": 1769939534, "file_creation_time": 1769939656, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 11902 microseconds, and 5211 cpu microseconds.
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
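The JOB 25 flush above reports enough numbers to estimate flush throughput: table #47 is 1,775,229 bytes and the flush lasted 11,902 microseconds. A worked check using only values copied from those events:

# Numbers copied from the JOB 25 flush events above.
file_size_bytes = 1_775_229
flush_micros = 11_902

mb_per_s = file_size_bytes / (flush_micros / 1e6) / 1e6
print(f"L0 flush wrote {file_size_bytes} bytes in {flush_micros} us (~{mb_per_s:.0f} MB/s)")
# -> ~149 MB/s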
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.510556) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1775229 bytes OK
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.510584) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.512762) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.512788) EVENT_LOG_v1 {"time_micros": 1769939656512780, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.512810) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1839177, prev total WAL file size 1839501, number of live WAL files 2.
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.513600) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373633' seq:72057594037927935, type:22 .. '6C6F676D0034303134' seq:0, type:0; will stop at (end)
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1733KB)], [45(20MB)]
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656513660, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 23504511, "oldest_snapshot_seqno": -1}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12520 keys, 23327551 bytes, temperature: kUnknown
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656618020, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 23327551, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 23252957, "index_size": 42173, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31365, "raw_key_size": 336633, "raw_average_key_size": 26, "raw_value_size": 23036510, "raw_average_value_size": 1839, "num_data_blocks": 1607, "num_entries": 12520, "num_filter_entries": 12520, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939656, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.618387) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 23327551 bytes
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.620035) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 224.9 rd, 223.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 20.7 +0.0 blob) out(22.2 +0.0 blob), read-write-amplify(26.4) write-amplify(13.1) OK, records in: 13058, records dropped: 538 output_compression: NoCompression
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.620065) EVENT_LOG_v1 {"time_micros": 1769939656620052, "job": 26, "event": "compaction_finished", "compaction_time_micros": 104495, "compaction_time_cpu_micros": 57822, "output_level": 6, "num_output_files": 1, "total_output_size": 23327551, "num_input_records": 13058, "num_output_records": 12520, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656620442, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656623481, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.513488) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.623544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.623551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.623555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.623559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:16 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:16.623563) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:17.022 2 INFO neutron.agent.securitygroups_rpc [None req-f508e6c2-9093-4f47-a287-c55eb4d8e7d1 ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group rule updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 1 04:54:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:17.262 2 INFO neutron.agent.securitygroups_rpc [None req-f0037227-79b1-4433-9269-e9d8a6c269aa ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group rule updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 1 04:54:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e114 do_prune osdmap full prune enabled
Feb 1 04:54:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e115 e115: 6 total, 6 up, 6 in
Feb 1 04:54:17 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in
Feb 1 04:54:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e115 do_prune osdmap full prune enabled
Feb 1 04:54:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e116 e116: 6 total, 6 up, 6 in
Feb 1 04:54:18 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in
Feb 1 04:54:19 localhost nova_compute[274651]: 2026-02-01 09:54:19.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:54:19 localhost nova_compute[274651]: 2026-02-01 09:54:19.407 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:54:19 localhost nova_compute[274651]: 2026-02-01 09:54:19.407 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:54:19 localhost nova_compute[274651]: 2026-02-01 09:54:19.407 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:54:19 localhost nova_compute[274651]: 2026-02-01 09:54:19.429 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:19 localhost nova_compute[274651]: 2026-02-01 09:54:19.430 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:54:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e116 do_prune osdmap full prune enabled
Feb 1 04:54:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e117 e117: 6 total, 6 up, 6 in
Feb 1 04:54:19 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in
Feb 1 04:54:20 localhost ovn_controller[152492]: 2026-02-01T09:54:20Z|00122|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:21 localhost nova_compute[274651]: 2026-02-01 09:54:21.018 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:21 localhost systemd[1]: tmp-crun.MxzrEX.mount: Deactivated successfully.
Feb 1 04:54:21 localhost dnsmasq[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/addn_hosts - 0 addresses
Feb 1 04:54:21 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/host
Feb 1 04:54:21 localhost dnsmasq-dhcp[307652]: read /var/lib/neutron/dhcp/f09ff8e4-7935-4b27-a064-f09df44d21eb/opts
Feb 1 04:54:21 localhost podman[307972]: 2026-02-01 09:54:21.149381368 +0000 UTC m=+0.064456149 container kill 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:54:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:54:21 localhost podman[307985]: 2026-02-01 09:54:21.286253789 +0000 UTC m=+0.106470554 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:21 localhost podman[307985]: 2026-02-01 09:54:21.301438857 +0000 UTC m=+0.121655662 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 1 04:54:21 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
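The JOB 26 compaction record above reports read-write-amplify(26.4) and write-amplify(13.1); both follow directly from the byte counts logged there (L0 input table #47, the total input_data_size, and output table #48). A worked check:

# Inputs copied from the JOB 26 compaction events above.
l0_in = 1_775_229        # table #47, the fresh L0 flush
total_in = 23_504_511    # input_data_size: table #47 plus L6 table #45
out = 23_327_551         # new L6 table #48

l6_in = total_in - l0_in
write_amp = out / l0_in                    # output bytes per L0 byte, ~13.1
rw_amp = (l0_in + l6_in + out) / l0_in     # total I/O per L0 byte, ~26.4
print(f"write-amplify={write_amp:.1f} read-write-amplify={rw_amp:.1f}")
# -> write-amplify=13.1 read-write-amplify=26.4, matching the log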
Feb 1 04:54:21 localhost nova_compute[274651]: 2026-02-01 09:54:21.328 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:21 localhost ovn_controller[152492]: 2026-02-01T09:54:21Z|00123|binding|INFO|Releasing lport 821a8db1-f47d-477c-9de9-2bfb1d305a52 from this chassis (sb_readonly=0)
Feb 1 04:54:21 localhost ovn_controller[152492]: 2026-02-01T09:54:21Z|00124|binding|INFO|Setting lport 821a8db1-f47d-477c-9de9-2bfb1d305a52 down in Southbound
Feb 1 04:54:21 localhost kernel: device tap821a8db1-f4 left promiscuous mode
Feb 1 04:54:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:21.339 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-f09ff8e4-7935-4b27-a064-f09df44d21eb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f09ff8e4-7935-4b27-a064-f09df44d21eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0194caf1b6343f4859fdcc75c872cf3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9d89f4f-557d-47b9-b228-682369fce5c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=821a8db1-f47d-477c-9de9-2bfb1d305a52) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:21.341 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 821a8db1-f47d-477c-9de9-2bfb1d305a52 in datapath f09ff8e4-7935-4b27-a064-f09df44d21eb unbound from our chassis
Feb 1 04:54:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:21.344 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f09ff8e4-7935-4b27-a064-f09df44d21eb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:54:21 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:21.345 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[b30ee67f-74b3-42fd-b72c-5886e43144a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:21 localhost nova_compute[274651]: 2026-02-01 09:54:21.356 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:22 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:22.456 2 INFO neutron.agent.securitygroups_rpc [req-83e0e8cb-3429-4ade-bafe-f7d6f9e3d311 req-5bc5dc3b-802b-43fa-a784-b37e50cbe40a ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group member updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 1 04:54:22 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:22.513 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:21Z, description=, device_id=f05b4710-b36a-4606-8b43-78cd21362e41, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=fde0f0cb-babf-48ed-bf15-b99b448e8039, ip_allocation=immediate, mac_address=fa:16:3e:92:bd:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:05Z, description=, dns_domain=, id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-676799361-network, port_security_enabled=True, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63331, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1040, status=ACTIVE, subnets=['a01ac7ec-edd9-46ed-82e6-8213d1b3e830'], tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:06Z, vlan_transparent=None, network_id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, port_security_enabled=True, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d6a2366a-be19-483b-bd9c-86227fb6f0c8'], standard_attr_id=1157, status=DOWN, tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:22Z on network 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da
Feb 1 04:54:22 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 2 addresses
Feb 1 04:54:22 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host
Feb 1 04:54:22 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts
Feb 1 04:54:22 localhost podman[308030]: 2026-02-01 09:54:22.77214706 +0000 UTC m=+0.073945571 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:23.012 259320 INFO neutron.agent.dhcp.agent [None req-8c161805-99aa-44c3-874c-37e0792363c2 - - - - - -] DHCP configuration for ports {'fde0f0cb-babf-48ed-bf15-b99b448e8039'} is completed
Feb 1 04:54:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:23.402 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005604213.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:21Z, description=, device_id=f05b4710-b36a-4606-8b43-78cd21362e41, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[], id=fde0f0cb-babf-48ed-bf15-b99b448e8039, ip_allocation=immediate, mac_address=fa:16:3e:92:bd:d9, name=, network_id=5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, port_security_enabled=True, project_id=9bbefd3c06294b7fa7720ba6ca48fa4b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['d6a2366a-be19-483b-bd9c-86227fb6f0c8'], standard_attr_id=1157, status=DOWN, tags=[], tenant_id=9bbefd3c06294b7fa7720ba6ca48fa4b, updated_at=2026-02-01T09:54:23Z on network 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da
Feb 1 04:54:23 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 2 addresses
Feb 1 04:54:23 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host
Feb 1 04:54:23 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts
Feb 1 04:54:23 localhost podman[308070]: 2026-02-01 09:54:23.663098504 +0000 UTC m=+0.066585383 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 04:54:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:23.835 259320 INFO neutron.agent.dhcp.agent [None req-c788a684-797f-4bfd-b26d-297dedecc303 - - - - - -] DHCP configuration for ports {'fde0f0cb-babf-48ed-bf15-b99b448e8039'} is completed
Feb 1 04:54:23 localhost podman[236886]: time="2026-02-01T09:54:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:54:23 localhost podman[236886]: @ - - [01/Feb/2026:09:54:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160184 "" "Go-http-client/1.1"
Feb 1 04:54:24 localhost podman[236886]: @ - - [01/Feb/2026:09:54:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19762 "" "Go-http-client/1.1"
Feb 1 04:54:24 localhost nova_compute[274651]: 2026-02-01 09:54:24.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:24 localhost ovn_controller[152492]: 2026-02-01T09:54:24Z|00125|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:24 localhost nova_compute[274651]: 2026-02-01 09:54:24.763 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e117 do_prune osdmap full prune enabled
Feb 1 04:54:25 localhost dnsmasq[307652]: exiting on receipt of
SIGTERM Feb 1 04:54:25 localhost systemd[1]: tmp-crun.jG7BcA.mount: Deactivated successfully. Feb 1 04:54:25 localhost podman[308107]: 2026-02-01 09:54:25.314767128 +0000 UTC m=+0.074263891 container kill 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 04:54:25 localhost systemd[1]: libpod-26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206.scope: Deactivated successfully. Feb 1 04:54:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e118 e118: 6 total, 6 up, 6 in Feb 1 04:54:25 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in Feb 1 04:54:25 localhost podman[308119]: 2026-02-01 09:54:25.400519472 +0000 UTC m=+0.072855258 container died 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:54:25 localhost podman[308119]: 2026-02-01 09:54:25.426098332 +0000 UTC m=+0.098434098 container cleanup 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:54:25 localhost systemd[1]: libpod-conmon-26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206.scope: Deactivated successfully. 
Feb 1 04:54:25 localhost podman[308121]: 2026-02-01 09:54:25.476481486 +0000 UTC m=+0.139106512 container remove 26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f09ff8e4-7935-4b27-a064-f09df44d21eb, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 1 04:54:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:25.500 259320 INFO neutron.agent.dhcp.agent [None req-9342c57b-9654-4ef4-afc6-4546cd674b4e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:54:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:25.573 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:54:26 localhost systemd[1]: var-lib-containers-storage-overlay-e02ff2903735248fe447cf43dddd9e6d237884824be2e4ad83381a8f0055c086-merged.mount: Deactivated successfully. Feb 1 04:54:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26e33fa70a3a78dd6f9e89a929ec3994d05b06103f5595a87673f065d8bfe206-userdata-shm.mount: Deactivated successfully. Feb 1 04:54:26 localhost systemd[1]: run-netns-qdhcp\x2df09ff8e4\x2d7935\x2d4b27\x2da064\x2df09df44d21eb.mount: Deactivated successfully. Feb 1 04:54:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:54:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e118 do_prune osdmap full prune enabled Feb 1 04:54:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e119 e119: 6 total, 6 up, 6 in Feb 1 04:54:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in Feb 1 04:54:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:28.516 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:54:28 localhost nova_compute[274651]: 2026-02-01 09:54:28.517 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:28.518 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:54:28 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:28.730 2 INFO neutron.agent.securitygroups_rpc [None req-ff9f5f38-da01-49cd-ad4a-92231356a657 ff147cab913d4d439b1d697fdf7e96ba dd3a0e574d0f493cafe8d66c78341de5 - - default default] Security 
group member updated ['39ab8694-6bb0-4b5a-b2c8-cff6705213f5']#033[00m Feb 1 04:54:29 localhost sshd[308147]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:54:29 localhost nova_compute[274651]: 2026-02-01 09:54:29.467 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:29.520 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:54:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:54:29 localhost podman[308149]: 2026-02-01 09:54:29.896087325 +0000 UTC m=+0.091944667 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:54:29 localhost podman[308149]: 2026-02-01 09:54:29.907343632 +0000 UTC m=+0.103200964 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:54:29 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:54:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:30.826 259320 INFO neutron.agent.linux.ip_lib [None req-c5847403-4de3-4447-869e-37b8849b3c16 - - - - - -] Device tapac95bc5f-08 cannot be used as it has no MAC address#033[00m Feb 1 04:54:30 localhost nova_compute[274651]: 2026-02-01 09:54:30.885 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:30 localhost kernel: device tapac95bc5f-08 entered promiscuous mode Feb 1 04:54:30 localhost NetworkManager[5964]: [1769939670.8955] manager: (tapac95bc5f-08): new Generic device (/org/freedesktop/NetworkManager/Devices/26) Feb 1 04:54:30 localhost ovn_controller[152492]: 2026-02-01T09:54:30Z|00126|binding|INFO|Claiming lport ac95bc5f-0838-406d-a323-b02120e15093 for this chassis. Feb 1 04:54:30 localhost ovn_controller[152492]: 2026-02-01T09:54:30Z|00127|binding|INFO|ac95bc5f-0838-406d-a323-b02120e15093: Claiming unknown Feb 1 04:54:30 localhost nova_compute[274651]: 2026-02-01 09:54:30.897 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:30 localhost systemd-udevd[308182]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost ovn_controller[152492]: 2026-02-01T09:54:30Z|00128|binding|INFO|Setting lport ac95bc5f-0838-406d-a323-b02120e15093 ovn-installed in OVS Feb 1 04:54:30 localhost nova_compute[274651]: 2026-02-01 09:54:30.937 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost journal[217584]: ethtool ioctl error on tapac95bc5f-08: No such device Feb 1 04:54:30 localhost nova_compute[274651]: 2026-02-01 09:54:30.961 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:30 localhost nova_compute[274651]: 2026-02-01 09:54:30.983 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:31 localhost ovn_controller[152492]: 2026-02-01T09:54:31Z|00129|binding|INFO|Setting lport ac95bc5f-0838-406d-a323-b02120e15093 up in Southbound Feb 1 04:54:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:31.373 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-50f93981-0358-491d-ac89-14bb01232f78', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50f93981-0358-491d-ac89-14bb01232f78', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f43a480966b7401fb7f10a27df6595f8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5aba3620-fc76-4227-aa32-8dcddb17f59a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ac95bc5f-0838-406d-a323-b02120e15093) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:54:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:31.375 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ac95bc5f-0838-406d-a323-b02120e15093 in datapath 50f93981-0358-491d-ac89-14bb01232f78 bound to our chassis#033[00m Feb 1 04:54:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:31.376 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 50f93981-0358-491d-ac89-14bb01232f78 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:54:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:31.378 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[55e193e9-37cb-47ec-a0b7-f1fb73b5d138]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:54:31 localhost openstack_network_exporter[239441]: ERROR 09:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:54:31 localhost openstack_network_exporter[239441]: Feb 1 04:54:31 localhost openstack_network_exporter[239441]: ERROR 09:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:54:31 localhost openstack_network_exporter[239441]: Feb 1 04:54:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:54:31 localhost podman[308254]: Feb 1 04:54:31 localhost podman[308254]: 2026-02-01 09:54:31.889331392 +0000 UTC m=+0.087015304 container create 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:54:31 localhost systemd[1]: Started libpod-conmon-25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e.scope. Feb 1 04:54:31 localhost podman[308254]: 2026-02-01 09:54:31.85228597 +0000 UTC m=+0.049969932 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:54:31 localhost systemd[1]: Started libcrun container. 
Feb 1 04:54:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b591ce9fbe542285f2f68aa737b7cdd5cd80266f092ce4a8c4328cdcb47ccc61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:54:32 localhost podman[308254]: 2026-02-01 09:54:31.967493032 +0000 UTC m=+0.165177004 container init 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:54:32 localhost podman[308254]: 2026-02-01 09:54:32.00895656 +0000 UTC m=+0.206640472 container start 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:54:32 localhost dnsmasq[308273]: started, version 2.85 cachesize 150 Feb 1 04:54:32 localhost dnsmasq[308273]: DNS service limited to local subnets Feb 1 04:54:32 localhost dnsmasq[308273]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:54:32 localhost dnsmasq[308273]: warning: no upstream servers configured Feb 1 04:54:32 localhost dnsmasq-dhcp[308273]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:54:32 localhost dnsmasq[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/addn_hosts - 0 addresses Feb 1 04:54:32 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/host Feb 1 04:54:32 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/opts Feb 1 04:54:32 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:32.135 259320 INFO neutron.agent.dhcp.agent [None req-c5ade55d-ea1c-4bcc-86ba-06a5b475f20e - - - - - -] DHCP configuration for ports {'6572c5b8-d0c9-42f4-83cd-d3984bf62abd'} is completed#033[00m Feb 1 04:54:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e119 do_prune osdmap full prune enabled Feb 1 04:54:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e120 e120: 6 total, 6 up, 6 in Feb 1 04:54:32 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in Feb 1 04:54:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:54:32 localhost podman[308274]: 2026-02-01 09:54:32.975752314 +0000 UTC m=+0.084426995 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent) Feb 1 04:54:32 localhost podman[308274]: 2026-02-01 09:54:32.981659586 +0000 UTC m=+0.090334267 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:54:32 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:54:33 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:33.569 2 INFO neutron.agent.securitygroups_rpc [None req-9125c9fe-67c0-46c6-98f6-2771b3ce7427 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']#033[00m Feb 1 04:54:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:34.181 259320 INFO neutron.agent.linux.ip_lib [None req-88038b90-1fca-403a-a88d-e3969fbed2c1 - - - - - -] Device tapec892af7-6a cannot be used as it has no MAC address#033[00m Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.247 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost kernel: device tapec892af7-6a entered promiscuous mode Feb 1 04:54:34 localhost NetworkManager[5964]: [1769939674.2567] manager: (tapec892af7-6a): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Feb 1 04:54:34 localhost ovn_controller[152492]: 2026-02-01T09:54:34Z|00130|binding|INFO|Claiming lport ec892af7-6a69-42e3-9f0b-f8eeb5ba023d for this chassis. Feb 1 04:54:34 localhost ovn_controller[152492]: 2026-02-01T09:54:34Z|00131|binding|INFO|ec892af7-6a69-42e3-9f0b-f8eeb5ba023d: Claiming unknown Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.257 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.261 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost systemd-udevd[308370]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 04:54:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:34.270 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-6bcccdce-257c-48e2-b0c4-b30adbcc7f39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6bcccdce-257c-48e2-b0c4-b30adbcc7f39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7b25cdb96bed441fa12160a57bca4d9c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7d42fb8-4b21-4353-96ab-58c9ce0ec3a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ec892af7-6a69-42e3-9f0b-f8eeb5ba023d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:54:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:34.273 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ec892af7-6a69-42e3-9f0b-f8eeb5ba023d in datapath 6bcccdce-257c-48e2-b0c4-b30adbcc7f39 bound to our chassis#033[00m Feb 1 04:54:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:54:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:34.282 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 206b1cbd-cfb7-49b4-8503-140fff150d05 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:54:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:34.282 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6bcccdce-257c-48e2-b0c4-b30adbcc7f39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:34.283 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[25847a60-8850-484d-8e40-722aa177fe9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.287 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost ovn_controller[152492]: 2026-02-01T09:54:34Z|00132|binding|INFO|Setting lport ec892af7-6a69-42e3-9f0b-f8eeb5ba023d ovn-installed in OVS Feb 1 04:54:34 localhost ovn_controller[152492]: 2026-02-01T09:54:34Z|00133|binding|INFO|Setting lport ec892af7-6a69-42e3-9f0b-f8eeb5ba023d up in Southbound Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.289 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost journal[217584]: ethtool ioctl error on tapec892af7-6a: No such device Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.321 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.347 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost nova_compute[274651]: 2026-02-01 09:54:34.472 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e120 do_prune osdmap full prune enabled Feb 1 04:54:34 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:54:34 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:54:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e121 e121: 6 total, 6 up, 6 in Feb 1 04:54:34 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in Feb 1 04:54:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:54:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2444974192' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:54:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:54:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2444974192' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:54:34 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:34.700 2 INFO neutron.agent.securitygroups_rpc [None req-1d734dba-1bcf-45d6-b4fb-cb8bacf3e60d 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']#033[00m Feb 1 04:54:35 localhost podman[308459]: Feb 1 04:54:35 localhost podman[308459]: 2026-02-01 09:54:35.105946384 +0000 UTC m=+0.066673007 container create 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:54:35 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:35.137 2 INFO neutron.agent.securitygroups_rpc [None req-546fc9ec-f61e-4cb1-ba06-2baf55334087 ff147cab913d4d439b1d697fdf7e96ba dd3a0e574d0f493cafe8d66c78341de5 - - default default] Security group member updated ['39ab8694-6bb0-4b5a-b2c8-cff6705213f5']#033[00m Feb 1 04:54:35 localhost systemd[1]: Started libpod-conmon-300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579.scope. Feb 1 04:54:35 localhost systemd[1]: Started libcrun container. Feb 1 04:54:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a96cc7f3c61d2cf727818f661b4c52a634540ba058c7e9879a1966e0b853a056/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:54:35 localhost podman[308459]: 2026-02-01 09:54:35.170469944 +0000 UTC m=+0.131196557 container init 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:54:35 localhost podman[308459]: 2026-02-01 09:54:35.076365371 +0000 UTC m=+0.037091994 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:54:35 localhost podman[308459]: 2026-02-01 09:54:35.179518892 +0000 UTC m=+0.140245505 container start 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:54:35 localhost dnsmasq[308478]: started, version 2.85 cachesize 150 Feb 1 04:54:35 localhost 
dnsmasq[308478]: DNS service limited to local subnets Feb 1 04:54:35 localhost dnsmasq[308478]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:54:35 localhost dnsmasq[308478]: warning: no upstream servers configured Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:54:35 localhost dnsmasq[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/addn_hosts - 0 addresses Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/host Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/opts Feb 1 04:54:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:35.244 259320 INFO neutron.agent.dhcp.agent [None req-f5d1d588-cd86-4534-83ef-018261bceeec - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:33Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=04c139aa-6be7-4e34-a074-25d7617b6d3a, ip_allocation=immediate, mac_address=fa:16:3e:39:b9:1d, name=tempest-ExtraDHCPOptionsTestJSON-663608563, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:31Z, description=, dns_domain=, id=6bcccdce-257c-48e2-b0c4-b30adbcc7f39, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-910879902, port_security_enabled=True, project_id=7b25cdb96bed441fa12160a57bca4d9c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37825, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1225, status=ACTIVE, subnets=['ed3750cc-736e-4999-9dcd-4a021606c944'], tags=[], tenant_id=7b25cdb96bed441fa12160a57bca4d9c, updated_at=2026-02-01T09:54:32Z, vlan_transparent=None, network_id=6bcccdce-257c-48e2-b0c4-b30adbcc7f39, port_security_enabled=True, project_id=7b25cdb96bed441fa12160a57bca4d9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['e61e0f68-6135-4301-ab8c-68625c4e91d7'], standard_attr_id=1231, status=DOWN, tags=[], tenant_id=7b25cdb96bed441fa12160a57bca4d9c, updated_at=2026-02-01T09:54:33Z on network 6bcccdce-257c-48e2-b0c4-b30adbcc7f39#033[00m Feb 1 04:54:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:35.314 259320 INFO neutron.agent.dhcp.agent [None req-d766b8b3-0e4a-4829-929a-a6279112f6fb - - - - - -] DHCP configuration for ports {'0860892e-86b9-4609-a00d-23c271797267'} is completed#033[00m Feb 1 04:54:35 localhost dnsmasq[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/addn_hosts - 1 addresses Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/host Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/opts Feb 1 04:54:35 localhost podman[308494]: 2026-02-01 09:54:35.485217699 +0000 UTC m=+0.063555800 container kill 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:54:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e121 do_prune osdmap full prune enabled Feb 1 04:54:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e122 e122: 6 total, 6 up, 6 in Feb 1 04:54:35 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in Feb 1 04:54:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:35.637 259320 INFO neutron.agent.dhcp.agent [None req-92863558-d2d4-4e67-a3ee-2c7c605393f1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:34Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=86b10a16-f59d-4323-8e51-588c5672b9a6, ip_allocation=immediate, mac_address=fa:16:3e:a6:f7:f9, name=tempest-ExtraDHCPOptionsTestJSON-1848900089, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:31Z, description=, dns_domain=, id=6bcccdce-257c-48e2-b0c4-b30adbcc7f39, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-910879902, port_security_enabled=True, project_id=7b25cdb96bed441fa12160a57bca4d9c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37825, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1225, status=ACTIVE, subnets=['ed3750cc-736e-4999-9dcd-4a021606c944'], tags=[], tenant_id=7b25cdb96bed441fa12160a57bca4d9c, updated_at=2026-02-01T09:54:32Z, vlan_transparent=None, network_id=6bcccdce-257c-48e2-b0c4-b30adbcc7f39, port_security_enabled=True, project_id=7b25cdb96bed441fa12160a57bca4d9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['e61e0f68-6135-4301-ab8c-68625c4e91d7'], standard_attr_id=1239, status=DOWN, tags=[], tenant_id=7b25cdb96bed441fa12160a57bca4d9c, updated_at=2026-02-01T09:54:34Z on network 6bcccdce-257c-48e2-b0c4-b30adbcc7f39#033[00m Feb 1 04:54:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:35.776 259320 INFO neutron.agent.dhcp.agent [None req-1d7c2c42-f1b6-44ff-9d3d-51f77a79d850 - - - - - -] DHCP configuration for ports {'04c139aa-6be7-4e34-a074-25d7617b6d3a'} is completed#033[00m Feb 1 04:54:35 localhost dnsmasq[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/addn_hosts - 2 addresses Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/host Feb 1 04:54:35 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/opts Feb 1 04:54:35 localhost podman[308531]: 2026-02-01 09:54:35.928770968 +0000 UTC m=+0.061757095 container kill 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:54:35 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:35.949 2 INFO neutron.agent.securitygroups_rpc [None req-9bcabe14-1107-4613-a202-1866c6f3ee13 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']#033[00m Feb 1 04:54:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:36.128 259320 INFO neutron.agent.dhcp.agent [None req-ea95bd19-e1d5-401d-8266-8e6a3f16438b - - - - - -] DHCP configuration for ports {'86b10a16-f59d-4323-8e51-588c5672b9a6'} is completed#033[00m Feb 1 04:54:36 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:36.294 2 INFO neutron.agent.securitygroups_rpc [None req-7830ec90-7d5e-4488-a4fe-cfe1f6b35ae5 21d02ef23bf34fe3ad07a151844e8a84 7aa5c461f9764c8e9c6f7f88a3f3fe97 - - default default] Security group member updated ['a27a2b34-3872-4d18-89d2-71a867c33b37']#033[00m Feb 1 04:54:36 localhost dnsmasq[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/addn_hosts - 1 addresses Feb 1 04:54:36 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/host Feb 1 04:54:36 localhost systemd[1]: tmp-crun.IpbV27.mount: Deactivated successfully. Feb 1 04:54:36 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/opts Feb 1 04:54:36 localhost podman[308570]: 2026-02-01 09:54:36.306204877 +0000 UTC m=+0.067898615 container kill 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 1 04:54:36 localhost nova_compute[274651]: 2026-02-01 09:54:36.416 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:36.485 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:33Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=04c139aa-6be7-4e34-a074-25d7617b6d3a, ip_allocation=immediate, mac_address=fa:16:3e:39:b9:1d, name=tempest-new-port-name-1592501063, network_id=6bcccdce-257c-48e2-b0c4-b30adbcc7f39, port_security_enabled=True, project_id=7b25cdb96bed441fa12160a57bca4d9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, 
security_groups=['e61e0f68-6135-4301-ab8c-68625c4e91d7'], standard_attr_id=1231, status=DOWN, tags=[], tenant_id=7b25cdb96bed441fa12160a57bca4d9c, updated_at=2026-02-01T09:54:36Z on network 6bcccdce-257c-48e2-b0c4-b30adbcc7f39#033[00m Feb 1 04:54:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:54:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:54:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:54:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:54:36 localhost dnsmasq[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/addn_hosts - 1 addresses Feb 1 04:54:36 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/host Feb 1 04:54:36 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/opts Feb 1 04:54:36 localhost podman[308607]: 2026-02-01 09:54:36.710717882 +0000 UTC m=+0.070667310 container kill 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3) Feb 1 04:54:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:36.880 259320 INFO neutron.agent.dhcp.agent [None req-ad11746d-1bd5-4ff2-9dde-18171e205177 - - - - - -] DHCP configuration for ports {'04c139aa-6be7-4e34-a074-25d7617b6d3a'} is completed#033[00m Feb 1 04:54:37 localhost systemd[1]: tmp-crun.AoxKq4.mount: Deactivated successfully. 
Feb 1 04:54:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:37.317 2 INFO neutron.agent.securitygroups_rpc [None req-2518325d-e6ff-412d-931e-351c87841bd0 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']
Feb 1 04:54:37 localhost dnsmasq[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/addn_hosts - 0 addresses
Feb 1 04:54:37 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/host
Feb 1 04:54:37 localhost dnsmasq-dhcp[308478]: read /var/lib/neutron/dhcp/6bcccdce-257c-48e2-b0c4-b30adbcc7f39/opts
Feb 1 04:54:37 localhost podman[308643]: 2026-02-01 09:54:37.568802902 +0000 UTC m=+0.067515772 container kill 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 1 04:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:54:37 localhost podman[308658]: 2026-02-01 09:54:37.675526253 +0000 UTC m=+0.081238586 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:54:37 localhost podman[308658]: 2026-02-01 09:54:37.682678554 +0000 UTC m=+0.088390967 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:54:37 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:54:37 localhost podman[308659]: 2026-02-01 09:54:37.724054901 +0000 UTC m=+0.129372812 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 1 04:54:37 localhost podman[308659]: 2026-02-01 09:54:37.823655412 +0000 UTC m=+0.228973333 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 1 04:54:37 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:54:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:38.090 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:37Z, description=, device_id=b79340de-5b5a-4e42-ae91-ff0da2f8648d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e3a88a4-e301-485b-86ba-f2d031a7d5a4, ip_allocation=immediate, mac_address=fa:16:3e:2d:8d:7a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:28Z, description=, dns_domain=, id=50f93981-0358-491d-ac89-14bb01232f78, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-246477351-network, port_security_enabled=True, project_id=f43a480966b7401fb7f10a27df6595f8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30027, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1211, status=ACTIVE, subnets=['69a0b02b-4b85-4f31-a0af-cb312574fc98'], tags=[], tenant_id=f43a480966b7401fb7f10a27df6595f8, updated_at=2026-02-01T09:54:29Z, vlan_transparent=None, network_id=50f93981-0358-491d-ac89-14bb01232f78, port_security_enabled=False, project_id=f43a480966b7401fb7f10a27df6595f8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1257, status=DOWN, tags=[], tenant_id=f43a480966b7401fb7f10a27df6595f8, updated_at=2026-02-01T09:54:37Z on network 50f93981-0358-491d-ac89-14bb01232f78
Feb 1 04:54:38 localhost podman[308730]: 2026-02-01 09:54:38.3106567 +0000 UTC m=+0.064476140 container kill 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:54:38 localhost dnsmasq[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/addn_hosts - 1 addresses
Feb 1 04:54:38 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/host
Feb 1 04:54:38 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/opts
Feb 1 04:54:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:38.629 259320 INFO neutron.agent.dhcp.agent [None req-eaf89c16-1e80-441d-82b9-4a7f0db5f9d1 - - - - - -] DHCP configuration for ports {'5e3a88a4-e301-485b-86ba-f2d031a7d5a4'} is completed
Feb 1 04:54:38 localhost dnsmasq[308478]: exiting on receipt of SIGTERM
Feb 1 04:54:38 localhost systemd[1]: libpod-300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579.scope: Deactivated successfully.
Feb 1 04:54:38 localhost podman[308768]: 2026-02-01 09:54:38.785140561 +0000 UTC m=+0.058975079 container kill 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:54:38 localhost podman[308782]: 2026-02-01 09:54:38.848415713 +0000 UTC m=+0.052986965 container died 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:54:38 localhost podman[308782]: 2026-02-01 09:54:38.890178241 +0000 UTC m=+0.094749423 container cleanup 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:38 localhost systemd[1]: libpod-conmon-300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579.scope: Deactivated successfully.
Feb 1 04:54:38 localhost podman[308789]: 2026-02-01 09:54:38.935974463 +0000 UTC m=+0.127035428 container remove 300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6bcccdce-257c-48e2-b0c4-b30adbcc7f39, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 1 04:54:38 localhost nova_compute[274651]: 2026-02-01 09:54:38.981 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:38 localhost ovn_controller[152492]: 2026-02-01T09:54:38Z|00134|binding|INFO|Releasing lport ec892af7-6a69-42e3-9f0b-f8eeb5ba023d from this chassis (sb_readonly=0)
Feb 1 04:54:38 localhost ovn_controller[152492]: 2026-02-01T09:54:38Z|00135|binding|INFO|Setting lport ec892af7-6a69-42e3-9f0b-f8eeb5ba023d down in Southbound
Feb 1 04:54:38 localhost kernel: device tapec892af7-6a left promiscuous mode
Feb 1 04:54:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:38.995 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-6bcccdce-257c-48e2-b0c4-b30adbcc7f39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6bcccdce-257c-48e2-b0c4-b30adbcc7f39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7b25cdb96bed441fa12160a57bca4d9c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7d42fb8-4b21-4353-96ab-58c9ce0ec3a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ec892af7-6a69-42e3-9f0b-f8eeb5ba023d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:38.997 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ec892af7-6a69-42e3-9f0b-f8eeb5ba023d in datapath 6bcccdce-257c-48e2-b0c4-b30adbcc7f39 unbound from our chassis
Feb 1 04:54:39 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:38.999 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6bcccdce-257c-48e2-b0c4-b30adbcc7f39, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:54:39 localhost nova_compute[274651]: 2026-02-01 09:54:39.003 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:39 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:39.001 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ed248cab-3f44-46a5-b4ca-0bb32825988b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:39 localhost systemd[1]: var-lib-containers-storage-overlay-a96cc7f3c61d2cf727818f661b4c52a634540ba058c7e9879a1966e0b853a056-merged.mount: Deactivated successfully.
Feb 1 04:54:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-300c55c9de8858471cdf7710d65b2b9b5327f16b8a5cb5302a1572e1a7b19579-userdata-shm.mount: Deactivated successfully.
Feb 1 04:54:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:39.190 259320 INFO neutron.agent.dhcp.agent [None req-65d81d87-6d03-4e64-8ed7-323f54ba3471 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:54:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:39.191 259320 INFO neutron.agent.dhcp.agent [None req-65d81d87-6d03-4e64-8ed7-323f54ba3471 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:54:39 localhost systemd[1]: run-netns-qdhcp\x2d6bcccdce\x2d257c\x2d48e2\x2db0c4\x2db30adbcc7f39.mount: Deactivated successfully.
Feb 1 04:54:39 localhost nova_compute[274651]: 2026-02-01 09:54:39.474 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:39.573 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:54:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:40.195 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:37Z, description=, device_id=b79340de-5b5a-4e42-ae91-ff0da2f8648d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e3a88a4-e301-485b-86ba-f2d031a7d5a4, ip_allocation=immediate, mac_address=fa:16:3e:2d:8d:7a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:28Z, description=, dns_domain=, id=50f93981-0358-491d-ac89-14bb01232f78, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsNegativeTestJSON-246477351-network, port_security_enabled=True, project_id=f43a480966b7401fb7f10a27df6595f8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30027, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1211, status=ACTIVE, subnets=['69a0b02b-4b85-4f31-a0af-cb312574fc98'], tags=[], tenant_id=f43a480966b7401fb7f10a27df6595f8, updated_at=2026-02-01T09:54:29Z, vlan_transparent=None, network_id=50f93981-0358-491d-ac89-14bb01232f78, port_security_enabled=False, project_id=f43a480966b7401fb7f10a27df6595f8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1257, status=DOWN, tags=[], tenant_id=f43a480966b7401fb7f10a27df6595f8, updated_at=2026-02-01T09:54:37Z on network 50f93981-0358-491d-ac89-14bb01232f78
Feb 1 04:54:40 localhost ovn_controller[152492]: 2026-02-01T09:54:40Z|00136|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:40 localhost nova_compute[274651]: 2026-02-01 09:54:40.278 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:40 localhost dnsmasq[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/addn_hosts - 1 addresses
Feb 1 04:54:40 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/host
Feb 1 04:54:40 localhost podman[308829]: 2026-02-01 09:54:40.409238484 +0000 UTC m=+0.057428591 container kill 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:54:40 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/opts
Feb 1 04:54:40 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:40.535 2 INFO neutron.agent.securitygroups_rpc [None req-515ecb9d-14c8-49e9-8847-f48fb7c12a8c 21d02ef23bf34fe3ad07a151844e8a84 7aa5c461f9764c8e9c6f7f88a3f3fe97 - - default default] Security group member updated ['a27a2b34-3872-4d18-89d2-71a867c33b37']
Feb 1 04:54:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:40.631 259320 INFO neutron.agent.dhcp.agent [None req-37117671-8ce4-4ff9-a528-7311ec2db0b4 - - - - - -] DHCP configuration for ports {'5e3a88a4-e301-485b-86ba-f2d031a7d5a4'} is completed
Feb 1 04:54:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e122 do_prune osdmap full prune enabled
Feb 1 04:54:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 e123: 6 total, 6 up, 6 in
Feb 1 04:54:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in
Feb 1 04:54:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:41.716 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:54:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:41.717 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:54:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:41.717 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:54:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:42.240 2 INFO neutron.agent.securitygroups_rpc [None req-85d5be93-4f1a-4c18-9ce2-6a112b54530f 84f3db440e5d42c59396aab4e1ffcfd9 2a205e14a65e4950b2897f78a7089f09 - - default default] Security group member updated ['9edef165-badf-4d99-97d5-46869e0947c8']
Feb 1 04:54:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:42.561 2 INFO neutron.agent.securitygroups_rpc [None req-86fc0e8a-dc01-4804-b333-df33401eb55c ba01912592664d639fa7a27174068a0f a8a2395fa8604962aa6888633ff95bee - - default default] Security group member updated ['adcc453c-f15e-407c-b903-8df7ba9f8ef6']
Feb 1 04:54:43 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:43.108 2 INFO neutron.agent.securitygroups_rpc [None req-ea460609-a7b7-4b88-8971-9c496984f41d ba01912592664d639fa7a27174068a0f a8a2395fa8604962aa6888633ff95bee - - default default] Security group member updated ['adcc453c-f15e-407c-b903-8df7ba9f8ef6']
Feb 1 04:54:44 localhost sshd[308849]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:54:44 localhost sshd[308850]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:54:44 localhost nova_compute[274651]: 2026-02-01 09:54:44.498 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:44 localhost dnsmasq[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/addn_hosts - 0 addresses
Feb 1 04:54:44 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/host
Feb 1 04:54:44 localhost podman[308869]: 2026-02-01 09:54:44.614130772 +0000 UTC m=+0.063298533 container kill 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 04:54:44 localhost dnsmasq-dhcp[308273]: read /var/lib/neutron/dhcp/50f93981-0358-491d-ac89-14bb01232f78/opts
Feb 1 04:54:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:54:44 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:44.656 2 INFO neutron.agent.securitygroups_rpc [None req-9845e2e7-e54c-48b9-9b8f-f8c7a4c52742 84f3db440e5d42c59396aab4e1ffcfd9 2a205e14a65e4950b2897f78a7089f09 - - default default] Security group member updated ['9edef165-badf-4d99-97d5-46869e0947c8']
Feb 1 04:54:44 localhost systemd[1]: tmp-crun.oYhI3v.mount: Deactivated successfully.
Feb 1 04:54:44 localhost podman[308883]: 2026-02-01 09:54:44.723435223 +0000 UTC m=+0.084709283 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, version=9.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 1 04:54:44 localhost podman[308883]: 2026-02-01 09:54:44.73210156 +0000 UTC m=+0.093375600 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, version=9.7, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1769056855, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Feb 1 04:54:44 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:54:44 localhost ovn_controller[152492]: 2026-02-01T09:54:44Z|00137|binding|INFO|Releasing lport ac95bc5f-0838-406d-a323-b02120e15093 from this chassis (sb_readonly=0)
Feb 1 04:54:44 localhost kernel: device tapac95bc5f-08 left promiscuous mode
Feb 1 04:54:44 localhost ovn_controller[152492]: 2026-02-01T09:54:44Z|00138|binding|INFO|Setting lport ac95bc5f-0838-406d-a323-b02120e15093 down in Southbound
Feb 1 04:54:44 localhost nova_compute[274651]: 2026-02-01 09:54:44.780 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:44.796 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-50f93981-0358-491d-ac89-14bb01232f78', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50f93981-0358-491d-ac89-14bb01232f78', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f43a480966b7401fb7f10a27df6595f8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5aba3620-fc76-4227-aa32-8dcddb17f59a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ac95bc5f-0838-406d-a323-b02120e15093) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:44.797 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ac95bc5f-0838-406d-a323-b02120e15093 in datapath 50f93981-0358-491d-ac89-14bb01232f78 unbound from our chassis
Feb 1 04:54:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:44.798 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50f93981-0358-491d-ac89-14bb01232f78, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:54:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:44.799 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f26ee88f-a11c-40fd-a9e1-ac98faf03fac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:44 localhost nova_compute[274651]: 2026-02-01 09:54:44.802 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:47 localhost ovn_controller[152492]: 2026-02-01T09:54:47Z|00139|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:47 localhost nova_compute[274651]: 2026-02-01 09:54:47.345 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:48 localhost dnsmasq[308273]: exiting on receipt of SIGTERM
Feb 1 04:54:48 localhost podman[308930]: 2026-02-01 09:54:48.194592915 +0000 UTC m=+0.042642265 container kill 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:54:48 localhost systemd[1]: libpod-25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e.scope: Deactivated successfully.
Feb 1 04:54:48 localhost podman[308944]: 2026-02-01 09:54:48.263959065 +0000 UTC m=+0.052809540 container died 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:48 localhost podman[308944]: 2026-02-01 09:54:48.296655873 +0000 UTC m=+0.085506318 container cleanup 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:54:48 localhost systemd[1]: libpod-conmon-25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e.scope: Deactivated successfully.
Feb 1 04:54:48 localhost podman[308945]: 2026-02-01 09:54:48.337018248 +0000 UTC m=+0.121656493 container remove 25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-50f93981-0358-491d-ac89-14bb01232f78, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:54:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:48.647 259320 INFO neutron.agent.dhcp.agent [None req-587ff265-3669-4ad3-aced-d5e666a6ce9a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:54:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:48.647 259320 INFO neutron.agent.dhcp.agent [None req-587ff265-3669-4ad3-aced-d5e666a6ce9a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:54:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:48.936 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:54:49 localhost systemd[1]: var-lib-containers-storage-overlay-b591ce9fbe542285f2f68aa737b7cdd5cd80266f092ce4a8c4328cdcb47ccc61-merged.mount: Deactivated successfully.
Feb 1 04:54:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25e1ab7fff543c5b51a304f0db30a90d2cddc571996f13e592aadbbfa9e4338e-userdata-shm.mount: Deactivated successfully.
Feb 1 04:54:49 localhost systemd[1]: run-netns-qdhcp\x2d50f93981\x2d0358\x2d491d\x2dac89\x2d14bb01232f78.mount: Deactivated successfully.
Feb 1 04:54:49 localhost nova_compute[274651]: 2026-02-01 09:54:49.532 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:50.664 2 INFO neutron.agent.securitygroups_rpc [None req-57c956cb-89d5-4885-9663-ca5823a12d21 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['6ebf4d70-9c5f-40a7-b43f-38d30ca97739']
Feb 1 04:54:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:50.755 259320 INFO neutron.agent.linux.ip_lib [None req-3fdf3c02-970d-4926-bdb4-e6df4d115a8e - - - - - -] Device tap10657996-a5 cannot be used as it has no MAC address
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.816 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost kernel: device tap10657996-a5 entered promiscuous mode
Feb 1 04:54:50 localhost NetworkManager[5964]: [1769939690.8274] manager: (tap10657996-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Feb 1 04:54:50 localhost ovn_controller[152492]: 2026-02-01T09:54:50Z|00140|binding|INFO|Claiming lport 10657996-a59f-4b72-8f5c-91e52ba25fa3 for this chassis.
Feb 1 04:54:50 localhost ovn_controller[152492]: 2026-02-01T09:54:50Z|00141|binding|INFO|10657996-a59f-4b72-8f5c-91e52ba25fa3: Claiming unknown
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.829 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost systemd-udevd[308980]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:54:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:50.839 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-8c8faa8a-12c5-4564-a503-67f1cd5faeef', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c8faa8a-12c5-4564-a503-67f1cd5faeef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3e5e9f4ac99471688f0279d307f2650', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6693a8e9-b58e-4136-abca-b8e21a6ccf20, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10657996-a59f-4b72-8f5c-91e52ba25fa3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:50.841 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 10657996-a59f-4b72-8f5c-91e52ba25fa3 in datapath 8c8faa8a-12c5-4564-a503-67f1cd5faeef bound to our chassis
Feb 1 04:54:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:50.843 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8c8faa8a-12c5-4564-a503-67f1cd5faeef or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:54:50 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:50.848 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[43c77ad5-966f-4059-a066-34fb8af19b22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.862 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost ovn_controller[152492]: 2026-02-01T09:54:50Z|00142|binding|INFO|Setting lport 10657996-a59f-4b72-8f5c-91e52ba25fa3 ovn-installed in OVS
Feb 1 04:54:50 localhost ovn_controller[152492]: 2026-02-01T09:54:50Z|00143|binding|INFO|Setting lport 10657996-a59f-4b72-8f5c-91e52ba25fa3 up in Southbound
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.868 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.872 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost journal[217584]: ethtool ioctl error on tap10657996-a5: No such device
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.907 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost nova_compute[274651]: 2026-02-01 09:54:50.939 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:50.962 2 INFO neutron.agent.securitygroups_rpc [None req-ca712223-c062-400f-8ed8-8ff5e5903afc 306e307654cf41949f0bb118796a4bc7 8f87cde7f6eb4ef0beb13dc0679c10cb - - default default] Security group member updated ['a498609f-8637-4692-9d11-be96cabae719']
Feb 1 04:54:51 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:51.010 2 INFO neutron.agent.securitygroups_rpc [None req-c3b6808d-2668-4b98-8bd8-53b9c4ac7a7c 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['6ebf4d70-9c5f-40a7-b43f-38d30ca97739']
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.349 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.349 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.350 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:54:51 localhost nova_compute[274651]: 2026-02-01 09:54:51.350 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:54:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:54:51 localhost podman[309052]: 2026-02-01 09:54:51.717786921 +0000 UTC m=+0.073690983 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 1 04:54:51 localhost podman[309052]: 2026-02-01 09:54:51.733430223 +0000 UTC m=+0.089334275 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 1 04:54:51 localhost podman[309051]:
Feb 1 04:54:51 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:54:51 localhost podman[309051]: 2026-02-01 09:54:51.750550381 +0000 UTC m=+0.103318997 container create 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:54:51 localhost systemd[1]: Started libpod-conmon-639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63.scope.
Feb 1 04:54:51 localhost systemd[1]: Started libcrun container.
Feb 1 04:54:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa3020ea0f25d444609104a3dca12b8a143c33c20ea4de26517bfbf3244bb0f0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:54:51 localhost podman[309051]: 2026-02-01 09:54:51.803143192 +0000 UTC m=+0.155911838 container init 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 1 04:54:51 localhost podman[309051]: 2026-02-01 09:54:51.711127615 +0000 UTC m=+0.063896301 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:54:51 localhost podman[309051]: 2026-02-01 09:54:51.817297189 +0000 UTC m=+0.170065855 container start 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:54:51 localhost dnsmasq[309086]: started, version 2.85 cachesize 150
Feb 1 04:54:51 localhost dnsmasq[309086]: DNS service limited to local subnets
Feb 1 04:54:51 localhost dnsmasq[309086]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:54:51 localhost dnsmasq[309086]: warning: no upstream servers configured
Feb 1 04:54:51 localhost dnsmasq-dhcp[309086]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:54:51 localhost dnsmasq[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/addn_hosts - 0 addresses
Feb 1 04:54:51 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/host
Feb 1 04:54:51 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/opts
Feb 1 04:54:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:51.938 259320 INFO neutron.agent.dhcp.agent [None req-cc3ab21c-1a26-4a0b-8773-2f00b3ba439e - - - - - -] DHCP configuration for ports {'4ece6232-0014-4bef-981e-5669f3b3e315'} is completed
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.033 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.048 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.049 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.050 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.050 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.050 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 04:54:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:52.094 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:51Z, description=, device_id=68d684c2-2d8e-49d4-b723-69387fb1e9d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=43d95ae3-08e7-4c09-9343-0e4b55bea93c, ip_allocation=immediate, mac_address=fa:16:3e:ae:27:4f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:46Z, description=, dns_domain=, id=8c8faa8a-12c5-4564-a503-67f1cd5faeef, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--555358662, port_security_enabled=True, project_id=b3e5e9f4ac99471688f0279d307f2650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38523, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1322, status=ACTIVE, subnets=['889634cc-93ab-4c05-8753-5cd031fa71bb'], tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:54:49Z, vlan_transparent=None, network_id=8c8faa8a-12c5-4564-a503-67f1cd5faeef, port_security_enabled=False, project_id=b3e5e9f4ac99471688f0279d307f2650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1374, status=DOWN, tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:54:51Z on network 8c8faa8a-12c5-4564-a503-67f1cd5faeef
Feb 1 04:54:52 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:52.134 2 INFO neutron.agent.securitygroups_rpc [None req-898aca50-e443-4f04-8633-193e8d5d70fe 306e307654cf41949f0bb118796a4bc7 8f87cde7f6eb4ef0beb13dc0679c10cb - - default default] Security group member updated ['a498609f-8637-4692-9d11-be96cabae719']
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:52 localhost nova_compute[274651]: 2026-02-01 09:54:52.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:52 localhost dnsmasq[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/addn_hosts - 1 addresses
Feb 1 04:54:52 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/host
Feb 1 04:54:52 localhost podman[309105]: 2026-02-01 09:54:52.300895222 +0000 UTC m=+0.063878681 container kill 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 1 04:54:52 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/opts
Feb 1 04:54:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:52.481 259320 INFO neutron.agent.dhcp.agent [None req-9d3ca156-68cd-4418-8aa3-b2be856ec201 - - - - - -] DHCP configuration for ports {'43d95ae3-08e7-4c09-9343-0e4b55bea93c'} is completed
Feb 1 04:54:52 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:52.514 2 INFO neutron.agent.securitygroups_rpc [None req-194b0ee6-ff55-463b-b49c-e9e305c5f2ea 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:54:53 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:53.266 2 INFO neutron.agent.securitygroups_rpc [None req-40a7cf1e-2b3d-4cda-aef2-d6b58ad042f7 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:54:53 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:53.749 2 INFO neutron.agent.securitygroups_rpc [None req-e7666148-591c-4a9b-983e-c90f12ec30cc 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:54:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:53.829 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:51Z, description=, device_id=68d684c2-2d8e-49d4-b723-69387fb1e9d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=43d95ae3-08e7-4c09-9343-0e4b55bea93c, ip_allocation=immediate, mac_address=fa:16:3e:ae:27:4f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:46Z, description=, dns_domain=, id=8c8faa8a-12c5-4564-a503-67f1cd5faeef, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--555358662, port_security_enabled=True, project_id=b3e5e9f4ac99471688f0279d307f2650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38523, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1322, status=ACTIVE, subnets=['889634cc-93ab-4c05-8753-5cd031fa71bb'], tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:54:49Z, vlan_transparent=None, network_id=8c8faa8a-12c5-4564-a503-67f1cd5faeef, port_security_enabled=False, project_id=b3e5e9f4ac99471688f0279d307f2650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1374, status=DOWN, tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:54:51Z on network 8c8faa8a-12c5-4564-a503-67f1cd5faeef
Feb 1 04:54:53 localhost podman[236886]: time="2026-02-01T09:54:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:54:53 localhost
podman[236886]: @ - - [01/Feb/2026:09:54:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160177 "" "Go-http-client/1.1" Feb 1 04:54:54 localhost podman[309145]: 2026-02-01 09:54:54.041161148 +0000 UTC m=+0.107598478 container kill 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:54:54 localhost dnsmasq[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/addn_hosts - 1 addresses Feb 1 04:54:54 localhost systemd[1]: tmp-crun.5lLaNf.mount: Deactivated successfully. Feb 1 04:54:54 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/host Feb 1 04:54:54 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/opts Feb 1 04:54:54 localhost podman[236886]: @ - - [01/Feb/2026:09:54:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19775 "" "Go-http-client/1.1" Feb 1 04:54:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:54.253 259320 INFO neutron.agent.dhcp.agent [None req-ff518679-4a6f-497b-8728-b9dbcd550900 - - - - - -] DHCP configuration for ports {'43d95ae3-08e7-4c09-9343-0e4b55bea93c'} is completed#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.293 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 
04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.295 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.295 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.366 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.538 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:54.705 2 INFO neutron.agent.securitygroups_rpc [req-a8ee260e-a71a-4341-959a-47320de8959d req-f1ea5bc8-d304-408a-99f9-104affd65e7e ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group member updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']#033[00m Feb 1 04:54:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:54:54 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2253095091' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:54:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:54.731 2 INFO neutron.agent.securitygroups_rpc [None req-7f906aa9-468b-48e8-aab4-90305509c943 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.741 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.809 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.810 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:54:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:54.927 259320 INFO neutron.agent.linux.ip_lib [None req-3117fe47-f948-4278-93b0-ee6eb55208b1 - - - - - -] Device tapfb6ad6ad-cf cannot be used as it has no MAC address#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.956 274655 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:54 localhost kernel: device tapfb6ad6ad-cf entered promiscuous mode Feb 1 04:54:54 localhost NetworkManager[5964]: [1769939694.9641] manager: (tapfb6ad6ad-cf): new Generic device (/org/freedesktop/NetworkManager/Devices/29) Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:54 localhost systemd-udevd[309220]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:54:54 localhost ovn_controller[152492]: 2026-02-01T09:54:54Z|00144|binding|INFO|Claiming lport fb6ad6ad-cf1f-46b0-af98-884a65e507fd for this chassis. Feb 1 04:54:54 localhost ovn_controller[152492]: 2026-02-01T09:54:54Z|00145|binding|INFO|fb6ad6ad-cf1f-46b0-af98-884a65e507fd: Claiming unknown Feb 1 04:54:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:54.976 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-81fbe19e-7eac-486e-8c87-3782dd1d1fd0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81fbe19e-7eac-486e-8c87-3782dd1d1fd0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cc24d83-8ee3-46de-b62f-4da03ddce516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fb6ad6ad-cf1f-46b0-af98-884a65e507fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:54:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:54.977 158365 INFO neutron.agent.ovn.metadata.agent [-] Port fb6ad6ad-cf1f-46b0-af98-884a65e507fd in datapath 81fbe19e-7eac-486e-8c87-3782dd1d1fd0 bound to our chassis#033[00m Feb 1 04:54:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:54.978 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 81fbe19e-7eac-486e-8c87-3782dd1d1fd0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:54:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:54.979 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[3a12b035-2435-4872-bda5-0746054eb85f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:54:54 localhost ovn_controller[152492]: 2026-02-01T09:54:54Z|00146|binding|INFO|Setting lport fb6ad6ad-cf1f-46b0-af98-884a65e507fd ovn-installed in OVS Feb 1 04:54:54 localhost ovn_controller[152492]: 
2026-02-01T09:54:54Z|00147|binding|INFO|Setting lport fb6ad6ad-cf1f-46b0-af98-884a65e507fd up in Southbound Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.986 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:54 localhost nova_compute[274651]: 2026-02-01 09:54:54.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.000 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:55 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 1 addresses Feb 1 04:54:55 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host Feb 1 04:54:55 localhost podman[309212]: 2026-02-01 09:54:55.018346663 +0000 UTC m=+0.062225060 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Feb 1 04:54:55 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.035 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:55 localhost systemd[1]: tmp-crun.YBv7lF.mount: Deactivated successfully. Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.066 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.099 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.100 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11299MB free_disk=41.700313568115234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.100 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.101 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.163 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.163 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.163 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.201 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:54:55 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:55.534 2 INFO neutron.agent.securitygroups_rpc [None req-bc75b288-28fa-41e6-8b23-683fd10099a8 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']#033[00m Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.608967) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695609041, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 780, "num_deletes": 255, "total_data_size": 654578, "memory_usage": 669088, "flush_reason": "Manual Compaction"} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695613758, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 634977, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28041, "largest_seqno": 28820, "table_properties": {"data_size": 631278, "index_size": 1490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9337, "raw_average_key_size": 20, "raw_value_size": 623480, "raw_average_value_size": 1379, "num_data_blocks": 65, "num_entries": 452, "num_filter_entries": 452, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939656, "oldest_key_time": 1769939656, "file_creation_time": 1769939695, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 4790 microseconds, and 1612 cpu microseconds. Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.613785) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 634977 bytes OK Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.613798) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.616319) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.616330) EVENT_LOG_v1 {"time_micros": 1769939695616326, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.616345) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 650579, prev total WAL file size 650579, number of live WAL files 2. Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.617030) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. 
'7061786F73003131373938' seq:0, type:0; will stop at (end) Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(620KB)], [48(22MB)] Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695617117, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 23962528, "oldest_snapshot_seqno": -1} Feb 1 04:54:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:54:55 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/552575308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.652 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.657 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.669 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.687 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:54:55 localhost nova_compute[274651]: 2026-02-01 09:54:55.687 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 12444 keys, 21349367 bytes, temperature: kUnknown Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695719294, "cf_name": "default", "job": 28, "event": "table_file_creation", 
"file_number": 51, "file_size": 21349367, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21276619, "index_size": 40519, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31173, "raw_key_size": 335614, "raw_average_key_size": 26, "raw_value_size": 21062764, "raw_average_value_size": 1692, "num_data_blocks": 1533, "num_entries": 12444, "num_filter_entries": 12444, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939695, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.719697) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 21349367 bytes Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.721212) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 234.4 rd, 208.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 22.2 +0.0 blob) out(20.4 +0.0 blob), read-write-amplify(71.4) write-amplify(33.6) OK, records in: 12972, records dropped: 528 output_compression: NoCompression Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.721244) EVENT_LOG_v1 {"time_micros": 1769939695721231, "job": 28, "event": "compaction_finished", "compaction_time_micros": 102250, "compaction_time_cpu_micros": 56214, "output_level": 6, "num_output_files": 1, "total_output_size": 21349367, "num_input_records": 12972, "num_output_records": 12444, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695721502, "job": 28, "event": "table_file_deletion", "file_number": 50} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695724828, "job": 
28, "event": "table_file_deletion", "file_number": 48} Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.616885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.725010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.725030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.725034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.725036) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:55 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:55.725038) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:55 localhost podman[309311]: Feb 1 04:54:55 localhost podman[309311]: 2026-02-01 09:54:55.90256114 +0000 UTC m=+0.079612956 container create 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:54:55 localhost systemd[1]: Started libpod-conmon-77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64.scope. Feb 1 04:54:55 localhost podman[309311]: 2026-02-01 09:54:55.853360382 +0000 UTC m=+0.030412258 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:54:55 localhost systemd[1]: tmp-crun.jYkpHv.mount: Deactivated successfully. Feb 1 04:54:55 localhost systemd[1]: Started libcrun container. 
Feb 1 04:54:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5ded09ea65267f5efd95e9f0a38da1c623d4fd245a9f88a92aee1526f5b7f8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:54:55 localhost podman[309311]: 2026-02-01 09:54:55.98491918 +0000 UTC m=+0.161970986 container init 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:54:55 localhost podman[309311]: 2026-02-01 09:54:55.993894956 +0000 UTC m=+0.170946762 container start 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 1 04:54:55 localhost dnsmasq[309329]: started, version 2.85 cachesize 150 Feb 1 04:54:55 localhost dnsmasq[309329]: DNS service limited to local subnets Feb 1 04:54:55 localhost dnsmasq[309329]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:54:55 localhost dnsmasq[309329]: warning: no upstream servers configured Feb 1 04:54:55 localhost dnsmasq-dhcp[309329]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Feb 1 04:54:56 localhost dnsmasq[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/addn_hosts - 0 addresses Feb 1 04:54:56 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/host Feb 1 04:54:56 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/opts Feb 1 04:54:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:54:56.117 259320 INFO neutron.agent.dhcp.agent [None req-ab81cda6-b6f9-4f8f-be33-138a7cf09c57 - - - - - -] DHCP configuration for ports {'8c048379-9594-44b9-b7ba-1a27a86fbd82'} is completed#033[00m Feb 1 04:54:56 localhost nova_compute[274651]: 2026-02-01 09:54:56.376 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:54:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:54:56 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:56.569 2 INFO neutron.agent.securitygroups_rpc [None req-56483b67-01a0-4213-8f99-ef04c4ba0846 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']#033[00m Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New 
memtable created with log file: #52. Immutable memtables: 0. Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.701468) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696701551, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 256, "num_deletes": 251, "total_data_size": 14389, "memory_usage": 19960, "flush_reason": "Manual Compaction"} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696704315, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 13883, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28821, "largest_seqno": 29076, "table_properties": {"data_size": 12142, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 5145, "raw_average_key_size": 20, "raw_value_size": 8731, "raw_average_value_size": 34, "num_data_blocks": 2, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939696, "oldest_key_time": 1769939696, "file_creation_time": 1769939696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 2914 microseconds, and 1081 cpu microseconds. Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.704372) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 13883 bytes OK Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.704412) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.706176) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.706199) EVENT_LOG_v1 {"time_micros": 1769939696706193, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.706221) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 12370, prev total WAL file size 12370, number of live WAL files 2. Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.706967) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373539' seq:72057594037927935, type:22 .. '6D6772737461740034303131' seq:0, type:0; will stop at (end) Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(13KB)], [51(20MB)] Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696707289, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 21363250, "oldest_snapshot_seqno": -1} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12194 keys, 19116594 bytes, temperature: kUnknown Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696776313, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 19116594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19050460, "index_size": 34545, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30533, "raw_key_size": 330613, "raw_average_key_size": 27, "raw_value_size": 18845856, "raw_average_value_size": 1545, "num_data_blocks": 1283, "num_entries": 12194, "num_filter_entries": 12194, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.776812) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 19116594 bytes Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.778918) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 309.1 rd, 276.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 20.4 +0.0 blob) out(18.2 +0.0 blob), read-write-amplify(2915.8) write-amplify(1377.0) OK, records in: 12700, records dropped: 506 output_compression: NoCompression Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.778953) EVENT_LOG_v1 {"time_micros": 1769939696778937, "job": 30, "event": "compaction_finished", "compaction_time_micros": 69105, "compaction_time_cpu_micros": 27726, "output_level": 6, "num_output_files": 1, "total_output_size": 19116594, "num_input_records": 12700, "num_output_records": 12194, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696779174, "job": 30, "event": "table_file_deletion", "file_number": 53} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696782776, "job": 30, "event": "table_file_deletion", "file_number": 51} Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.706879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.783020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.783035) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.783039) 
[db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.783042) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:56 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:54:56.783045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 04:54:57 localhost ovn_controller[152492]: 2026-02-01T09:54:57Z|00148|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:57 localhost nova_compute[274651]: 2026-02-01 09:54:57.277 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:57 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:57.717 2 INFO neutron.agent.securitygroups_rpc [None req-fccfae55-19b6-4730-a30c-150ca6a4b95d 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:54:57 localhost dnsmasq[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/addn_hosts - 0 addresses
Feb 1 04:54:57 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/host
Feb 1 04:54:57 localhost dnsmasq-dhcp[309086]: read /var/lib/neutron/dhcp/8c8faa8a-12c5-4564-a503-67f1cd5faeef/opts
Feb 1 04:54:57 localhost podman[309347]: 2026-02-01 09:54:57.870394523 +0000 UTC m=+0.069247157 container kill 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 1 04:54:58 localhost ovn_controller[152492]: 2026-02-01T09:54:58Z|00149|binding|INFO|Releasing lport 10657996-a59f-4b72-8f5c-91e52ba25fa3 from this chassis (sb_readonly=0)
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.084 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:58 localhost ovn_controller[152492]: 2026-02-01T09:54:58Z|00150|binding|INFO|Setting lport 10657996-a59f-4b72-8f5c-91e52ba25fa3 down in Southbound
Feb 1 04:54:58 localhost kernel: device tap10657996-a5 left promiscuous mode
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.093 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-8c8faa8a-12c5-4564-a503-67f1cd5faeef', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c8faa8a-12c5-4564-a503-67f1cd5faeef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3e5e9f4ac99471688f0279d307f2650', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6693a8e9-b58e-4136-abca-b8e21a6ccf20, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10657996-a59f-4b72-8f5c-91e52ba25fa3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.095 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 10657996-a59f-4b72-8f5c-91e52ba25fa3 in datapath 8c8faa8a-12c5-4564-a503-67f1cd5faeef unbound from our chassis
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.096 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8c8faa8a-12c5-4564-a503-67f1cd5faeef or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.097 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[000e38b5-dcf2-45dc-b971-0d98851315c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.110 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:58 localhost ovn_controller[152492]: 2026-02-01T09:54:58Z|00151|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.265 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:58 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:58.459 2 INFO neutron.agent.securitygroups_rpc [None req-91b7ca55-4f70-425c-a0e2-0dabc032162c 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.689 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.690 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:54:58 localhost dnsmasq[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/addn_hosts - 0 addresses
Feb 1 04:54:58 localhost podman[309389]: 2026-02-01 09:54:58.715102202 +0000 UTC m=+0.064318564 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:54:58 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/host
Feb 1 04:54:58 localhost dnsmasq-dhcp[307823]: read /var/lib/neutron/dhcp/5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da/opts
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.881 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:58 localhost ovn_controller[152492]: 2026-02-01T09:54:58Z|00152|binding|INFO|Releasing lport ac0ff582-773f-444a-b9f8-7ad2e4c9a959 from this chassis (sb_readonly=0)
Feb 1 04:54:58 localhost ovn_controller[152492]: 2026-02-01T09:54:58Z|00153|binding|INFO|Setting lport ac0ff582-773f-444a-b9f8-7ad2e4c9a959 down in Southbound
Feb 1 04:54:58 localhost kernel: device tapac0ff582-77 left promiscuous mode
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.891 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9bbefd3c06294b7fa7720ba6ca48fa4b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3953cc01-ee9b-4241-8aee-2a63e36d4fe2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ac0ff582-773f-444a-b9f8-7ad2e4c9a959) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.893 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ac0ff582-773f-444a-b9f8-7ad2e4c9a959 in datapath 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da unbound from our chassis
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.897 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:54:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:54:58.899 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[549a2134-1945-4b5f-91bd-7ae7aaff31ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:54:58 localhost nova_compute[274651]: 2026-02-01 09:54:58.901 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:59 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:59.110 2 INFO neutron.agent.securitygroups_rpc [None req-e31219af-a122-457f-883b-2593c5b9c745 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:54:59 localhost nova_compute[274651]: 2026-02-01 09:54:59.581 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:54:59 localhost neutron_sriov_agent[252126]: 2026-02-01 09:54:59.628 2 INFO neutron.agent.securitygroups_rpc [None req-72904d6f-3e0e-4e9a-a3b1-a7457a722d24 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 1 04:55:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:55:00 localhost podman[309412]: 2026-02-01 09:55:00.722757592 +0000 UTC m=+0.084357862 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:55:00 localhost podman[309412]: 2026-02-01 09:55:00.731200182 +0000 UTC m=+0.092800512 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 1 04:55:00 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:55:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:00.748 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:00Z, description=, device_id=42b688b0-4c84-4fa7-8d5b-06392b34bb1a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=564499b5-aa19-426d-a4ee-89e026082a92, ip_allocation=immediate, mac_address=fa:16:3e:bd:40:c3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:51Z, description=, dns_domain=, id=81fbe19e-7eac-486e-8c87-3782dd1d1fd0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1356473311, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9990, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1370, status=ACTIVE, subnets=['d598145e-105e-402a-ab77-39907e40818f'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:54:53Z, vlan_transparent=None, network_id=81fbe19e-7eac-486e-8c87-3782dd1d1fd0, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1433, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:00Z on network 81fbe19e-7eac-486e-8c87-3782dd1d1fd0
Feb 1 04:55:00 localhost dnsmasq[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/addn_hosts - 1 addresses
Feb 1 04:55:00 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/host
Feb 1 04:55:00 localhost podman[309451]: 2026-02-01 09:55:00.927797115 +0000 UTC m=+0.068926406 container kill 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:55:00 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/opts
Feb 1 04:55:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:00.982 259320 INFO neutron.agent.linux.ip_lib [None req-62bacaa8-cb39-4f15-a451-d5746893ad18 - - - - - -] Device tapfd17d6a1-28 cannot be used as it has no MAC address
Feb 1 04:55:01 localhost nova_compute[274651]: 2026-02-01 09:55:01.041 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:01 localhost kernel: device tapfd17d6a1-28 entered promiscuous mode
Feb 1 04:55:01 localhost ovn_controller[152492]: 2026-02-01T09:55:01Z|00154|binding|INFO|Claiming lport fd17d6a1-286e-427c-a2ca-5c2026115837 for this chassis.
Feb 1 04:55:01 localhost NetworkManager[5964]: [1769939701.0513] manager: (tapfd17d6a1-28): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Feb 1 04:55:01 localhost nova_compute[274651]: 2026-02-01 09:55:01.051 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:01 localhost ovn_controller[152492]: 2026-02-01T09:55:01Z|00155|binding|INFO|fd17d6a1-286e-427c-a2ca-5c2026115837: Claiming unknown
Feb 1 04:55:01 localhost systemd-udevd[309480]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:55:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:01.074 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-aef4b48a-06a6-4982-a4fe-b9673d5748eb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef4b48a-06a6-4982-a4fe-b9673d5748eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3e5e9f4ac99471688f0279d307f2650', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4a73283-80c7-4d8e-b9ea-708ee54545e1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fd17d6a1-286e-427c-a2ca-5c2026115837) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:01.077 158365 INFO neutron.agent.ovn.metadata.agent [-] Port fd17d6a1-286e-427c-a2ca-5c2026115837 in datapath aef4b48a-06a6-4982-a4fe-b9673d5748eb bound to our chassis
Feb 1 04:55:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:01.079 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network aef4b48a-06a6-4982-a4fe-b9673d5748eb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:55:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:01.080 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ee42f183-a0e8-4ca7-8b4c-d5adf8f46ab4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:55:01 localhost ovn_controller[152492]: 2026-02-01T09:55:01Z|00156|binding|INFO|Setting lport fd17d6a1-286e-427c-a2ca-5c2026115837 ovn-installed in OVS
Feb 1 04:55:01 localhost ovn_controller[152492]: 2026-02-01T09:55:01Z|00157|binding|INFO|Setting lport fd17d6a1-286e-427c-a2ca-5c2026115837 up in Southbound
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost nova_compute[274651]: 2026-02-01 09:55:01.082 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:01.115 2 INFO neutron.agent.securitygroups_rpc [None req-866dbbe7-cd9a-459b-9d5c-70660b94e103 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['e4f60f26-54df-4f21-8c82-cc76833023ab']
Feb 1 04:55:01 localhost journal[217584]: ethtool ioctl error on tapfd17d6a1-28: No such device
Feb 1 04:55:01 localhost nova_compute[274651]: 2026-02-01 09:55:01.128 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:01 localhost nova_compute[274651]: 2026-02-01 09:55:01.154 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:01 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:01.182 259320 INFO neutron.agent.dhcp.agent [None req-c14ab658-954c-4810-8812-2ed3da3dcca2 - - - - - -] DHCP configuration for ports {'564499b5-aa19-426d-a4ee-89e026082a92'} is completed
Feb 1 04:55:01 localhost ovn_controller[152492]: 2026-02-01T09:55:01Z|00158|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:55:01 localhost nova_compute[274651]: 2026-02-01 09:55:01.367 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:01 localhost openstack_network_exporter[239441]: ERROR 09:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:55:01 localhost openstack_network_exporter[239441]:
Feb 1 04:55:01 localhost openstack_network_exporter[239441]: ERROR 09:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:55:01 localhost openstack_network_exporter[239441]:
Feb 1 04:55:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:55:01 localhost podman[309552]:
Feb 1 04:55:01 localhost podman[309552]: 2026-02-01 09:55:01.988228567 +0000 UTC m=+0.087693535 container create a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aef4b48a-06a6-4982-a4fe-b9673d5748eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:55:02 localhost systemd[1]: Started libpod-conmon-a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694.scope.
Feb 1 04:55:02 localhost podman[309552]: 2026-02-01 09:55:01.946781758 +0000 UTC m=+0.046246776 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:55:02 localhost systemd[1]: Started libcrun container.
Feb 1 04:55:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d26b0b91625ee74a9e37631b18b408aa8bfb71ced56382d4f9a13a8e31ed29a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:55:02 localhost podman[309552]: 2026-02-01 09:55:02.068837633 +0000 UTC m=+0.168302631 container init a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aef4b48a-06a6-4982-a4fe-b9673d5748eb, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:55:02 localhost podman[309552]: 2026-02-01 09:55:02.07850392 +0000 UTC m=+0.177968908 container start a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aef4b48a-06a6-4982-a4fe-b9673d5748eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:55:02 localhost dnsmasq[309570]: started, version 2.85 cachesize 150
Feb 1 04:55:02 localhost dnsmasq[309570]: DNS service limited to local subnets
Feb 1 04:55:02 localhost dnsmasq[309570]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:55:02 localhost dnsmasq[309570]: warning: no upstream servers configured
Feb 1 04:55:02 localhost dnsmasq-dhcp[309570]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Feb 1 04:55:02 localhost dnsmasq[309570]: read /var/lib/neutron/dhcp/aef4b48a-06a6-4982-a4fe-b9673d5748eb/addn_hosts - 0 addresses
Feb 1 04:55:02 localhost dnsmasq-dhcp[309570]: read /var/lib/neutron/dhcp/aef4b48a-06a6-4982-a4fe-b9673d5748eb/host
Feb 1 04:55:02 localhost dnsmasq-dhcp[309570]: read /var/lib/neutron/dhcp/aef4b48a-06a6-4982-a4fe-b9673d5748eb/opts
Feb 1 04:55:02 localhost dnsmasq[307823]: exiting on receipt of SIGTERM
Feb 1 04:55:02 localhost podman[309588]: 2026-02-01 09:55:02.313204398 +0000 UTC m=+0.056134992 container kill 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:55:02 localhost systemd[1]: libpod-7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c.scope: Deactivated successfully.
Feb 1 04:55:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:02.330 259320 INFO neutron.agent.dhcp.agent [None req-43d4c1fe-a1aa-404e-a9ba-273cc82c9623 - - - - - -] DHCP configuration for ports {'b6fafc94-8054-4403-a78c-6ae6df857649'} is completed
Feb 1 04:55:02 localhost podman[309608]: 2026-02-01 09:55:02.398443316 +0000 UTC m=+0.057467002 container died 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 1 04:55:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:02.461 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:00Z, description=, device_id=42b688b0-4c84-4fa7-8d5b-06392b34bb1a, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=564499b5-aa19-426d-a4ee-89e026082a92, ip_allocation=immediate, mac_address=fa:16:3e:bd:40:c3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:51Z, description=, dns_domain=, id=81fbe19e-7eac-486e-8c87-3782dd1d1fd0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1356473311, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9990, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1370, status=ACTIVE, subnets=['d598145e-105e-402a-ab77-39907e40818f'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:54:53Z, vlan_transparent=None, network_id=81fbe19e-7eac-486e-8c87-3782dd1d1fd0, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1433, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:00Z on network 81fbe19e-7eac-486e-8c87-3782dd1d1fd0
Feb 1 04:55:02 localhost podman[309608]: 2026-02-01 09:55:02.490396682 +0000 UTC m=+0.149420318 container remove 7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5da8c8ac-4ba7-4170-8dfe-e77ecb30b7da, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:55:02 localhost systemd[1]: libpod-conmon-7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c.scope: Deactivated successfully.
Feb 1 04:55:02 localhost dnsmasq[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/addn_hosts - 1 addresses
Feb 1 04:55:02 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/host
Feb 1 04:55:02 localhost podman[309648]: 2026-02-01 09:55:02.64210809 +0000 UTC m=+0.047517436 container kill 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:55:02 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/opts
Feb 1 04:55:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:02.724 259320 INFO neutron.agent.dhcp.agent [None req-1240a1e6-fdfe-4ea0-8c42-286adb294f6f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:55:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:02.884 259320 INFO neutron.agent.dhcp.agent [None req-4e13ad3b-3bbf-49c5-82fb-489bfad8559c - - - - - -] DHCP configuration for ports {'564499b5-aa19-426d-a4ee-89e026082a92'} is completed
Feb 1 04:55:02 localhost systemd[1]: var-lib-containers-storage-overlay-c7616f1935656f59b7721f88f7d77097d4a6e73c9a76d7c99371f9fc319b6e6c-merged.mount: Deactivated successfully.
Feb 1 04:55:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b0b970f9c54f266f139d5c43422c4b1caf6c25537c8f30fbd197a10fe37be3c-userdata-shm.mount: Deactivated successfully.
Feb 1 04:55:02 localhost systemd[1]: run-netns-qdhcp\x2d5da8c8ac\x2d4ba7\x2d4170\x2d8dfe\x2de77ecb30b7da.mount: Deactivated successfully.
Feb 1 04:55:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:03.193 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:55:03 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:03.357 2 INFO neutron.agent.securitygroups_rpc [None req-d4cc719f-d42b-446b-935d-536d497f9b87 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.529 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.534 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc48518b-83cc-41af-9e98-d10910319ce8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.530414', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '148929e8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '542267020eca26857d572a03e0f7fd6457fda68754fa6616c3739e43e9570161'}]}, 'timestamp': '2026-02-01 09:55:03.535792', '_unique_id': 'c5cb1a46af7d4ab7af94a9415e753d87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.537 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.549 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.550 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '790832f6-54e0-4a7c-bb84-a87105561029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.538823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '148b6f1e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.758290954, 'message_signature': 'cdedf8244b849e172f381ba6f14369ded3d403dcf1798fc0acec091daaed25c5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.538823', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '148b827e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.758290954, 'message_signature': '9c6a57a45ff62fd0dfc9b52004abeeea0cc183889d117a63908e3c06599843f9'}]}, 'timestamp': '2026-02-01 09:55:03.551103', '_unique_id': '49b2c5e0c15149ba8a683a5b989c88e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.552 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.553 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a39ce13-3bd6-46b7-8f89-ba7b11f04644', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.553886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '14902144-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': 'df2df447672cff0cdab89643e754874ad1b2ed82263273dd48fa75ef670b06ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.553886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '14903e04-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '4130fd6db3c6e6b050494a470d3cb88f96f15b5aa1f187f0a7855782dff1a952'}]}, 'timestamp': '2026-02-01 09:55:03.582112', '_unique_id': '0fefc9bb5c504269ab5332798538cdfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py",
line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.583 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.585 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.585 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '57fe2eab-ae84-4aae-b67f-02e650ef28dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.585256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1490cefa-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.758290954, 'message_signature': 'a7e4f5951f8121d3b773b86d35a68ec5dde3caa73799f45487d73e91c4821908'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.585256', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1490e296-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.758290954, 'message_signature': '9bb064b19560bcacee2303ede92f64a09a9c7e32daba4d26859b616eb3cdd565'}]}, 'timestamp': '2026-02-01 09:55:03.586317', '_unique_id': '5e9e4dad996741969687198a7b75ccd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.589 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '05f7bc70-590f-4331-8631-c742ffd27d59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.589177', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '14916770-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '3405d2966e145f8f072c06a4727a8161f0617bc6b44fc5a95f6d0442134987a6'}]}, 'timestamp': '2026-02-01 09:55:03.589750', '_unique_id': 'ff76999507e148759d9156a636427e57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.592 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.592 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ef24d908-c081-42e2-8c9a-8716c3bc5079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.592215', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1491de4e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': 'e58f2ff366f18386a6cad47a4e3b45397551ca43160700ed0ea7ea91dd068f18'}]}, 'timestamp': '2026-02-01 09:55:03.592775', '_unique_id': '03c5ee0ddad44e0d9d192dadc67f2c4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.595 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '45224d7a-7816-4655-8680-0418940c34bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:55:03.595363', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1494a5e8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.829475859, 'message_signature': '519dc0602a63e36cbd95cbf62083f905b56a259e348ab0487453dbe357a4a2c9'}]}, 'timestamp': '2026-02-01 09:55:03.611370', '_unique_id': '2fb937c006c240ccb01fbdb2a69a47a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:55:03.612 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost systemd-journald[47041]: Data hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation. Feb 1 04:55:03 localhost systemd-journald[47041]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating. 
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.614 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.614 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.614 12 DEBUG 
ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebb360b9-d332-4fed-b051-222161ae8628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.614798', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '14954ff2-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '1d01ad113dfd270cc118392de47365f435ea3cb9d2497f0d23906a1b6da4be16'}]}, 'timestamp': '2026-02-01 09:55:03.615348', '_unique_id': 'fdcf6565eda24bce81ce6e4a4e74cd65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 
1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.619 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7a77f50c-478b-4319-9ec8-6af9c92f8e11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.618967', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '1495f4de-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '05c40920248f290882a5a22524b910f8e63c2ba931ce804b1b30d20058e0a600'}]}, 'timestamp': '2026-02-01 09:55:03.619551', '_unique_id': 'beb6da2439204159a346ee2967b7823d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.620 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.621 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3911c76c-d3b7-4ddc-85f8-0821981be9d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.622092', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '14966cb6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '192a82866f49f17986e6a288c569b47744f02ab83aa78f8a268b185524fb6da2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.622092', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '149683ea-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': 'aea68905079e6cb568cc905d2b0af8104f85db614531e3cffdfca9050e99f1df'}]}, 'timestamp': '2026-02-01 09:55:03.623243', '_unique_id': 'ec2d1b2459594676a152975e1df757e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.624 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.625 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.625 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '944ae851-c90d-46ce-9c02-d3811bb234c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.625695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1496f834-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.758290954, 'message_signature': '339e3c73be38094316dd79681dbc5e8d66ea0cefd4835116dee8c69eef294d4b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.625695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '14970a86-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.758290954, 'message_signature': 'ab5a5cddf24d7f4dbccbfcfe8904b55df1a2254c11f7664945b752fc30580e68'}]}, 'timestamp': '2026-02-01 09:55:03.626890', '_unique_id': 'bd777311af354145a59a31f85147a1bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.627 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.629 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60a12de0-04e6-45fa-9739-84aead122efd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.629292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '149785f6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '5e340d0d2498452facf24e82fb6074a7578d8404f6fe72c9d2bdb6be76dc82ee'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.629292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '149798e8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '9b8ef3cb2189d0a82e89f4028e65b6f918131fc490e20108738c73ddce521941'}]}, 'timestamp': '2026-02-01 09:55:03.630268', '_unique_id': '12c85594ae8d47e8a6e36a14f8f1a72d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
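The chained traceback above is kombu re-raising the socket-level ConnectionRefusedError as kombu.exceptions.OperationalError once its retry loop gives up. A minimal sketch that reproduces the same exception outside Ceilometer, assuming an AMQP URL with nothing listening on the far end (the broker URL below is hypothetical, not taken from this host's configuration):

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Hypothetical broker URL; assumes no AMQP listener on localhost:5672,
    # so the underlying TCP connect fails with ECONNREFUSED (errno 111).
    conn = Connection('amqp://guest:guest@localhost:5672//')
    try:
        # ensure_connection() retries, then re-raises the transport error
        # as kombu.exceptions.OperationalError, as seen in the log above.
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print(exc)  # [Errno 111] Connection refused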
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65c3183c-74ce-4522-9dcf-fb0019538f44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.632501', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '149801fc-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '64298c15b959aeacdf95d5ae4fb1364a401d9f8db7ee308e2e43061a035711f8'}]}, 'timestamp': '2026-02-01 09:55:03.633012', '_unique_id': 'c91de6fe2f15451ab3b5626cc62cbc69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7eeedfa-fd4e-489e-82bb-6d6adf546a6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.635312', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '14986f52-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '48f530e88b36465b20beffb97b8852b41806e7be4eb271f87424e5f8d5859944'}]}, 'timestamp': '2026-02-01 09:55:03.635809', '_unique_id': '07e6536cf9614d24bf7428b8a9c706fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.637 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
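Each "Could not send notification" record above is Ceilometer's notifier publishing a telemetry.polling sample through oslo.messaging to the notifications topic. A rough sketch of that publishing path, with an illustrative transport URL (the real URL comes from the service configuration, not from this log):

    import oslo_messaging
    from oslo_config import cfg

    # Illustrative transport URL; in the deployment this is read from
    # ceilometer.conf, and a refused TCP connect to it produces the
    # OperationalError records seen here.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url='rabbit://guest:guest@localhost:5672/')
    notifier = oslo_messaging.Notifier(
        transport, publisher_id='ceilometer.polling',
        driver='messaging', topics=['notifications'])

    # sample() publishes at priority SAMPLE, matching the
    # 'priority': 'SAMPLE' field in the logged payloads.
    notifier.sample({}, event_type='telemetry.polling',
                    payload={'samples': []})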
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64407837-8907-4dc3-a020-319bc95031c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.638087', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1498dc80-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': 'a1c7c7b32ceb78e8260d379164a983da07cc4c0e00760f544014ae61516d51d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.638087', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1498ed42-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': 'b98a889b3d70079fc37a4baa61d762eedbfda8a90b64c8dd474d7b09efe170a8'}]}, 'timestamp': '2026-02-01 09:55:03.639036', '_unique_id': '258f00b1872c48129cace5b7c2013067'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.641 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Payload={'message_id': '811eec9c-d80e-4b9d-8f32-a104357c67ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.641407', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '14995f5c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': 'e992255ddf974f61bc7bb9bd5c94e71a27c0281826a299c0b506bd99868a5ef2'}]}, 'timestamp': '2026-02-01 09:55:03.641931', '_unique_id': '9f88110249214ab494d0d41d3910d3fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.642 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.644 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.644 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.644 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c6f9f98-7264-4700-8a54-c5400c4bb9bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.644213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1499cdac-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': 'a1b0cc47319681b23f1102e67e4a7853f4e20ff0a55a2121ff3c3b8c15b1bac8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.644213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1499de82-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '6519d73fb266e1facd8e36ae055bd2363787a51f095c8e2d780fa9461ae8acaf'}]}, 'timestamp': '2026-02-01 09:55:03.645211', '_unique_id': '5ca8ed74aac14989b5cfe114099061a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.646 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.647 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.647 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c5099d3-19a6-4ff1-bed0-b9c731420f2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.647149', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '149a3d8c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '49a34e9e74cc853c5495ff312a1d91b6a034229060e52d7b58336ee21425dec9'}]}, 'timestamp': '2026-02-01 09:55:03.647602', '_unique_id': '8408903f163d4e6c999738adca4dff6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.648 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.649 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.649 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.650 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01c0c25d-10bd-4657-a958-5961ca2a31e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:55:03.649901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '149aacfe-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '7ddb641817690206558fe28dd7bdee34f43f6290ac8aa152b0d71731ce3d0e7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:55:03.649901', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '149abdac-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.773350038, 'message_signature': '745f9fe7b6704b4555c2efeb2e14c4146bbf254284dad1a789e27d0fdd7206a5'}]}, 'timestamp': '2026-02-01 09:55:03.650868', '_unique_id': 'ffd1e9d0ab164928951c388a6fe6603b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.651 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.652 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ace7b973-7e0d-4370-a56f-726998fe8e77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:55:03.652690', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '149b137e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.749858904, 'message_signature': '31738fab11dac7a6d6f0c49064abfbf26d810961fd6cfeec2a039520c2797f00'}]}, 'timestamp': '2026-02-01 09:55:03.653082', '_unique_id': 'af0c0b08741a457daa35904d1a745190'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR 
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.653 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.654 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.654 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 15610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e72bb980-6eac-4a9a-924b-e8e06beac292', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15610000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:55:03.654737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '149b632e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11797.829475859, 'message_signature': 'acb5190dcb116d13d70a4bd26ff4054cd52bd7233cf94f2ba29b673fd8b343a1'}]}, 'timestamp': '2026-02-01 09:55:03.655072', '_unique_id': '9071d827dd084c2e859095fcb0eb366e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:55:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:55:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:55:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:55:03 localhost podman[309670]: 2026-02-01 09:55:03.710129636 +0000 UTC m=+0.069574816 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Feb 1 04:55:03 localhost podman[309670]: 2026-02-01 09:55:03.745311211 +0000 UTC m=+0.104756411 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 04:55:03 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:55:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:03.793 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:03 localhost dnsmasq[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/addn_hosts - 0 addresses Feb 1 04:55:03 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/host Feb 1 04:55:03 localhost podman[309705]: 2026-02-01 09:55:03.990043748 +0000 UTC m=+0.058888187 container kill 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:55:03 localhost dnsmasq-dhcp[309329]: read /var/lib/neutron/dhcp/81fbe19e-7eac-486e-8c87-3782dd1d1fd0/opts Feb 1 04:55:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:04.029 2 INFO neutron.agent.securitygroups_rpc [None req-a6c0da30-1e7d-4fdc-b34f-c8211b005180 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['dcd86290-3678-4dc4-8595-e876b5745966']#033[00m Feb 1 04:55:04 localhost nova_compute[274651]: 2026-02-01 09:55:04.156 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:04 localhost kernel: device tapfb6ad6ad-cf left promiscuous mode Feb 1 04:55:04 localhost ovn_controller[152492]: 2026-02-01T09:55:04Z|00159|binding|INFO|Releasing lport fb6ad6ad-cf1f-46b0-af98-884a65e507fd from this 
chassis (sb_readonly=0) Feb 1 04:55:04 localhost ovn_controller[152492]: 2026-02-01T09:55:04Z|00160|binding|INFO|Setting lport fb6ad6ad-cf1f-46b0-af98-884a65e507fd down in Southbound Feb 1 04:55:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:04.164 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-81fbe19e-7eac-486e-8c87-3782dd1d1fd0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-81fbe19e-7eac-486e-8c87-3782dd1d1fd0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cc24d83-8ee3-46de-b62f-4da03ddce516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fb6ad6ad-cf1f-46b0-af98-884a65e507fd) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:04.165 158365 INFO neutron.agent.ovn.metadata.agent [-] Port fb6ad6ad-cf1f-46b0-af98-884a65e507fd in datapath 81fbe19e-7eac-486e-8c87-3782dd1d1fd0 unbound from our chassis#033[00m Feb 1 04:55:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:04.166 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 81fbe19e-7eac-486e-8c87-3782dd1d1fd0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:04.167 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c94a7b3d-d232-4bff-833d-24e0d856f9de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:04 localhost nova_compute[274651]: 2026-02-01 09:55:04.179 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:04.584 2 INFO neutron.agent.securitygroups_rpc [None req-14af839d-45a9-4f98-b03e-7a019e6f639f 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['dcd86290-3678-4dc4-8595-e876b5745966']#033[00m Feb 1 04:55:04 localhost nova_compute[274651]: 2026-02-01 09:55:04.620 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:04.964 2 INFO neutron.agent.securitygroups_rpc [None req-e93a2aaf-9f80-41c1-a11d-131d09e57386 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - 
default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:06 localhost systemd[1]: tmp-crun.z7L6VE.mount: Deactivated successfully. Feb 1 04:55:06 localhost dnsmasq[309329]: exiting on receipt of SIGTERM Feb 1 04:55:06 localhost podman[309746]: 2026-02-01 09:55:06.718314321 +0000 UTC m=+0.069203395 container kill 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:55:06 localhost systemd[1]: libpod-77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64.scope: Deactivated successfully. Feb 1 04:55:06 localhost podman[309760]: 2026-02-01 09:55:06.792269952 +0000 UTC m=+0.056360969 container died 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:06 localhost podman[309760]: 2026-02-01 09:55:06.823306289 +0000 UTC m=+0.087397256 container cleanup 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:06 localhost systemd[1]: libpod-conmon-77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64.scope: Deactivated successfully. 
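The two ceilometer_agent_compute tracebacks above record a single underlying fault: the AMQP transport's TCP connect to the RabbitMQ broker is refused (errno 111), and kombu's _reraise_as_library_errors context manager converts the low-level ConnectionRefusedError into kombu.exceptions.OperationalError with "raise ... from exc", which is why each dump shows two tracebacks joined by "The above exception was the direct cause of the following exception". A minimal standard-library sketch of that wrapping pattern follows; OperationalError, establish_connection, and the host/port are illustrative stand-ins, not kombu's real classes or this deployment's broker address:

    import socket
    from contextlib import contextmanager

    class OperationalError(Exception):
        """Stand-in for kombu.exceptions.OperationalError."""

    @contextmanager
    def reraise_as_library_errors():
        # Same shape as kombu/connection.py _reraise_as_library_errors:
        # low-level socket errors are re-raised as a library-level error,
        # keeping the original as __cause__ ("direct cause" in the log).
        try:
            yield
        except OSError as exc:  # ConnectionRefusedError is an OSError
            raise OperationalError(str(exc)) from exc

    def establish_connection(host, port, timeout=5.0):
        with reraise_as_library_errors():
            return socket.create_connection((host, port), timeout=timeout)

    try:
        establish_connection("127.0.0.1", 5672)  # nothing listening
    except OperationalError as err:
        print(err)  # [Errno 111] Connection refused

The notifier builds a fresh connection per send through the connection pool (the traceback's pool.get -> create -> ensure_connection path), so every polling cycle repeats the full chain until the broker is reachable again.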
Feb 1 04:55:06 localhost podman[309761]: 2026-02-01 09:55:06.883955009 +0000 UTC m=+0.141629299 container remove 77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-81fbe19e-7eac-486e-8c87-3782dd1d1fd0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:07 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:07.152 2 INFO neutron.agent.securitygroups_rpc [None req-943127f9-174a-469a-a7d0-e33db638b827 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:07 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:07.300 2 INFO neutron.agent.securitygroups_rpc [None req-eb9a1f3c-34ee-4016-ae11-84944f9bb005 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['a5467c7c-cb9b-4aeb-bb09-b5bf7707aed9']#033[00m Feb 1 04:55:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:07.374 259320 INFO neutron.agent.dhcp.agent [None req-91fdb3d7-8edb-450d-b285-84c0523acd45 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:07 localhost systemd[1]: var-lib-containers-storage-overlay-c5ded09ea65267f5efd95e9f0a38da1c623d4fd245a9f88a92aee1526f5b7f8a-merged.mount: Deactivated successfully. Feb 1 04:55:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77dc7a23d39a1e8c3680896b34b1ceea5f747db921728b706f8664e8deb35d64-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:07 localhost systemd[1]: run-netns-qdhcp\x2d81fbe19e\x2d7eac\x2d486e\x2d8c87\x2d3782dd1d1fd0.mount: Deactivated successfully. Feb 1 04:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
Feb 1 04:55:07 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:07.775 2 INFO neutron.agent.securitygroups_rpc [None req-7cd63175-19a6-47c4-a54b-1047c35ebff0 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['a5467c7c-cb9b-4aeb-bb09-b5bf7707aed9']#033[00m Feb 1 04:55:07 localhost podman[309788]: 2026-02-01 09:55:07.812107251 +0000 UTC m=+0.072544038 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:55:07 localhost podman[309788]: 2026-02-01 09:55:07.821182561 +0000 UTC m=+0.081619358 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:55:07 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
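Each "Started /usr/bin/podman healthcheck run <id>" / "container health_status ... health_status=healthy" / "container exec_died" / "<id>.service: Deactivated successfully" group above is one periodic healthcheck cycle: systemd launches a transient unit, podman execs the configured test inside the container, and the unit exits. A small sketch that extracts the container name and verdict from such records; the regex is keyed to the exact fields visible in this log (other podman versions may format the event differently), and HEALTH_RE/health_events are illustrative names:

    import re

    HEALTH_RE = re.compile(
        r"container health_status .*?\(.*?name=(?P<name>[^,]+),"
        r".*?health_status=(?P<status>[^,)]+)"
    )

    def health_events(lines):
        # Yield (container_name, status) for podman health_status records.
        for line in lines:
            if m := HEALTH_RE.search(line):
                yield m.group("name"), m.group("status")

    sample = ("Feb 1 04:55:07 localhost podman[309788]: ... container "
              "health_status 385648ad... (image=quay.io/prometheus/"
              "node-exporter@sha256:39c6..., name=node_exporter, "
              "health_status=healthy, ...)")
    print(list(health_events([sample])))  # [('node_exporter', 'healthy')]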
Feb 1 04:55:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:08.035 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:08 localhost dnsmasq[309570]: exiting on receipt of SIGTERM Feb 1 04:55:08 localhost podman[309827]: 2026-02-01 09:55:08.156556873 +0000 UTC m=+0.032847144 container kill a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aef4b48a-06a6-4982-a4fe-b9673d5748eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 1 04:55:08 localhost systemd[1]: libpod-a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694.scope: Deactivated successfully. Feb 1 04:55:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:55:08 localhost podman[309842]: 2026-02-01 09:55:08.209089554 +0000 UTC m=+0.038317093 container died a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aef4b48a-06a6-4982-a4fe-b9673d5748eb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:55:08 localhost podman[309853]: 2026-02-01 09:55:08.241187713 +0000 UTC m=+0.060273739 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:55:08 localhost podman[309853]: 2026-02-01 09:55:08.274534481 +0000 UTC m=+0.093620487 container exec_died 
b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:55:08 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:08.285 2 INFO neutron.agent.securitygroups_rpc [None req-d9ca590a-9c4b-410e-b353-3b648a203b3e cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:08 localhost podman[309842]: 2026-02-01 09:55:08.291602188 +0000 UTC m=+0.120829737 container remove a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aef4b48a-06a6-4982-a4fe-b9673d5748eb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:55:08 localhost systemd[1]: libpod-conmon-a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694.scope: Deactivated successfully. 
Feb 1 04:55:08 localhost nova_compute[274651]: 2026-02-01 09:55:08.304 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:08 localhost ovn_controller[152492]: 2026-02-01T09:55:08Z|00161|binding|INFO|Releasing lport fd17d6a1-286e-427c-a2ca-5c2026115837 from this chassis (sb_readonly=0) Feb 1 04:55:08 localhost ovn_controller[152492]: 2026-02-01T09:55:08Z|00162|binding|INFO|Setting lport fd17d6a1-286e-427c-a2ca-5c2026115837 down in Southbound Feb 1 04:55:08 localhost kernel: device tapfd17d6a1-28 left promiscuous mode Feb 1 04:55:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:08.320 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-aef4b48a-06a6-4982-a4fe-b9673d5748eb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aef4b48a-06a6-4982-a4fe-b9673d5748eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3e5e9f4ac99471688f0279d307f2650', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4a73283-80c7-4d8e-b9ea-708ee54545e1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fd17d6a1-286e-427c-a2ca-5c2026115837) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:08.322 158365 INFO neutron.agent.ovn.metadata.agent [-] Port fd17d6a1-286e-427c-a2ca-5c2026115837 in datapath aef4b48a-06a6-4982-a4fe-b9673d5748eb unbound from our chassis#033[00m Feb 1 04:55:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:08.323 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network aef4b48a-06a6-4982-a4fe-b9673d5748eb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:08.324 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8aecd4-ba4e-49d0-a969-cc7485866402]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:08 localhost nova_compute[274651]: 2026-02-01 09:55:08.325 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:08 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
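The unbind sequence above repeats per port: ovn_controller releases the logical port from the chassis and sets it down in the Southbound DB, the kernel tears down the tap device, and ovn_metadata_agent matches the Port_Binding update and reports the port "unbound from our chassis" before dismantling the metadata namespace. A sketch that pairs the two ends of that sequence by port UUID; RELEASE_RE, UNBOUND_RE, and correlate are illustrative and keyed to these exact message formats:

    import re

    RELEASE_RE = re.compile(
        r"Releasing lport (?P<port>[0-9a-f-]{36}) from this\s+chassis")
    UNBOUND_RE = re.compile(
        r"Port (?P<port>[0-9a-f-]{36}) in datapath (?P<dp>\S+) unbound")

    def correlate(lines):
        # Pair each ovn_controller release with the metadata agent's
        # matching "unbound from our chassis" record, keyed by port UUID.
        released, pairs = set(), []
        for line in lines:
            if m := RELEASE_RE.search(line):
                released.add(m["port"])
            elif (m := UNBOUND_RE.search(line)) and m["port"] in released:
                pairs.append((m["port"], m["dp"]))
        return pairs

    lines = [
        "...|binding|INFO|Releasing lport fd17d6a1-286e-427c-a2ca-5c2026115837 from this chassis (sb_readonly=0)",
        "... Port fd17d6a1-286e-427c-a2ca-5c2026115837 in datapath aef4b48a-06a6-4982-a4fe-b9673d5748eb unbound from our chassis",
    ]
    print(correlate(lines))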
Feb 1 04:55:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:08.571 259320 INFO neutron.agent.dhcp.agent [None req-077f5c7d-f3b2-4c1e-bc05-86f01333fe54 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:08.571 259320 INFO neutron.agent.dhcp.agent [None req-077f5c7d-f3b2-4c1e-bc05-86f01333fe54 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:08 localhost ovn_controller[152492]: 2026-02-01T09:55:08Z|00163|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:55:08 localhost nova_compute[274651]: 2026-02-01 09:55:08.613 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:08 localhost systemd[1]: var-lib-containers-storage-overlay-d26b0b91625ee74a9e37631b18b408aa8bfb71ced56382d4f9a13a8e31ed29a6-merged.mount: Deactivated successfully. Feb 1 04:55:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5f54d56d311e03b90901dc44fe5b8d1ce2923805335a2e948ea5054d303f694-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:08 localhost systemd[1]: run-netns-qdhcp\x2daef4b48a\x2d06a6\x2d4982\x2da4fe\x2db9673d5748eb.mount: Deactivated successfully. Feb 1 04:55:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:08.903 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:09 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:09.580 2 INFO neutron.agent.securitygroups_rpc [None req-efaae1c0-3e5f-4a79-9705-2ee5fc658831 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']#033[00m Feb 1 04:55:09 localhost nova_compute[274651]: 2026-02-01 09:55:09.664 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:10 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:10.355 2 INFO neutron.agent.securitygroups_rpc [None req-c53a02fb-f287-4ee9-90b1-fd2dea7e171b 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']#033[00m Feb 1 04:55:10 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:10.368 2 INFO neutron.agent.securitygroups_rpc [None req-8c9b29e3-84f5-4ecc-a69f-55a0bba3249d cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:10 localhost dnsmasq[309086]: exiting on receipt of SIGTERM Feb 1 04:55:10 localhost podman[309910]: 2026-02-01 09:55:10.807514402 +0000 UTC m=+0.060147445 container kill 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:55:10 localhost systemd[1]: 
libpod-639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63.scope: Deactivated successfully. Feb 1 04:55:10 localhost podman[309925]: 2026-02-01 09:55:10.884206818 +0000 UTC m=+0.061935342 container died 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:55:10 localhost podman[309925]: 2026-02-01 09:55:10.91477766 +0000 UTC m=+0.092506134 container cleanup 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:10 localhost systemd[1]: libpod-conmon-639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63.scope: Deactivated successfully. Feb 1 04:55:10 localhost podman[309927]: 2026-02-01 09:55:10.960077058 +0000 UTC m=+0.127029019 container remove 639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8c8faa8a-12c5-4564-a503-67f1cd5faeef, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:55:11 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:11.303 2 INFO neutron.agent.securitygroups_rpc [None req-f752f535-7480-4edb-a5ff-a77295c4683e 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']#033[00m Feb 1 04:55:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:11.394 259320 INFO neutron.agent.dhcp.agent [None req-023893d8-aa3c-465a-b1f4-f008f256f433 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:11 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:11.625 2 INFO neutron.agent.securitygroups_rpc [None req-61a06c10-15de-40c3-8b7d-3148d6b4f873 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:11 localhost systemd[1]: var-lib-containers-storage-overlay-fa3020ea0f25d444609104a3dca12b8a143c33c20ea4de26517bfbf3244bb0f0-merged.mount: Deactivated successfully. 
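The qdhcp dnsmasq shutdowns above all follow the same teardown: dnsmasq logs "exiting on receipt of SIGTERM", then podman emits container kill, died, cleanup, and remove events while systemd deactivates the libpod scope and unmounts the overlay and netns mounts. A checker sketch for that ordering; the kill -> died -> remove rule is an assumption distilled from the sequences visible here, not a podman-documented contract, and check_teardown is an illustrative name:

    EXPECTED = ("kill", "died", "remove")

    def check_teardown(events):
        # events: (container_id, event_name) pairs in log order.
        # cleanup records may appear between died and remove; event
        # names outside EXPECTED are simply ignored.
        progress = {}
        for cid, ev in events:
            if ev not in EXPECTED:
                continue
            step = progress.get(cid, 0)
            if ev != EXPECTED[step]:
                raise ValueError(
                    f"{cid[:12]}: got {ev!r}, expected {EXPECTED[step]!r}")
            progress[cid] = step + 1
        return {cid: step == len(EXPECTED) for cid, step in progress.items()}

    events = [
        ("639ebe717b6701a6", "kill"),
        ("639ebe717b6701a6", "died"),
        ("639ebe717b6701a6", "cleanup"),  # ignored
        ("639ebe717b6701a6", "remove"),
    ]
    print(check_teardown(events))  # {'639ebe717b6701a6': True}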
Feb 1 04:55:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-639ebe717b6701a63a155cf1374149d261f3709d8d459546e79e7bf86ad52d63-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:11 localhost systemd[1]: run-netns-qdhcp\x2d8c8faa8a\x2d12c5\x2d4564\x2da503\x2d67f1cd5faeef.mount: Deactivated successfully. Feb 1 04:55:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:12.004 2 INFO neutron.agent.securitygroups_rpc [None req-c4d0e02a-d41e-4d38-a096-515e00cc05ca 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']#033[00m Feb 1 04:55:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:12.296 2 INFO neutron.agent.securitygroups_rpc [None req-7c1928f8-098c-4c19-bf11-b8c429ffbdda 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']#033[00m Feb 1 04:55:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:12.943 2 INFO neutron.agent.securitygroups_rpc [None req-28204c12-7df5-4c98-a342-64d5ab507d83 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']#033[00m Feb 1 04:55:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:13.242 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:13 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:13.811 2 INFO neutron.agent.securitygroups_rpc [None req-5b09233a-6601-4e83-a057-9181035eb7ab cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:13 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:13.889 2 INFO neutron.agent.securitygroups_rpc [None req-0b649a65-4805-4006-9abf-770c26af78b1 930a89cab3af43239942c71cee47dc19 904cc8942364443bb4c4a4017bb1e647 - - default default] Security group member updated ['4db01845-8230-4c8d-a3f4-5e942e576ef7']#033[00m Feb 1 04:55:14 localhost nova_compute[274651]: 2026-02-01 09:55:14.400 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:14 localhost nova_compute[274651]: 2026-02-01 09:55:14.666 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:14 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:14.798 2 INFO neutron.agent.securitygroups_rpc [None req-8cf1a02e-dfaa-4f95-a3f5-4d4a9a4c833f 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['be540add-f8ad-43d9-9aea-3a58bb289e01']#033[00m Feb 1 04:55:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:15.486 2 INFO neutron.agent.securitygroups_rpc [None req-018e5631-8c87-44c1-9046-0c9a678cac95 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
Feb 1 04:55:15 localhost podman[309954]: 2026-02-01 09:55:15.693024558 +0000 UTC m=+0.059444239 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, version=9.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-type=git, architecture=x86_64) Feb 1 04:55:15 localhost podman[309954]: 2026-02-01 09:55:15.703418538 +0000 UTC m=+0.069838209 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, release=1769056855, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter) Feb 1 04:55:15 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
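The ovsdbapp.backend.ovs_idl.vlog lines scattered through this window, and the sequence just below ("4999-ms timeout", "idle 5002 ms, sending inactivity probe", "entering IDLE", then "entering ACTIVE" on the next POLLIN), come from python-ovs's reconnect state machine: after roughly five seconds without traffic on tcp:127.0.0.1:6640 the client sends an echo probe and marks the session IDLE, and any inbound data returns it to ACTIVE. A toy model of that timer; ProbeTimer is illustrative, not python-ovs's actual API, and the 5000 ms default merely mirrors the interval seen here:

    import time

    class ProbeTimer:
        def __init__(self, probe_interval_ms=5000):
            self.interval = probe_interval_ms / 1000.0
            self.state = "ACTIVE"
            self.last_activity = time.monotonic()

        def received(self):
            # Any inbound data counts as activity ("entering ACTIVE").
            self.last_activity = time.monotonic()
            self.state = "ACTIVE"

        def run(self):
            # Called from the poll loop; returns an action, if any.
            idle = time.monotonic() - self.last_activity
            if self.state == "ACTIVE" and idle >= self.interval:
                self.state = "IDLE"  # "entering IDLE"
                return "send inactivity probe"
            return None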
Feb 1 04:55:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:17.781 2 INFO neutron.agent.securitygroups_rpc [None req-ab764583-db62-4331-9e90-bc94b6b4e26a cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:19 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:19.330 2 INFO neutron.agent.securitygroups_rpc [None req-fa6c5d49-c4dc-46a9-9459-bf711075cc0d 930a89cab3af43239942c71cee47dc19 904cc8942364443bb4c4a4017bb1e647 - - default default] Security group member updated ['4db01845-8230-4c8d-a3f4-5e942e576ef7']#033[00m Feb 1 04:55:19 localhost nova_compute[274651]: 2026-02-01 09:55:19.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:55:19 localhost nova_compute[274651]: 2026-02-01 09:55:19.671 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:55:19 localhost nova_compute[274651]: 2026-02-01 09:55:19.671 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:55:19 localhost nova_compute[274651]: 2026-02-01 09:55:19.672 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:55:19 localhost nova_compute[274651]: 2026-02-01 09:55:19.696 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:19 localhost nova_compute[274651]: 2026-02-01 09:55:19.697 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:55:20 localhost nova_compute[274651]: 2026-02-01 09:55:20.417 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:20 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:20.543 2 INFO neutron.agent.securitygroups_rpc [None req-6dfed5db-0b52-4f95-900a-39e3e0691fbf cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:21 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:21.274 2 INFO neutron.agent.securitygroups_rpc [None req-cbc03874-de2d-4db1-8aeb-b67049c1615b cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:22 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:22.090 2 INFO neutron.agent.securitygroups_rpc [None req-22955056-d501-42ef-9a2c-bf3d181d8fe4 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated 
['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']#033[00m Feb 1 04:55:22 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:22.214 2 INFO neutron.agent.securitygroups_rpc [None req-68b28681-9762-4f24-92af-1fa7309650a4 edcc55a03c02426f897467232a84b22e eeec82e52999475da0fa4e4a4a8effbd - - default default] Security group rule updated ['150b315a-79ca-493c-98be-8b45107659c4']#033[00m Feb 1 04:55:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:55:22 localhost systemd[1]: tmp-crun.5cTiwP.mount: Deactivated successfully. Feb 1 04:55:22 localhost podman[309974]: 2026-02-01 09:55:22.727359685 +0000 UTC m=+0.088538094 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible) Feb 1 04:55:22 localhost podman[309974]: 2026-02-01 09:55:22.742306634 +0000 UTC m=+0.103485033 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck 
compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:55:22 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:55:23 localhost podman[236886]: time="2026-02-01T09:55:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:55:23 localhost podman[236886]: @ - - [01/Feb/2026:09:55:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:55:24 localhost podman[236886]: @ - - [01/Feb/2026:09:55:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18830 "" "Go-http-client/1.1" Feb 1 04:55:24 localhost nova_compute[274651]: 2026-02-01 09:55:24.727 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:25 localhost ovn_controller[152492]: 2026-02-01T09:55:25Z|00164|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:55:25 localhost nova_compute[274651]: 2026-02-01 09:55:25.302 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e123 do_prune osdmap full prune enabled Feb 1 04:55:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e124 e124: 6 total, 6 up, 6 in Feb 1 04:55:25 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in Feb 1 04:55:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e124 do_prune osdmap full prune enabled Feb 1 04:55:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e125 e125: 6 total, 6 up, 6 in Feb 1 04:55:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in Feb 1 04:55:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e125 do_prune osdmap full prune enabled Feb 1 04:55:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e126 e126: 6 total, 6 up, 6 in Feb 1 04:55:28 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in Feb 1 
04:55:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:29.620 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:29.622 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:55:29 localhost nova_compute[274651]: 2026-02-01 09:55:29.642 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:29 localhost nova_compute[274651]: 2026-02-01 09:55:29.730 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:29.981 259320 INFO neutron.agent.linux.ip_lib [None req-380137d9-5c9f-4d46-ac1d-e467f00bf5b4 - - - - - -] Device tap7ab0a57e-84 cannot be used as it has no MAC address#033[00m Feb 1 04:55:29 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:29.996 2 INFO neutron.agent.securitygroups_rpc [None req-83329cbf-bffb-48da-a04c-50fb44fb93a0 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:30 localhost nova_compute[274651]: 2026-02-01 09:55:30.005 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:30 localhost kernel: device tap7ab0a57e-84 entered promiscuous mode Feb 1 04:55:30 localhost NetworkManager[5964]: [1769939730.0139] manager: (tap7ab0a57e-84): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Feb 1 04:55:30 localhost nova_compute[274651]: 2026-02-01 09:55:30.014 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:30 localhost ovn_controller[152492]: 2026-02-01T09:55:30Z|00165|binding|INFO|Claiming lport 7ab0a57e-8450-4b2e-82a6-9848493eac66 for this chassis. Feb 1 04:55:30 localhost ovn_controller[152492]: 2026-02-01T09:55:30Z|00166|binding|INFO|7ab0a57e-8450-4b2e-82a6-9848493eac66: Claiming unknown Feb 1 04:55:30 localhost systemd-udevd[310003]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 04:55:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:30.038 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ab0a57e-8450-4b2e-82a6-9848493eac66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:30.040 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7ab0a57e-8450-4b2e-82a6-9848493eac66 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:55:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:30.042 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:30.043 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[67702693-8ebe-4bcc-8b51-3baf3b2d97f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost ovn_controller[152492]: 2026-02-01T09:55:30Z|00167|binding|INFO|Setting lport 7ab0a57e-8450-4b2e-82a6-9848493eac66 ovn-installed in OVS Feb 1 04:55:30 localhost ovn_controller[152492]: 2026-02-01T09:55:30Z|00168|binding|INFO|Setting lport 7ab0a57e-8450-4b2e-82a6-9848493eac66 up in Southbound Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost nova_compute[274651]: 2026-02-01 09:55:30.051 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost 
journal[217584]: ethtool ioctl error on tap7ab0a57e-84: No such device Feb 1 04:55:30 localhost nova_compute[274651]: 2026-02-01 09:55:30.083 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:30 localhost nova_compute[274651]: 2026-02-01 09:55:30.108 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e126 do_prune osdmap full prune enabled Feb 1 04:55:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e127 e127: 6 total, 6 up, 6 in Feb 1 04:55:30 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in Feb 1 04:55:30 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:30.825 2 INFO neutron.agent.securitygroups_rpc [None req-32692fbf-89c1-4916-95dd-247e36fdbe6a e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:30 localhost podman[310074]: Feb 1 04:55:30 localhost podman[310074]: 2026-02-01 09:55:30.905849063 +0000 UTC m=+0.072108199 container create 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:55:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:55:30 localhost systemd[1]: Started libpod-conmon-21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5.scope. Feb 1 04:55:30 localhost systemd[1]: Started libcrun container. 
Feb 1 04:55:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9c8bb798c277c6587ef543992e38d7a66c70ebc7ce08012c44622290ecc28ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:30 localhost podman[310074]: 2026-02-01 09:55:30.961819885 +0000 UTC m=+0.128078991 container init 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:30 localhost podman[310074]: 2026-02-01 09:55:30.866370898 +0000 UTC m=+0.032630034 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:30 localhost podman[310074]: 2026-02-01 09:55:30.969122269 +0000 UTC m=+0.135381385 container start 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 1 04:55:30 localhost dnsmasq[310101]: started, version 2.85 cachesize 150 Feb 1 04:55:30 localhost dnsmasq[310101]: DNS service limited to local subnets Feb 1 04:55:30 localhost dnsmasq[310101]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:30 localhost dnsmasq[310101]: warning: no upstream servers configured Feb 1 04:55:30 localhost dnsmasq-dhcp[310101]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:55:30 localhost dnsmasq[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:30 localhost dnsmasq-dhcp[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:55:30 localhost dnsmasq-dhcp[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:55:31 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:31.017 259320 INFO neutron.agent.dhcp.agent [None req-380137d9-5c9f-4d46-ac1d-e467f00bf5b4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:29Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=88b79e02-7247-4fc2-9533-728f7cbcd230, ip_allocation=immediate, mac_address=fa:16:3e:dd:f8:42, name=tempest-NetworksTestDHCPv6-1846137901, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, 
l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['6aaec72f-e290-4201-8565-766cd00969aa'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:29Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1614, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:29Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:55:31 localhost podman[310088]: 2026-02-01 09:55:31.045111746 +0000 UTC m=+0.096896751 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:55:31 localhost podman[310088]: 2026-02-01 09:55:31.059464317 +0000 UTC m=+0.111249352 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:55:31 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:55:31 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:31.134 259320 INFO neutron.agent.dhcp.agent [None req-860eb147-fa1d-4f34-a0b7-b70d5f58bab8 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m
Feb 1 04:55:31 localhost podman[310135]: 2026-02-01 09:55:31.196079969 +0000 UTC m=+0.042914600 container kill 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:55:31 localhost dnsmasq[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:55:31 localhost dnsmasq-dhcp[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:55:31 localhost dnsmasq-dhcp[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:55:31 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:31.355 259320 INFO neutron.agent.dhcp.agent [None req-3ad22c5c-8ee2-4288-a152-a3cd1b0576de - - - - - -] DHCP configuration for ports {'88b79e02-7247-4fc2-9533-728f7cbcd230'} is completed#033[00m
Feb 1 04:55:31 localhost openstack_network_exporter[239441]: ERROR 09:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:55:31 localhost openstack_network_exporter[239441]:
Feb 1 04:55:31 localhost openstack_network_exporter[239441]: ERROR 09:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:55:31 localhost openstack_network_exporter[239441]:
Feb 1 04:55:31 localhost podman[310173]: 2026-02-01 09:55:31.517667241 +0000 UTC m=+0.061879184 container kill 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:55:31 localhost dnsmasq[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:55:31 localhost dnsmasq-dhcp[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:55:31 localhost dnsmasq-dhcp[310101]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:55:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:55:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e127 do_prune osdmap full prune enabled
Feb 1 04:55:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e128 e128: 6 total, 6 up, 6 in
Feb 1 04:55:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in
Feb 1 04:55:32 localhost systemd[1]: tmp-crun.GnC4ii.mount: Deactivated successfully.
Feb 1 04:55:32 localhost dnsmasq[310101]: exiting on receipt of SIGTERM
Feb 1 04:55:32 localhost podman[310211]: 2026-02-01 09:55:32.33433607 +0000 UTC m=+0.072573024 container kill 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 1 04:55:32 localhost systemd[1]: libpod-21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5.scope: Deactivated successfully.
Feb 1 04:55:32 localhost podman[310223]: 2026-02-01 09:55:32.395549041 +0000 UTC m=+0.045522481 container died 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:55:32 localhost podman[310223]: 2026-02-01 09:55:32.425634987 +0000 UTC m=+0.075608397 container cleanup 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 04:55:32 localhost systemd[1]: libpod-conmon-21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5.scope: Deactivated successfully.
Feb 1 04:55:32 localhost podman[310225]: 2026-02-01 09:55:32.456192137 +0000 UTC m=+0.098712027 container remove 21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:55:32 localhost ovn_controller[152492]: 2026-02-01T09:55:32Z|00169|binding|INFO|Releasing lport 7ab0a57e-8450-4b2e-82a6-9848493eac66 from this chassis (sb_readonly=0) Feb 1 04:55:32 localhost ovn_controller[152492]: 2026-02-01T09:55:32Z|00170|binding|INFO|Setting lport 7ab0a57e-8450-4b2e-82a6-9848493eac66 down in Southbound Feb 1 04:55:32 localhost nova_compute[274651]: 2026-02-01 09:55:32.468 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:32 localhost kernel: device tap7ab0a57e-84 left promiscuous mode Feb 1 04:55:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:32.477 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ab0a57e-8450-4b2e-82a6-9848493eac66) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:32.478 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7ab0a57e-8450-4b2e-82a6-9848493eac66 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:55:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:32.480 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:32.480 158526 DEBUG oslo.privsep.daemon [-] privsep: 
reply[ea8bd1cb-e9a5-447e-ad44-e9a5a08074ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:32 localhost nova_compute[274651]: 2026-02-01 09:55:32.492 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:32 localhost systemd[1]: var-lib-containers-storage-overlay-c9c8bb798c277c6587ef543992e38d7a66c70ebc7ce08012c44622290ecc28ae-merged.mount: Deactivated successfully. Feb 1 04:55:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21bae02d54da3a9839628e0bfc77b5f82f5931add9b6d1ef575d8dadad3384c5-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:32 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. Feb 1 04:55:33 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:33.744 259320 INFO neutron.agent.linux.ip_lib [None req-b035cd47-85c6-44e3-a75a-da2527ed52e0 - - - - - -] Device tap14c342b0-1a cannot be used as it has no MAC address#033[00m Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.763 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:33 localhost kernel: device tap14c342b0-1a entered promiscuous mode Feb 1 04:55:33 localhost NetworkManager[5964]: [1769939733.7705] manager: (tap14c342b0-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Feb 1 04:55:33 localhost ovn_controller[152492]: 2026-02-01T09:55:33Z|00171|binding|INFO|Claiming lport 14c342b0-1a14-4b13-af35-047c9630caf1 for this chassis. Feb 1 04:55:33 localhost ovn_controller[152492]: 2026-02-01T09:55:33Z|00172|binding|INFO|14c342b0-1a14-4b13-af35-047c9630caf1: Claiming unknown Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.773 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:33 localhost systemd-udevd[310265]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 04:55:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:33.781 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=14c342b0-1a14-4b13-af35-047c9630caf1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:55:33 localhost ovn_controller[152492]: 2026-02-01T09:55:33Z|00173|binding|INFO|Setting lport 14c342b0-1a14-4b13-af35-047c9630caf1 ovn-installed in OVS
Feb 1 04:55:33 localhost ovn_controller[152492]: 2026-02-01T09:55:33Z|00174|binding|INFO|Setting lport 14c342b0-1a14-4b13-af35-047c9630caf1 up in Southbound
Feb 1 04:55:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:33.783 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 14c342b0-1a14-4b13-af35-047c9630caf1 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m
Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.784 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:55:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:33.784 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.788 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:55:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:33.791 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[2797bcb7-30ca-49f5-a7e5-3fc8607cb539]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:55:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.818 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.850 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:33 localhost systemd[1]: tmp-crun.JFJo7s.mount: Deactivated successfully. Feb 1 04:55:33 localhost podman[310268]: 2026-02-01 09:55:33.883891529 +0000 UTC m=+0.079926739 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent) Feb 1 04:55:33 localhost nova_compute[274651]: 2026-02-01 09:55:33.890 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:33 localhost podman[310268]: 2026-02-01 09:55:33.891852633 +0000 UTC m=+0.087887843 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Feb 1 04:55:33 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:55:33 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:33.923 2 INFO neutron.agent.securitygroups_rpc [None req-d273fe1f-00e9-4cf8-ba92-3b038868502e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:34 localhost podman[310336]: Feb 1 04:55:34 localhost podman[310336]: 2026-02-01 09:55:34.630111651 +0000 UTC m=+0.076314748 container create 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:55:34 localhost systemd[1]: Started libpod-conmon-372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3.scope. Feb 1 04:55:34 localhost systemd[1]: Started libcrun container. 
Feb 1 04:55:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c07b94d910b4153a9838359c7c1f09e3bc350efe926327e377e81249672b257e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:34 localhost podman[310336]: 2026-02-01 09:55:34.689998372 +0000 UTC m=+0.136201499 container init 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:34 localhost podman[310336]: 2026-02-01 09:55:34.696583885 +0000 UTC m=+0.142786982 container start 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:55:34 localhost podman[310336]: 2026-02-01 09:55:34.598655363 +0000 UTC m=+0.044858480 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:34 localhost dnsmasq[310390]: started, version 2.85 cachesize 150 Feb 1 04:55:34 localhost dnsmasq[310390]: DNS service limited to local subnets Feb 1 04:55:34 localhost dnsmasq[310390]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:34 localhost dnsmasq[310390]: warning: no upstream servers configured Feb 1 04:55:34 localhost dnsmasq[310390]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:34 localhost nova_compute[274651]: 2026-02-01 09:55:34.732 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:34 localhost nova_compute[274651]: 2026-02-01 09:55:34.736 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:34.751 259320 INFO neutron.agent.dhcp.agent [None req-b035cd47-85c6-44e3-a75a-da2527ed52e0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:33Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=11b8ced3-4392-4a58-8e0a-3064a39a3e33, ip_allocation=immediate, mac_address=fa:16:3e:aa:31:4d, name=tempest-NetworksTestDHCPv6-22574497, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, 
id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['2683f7f2-2dd6-4430-9d9f-4f292f0ad90f'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:32Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1644, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:33Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:55:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:34.846 259320 INFO neutron.agent.dhcp.agent [None req-a7e287a2-ffd4-46a8-95ca-885cf01ec61e - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:34 localhost podman[310410]: 2026-02-01 09:55:34.876296453 +0000 UTC m=+0.036281467 container kill 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:34 localhost dnsmasq[310390]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:55:34 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:34.984 2 INFO neutron.agent.securitygroups_rpc [None req-cdd29e27-3485-4128-94e1-06734434ccf5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:34 localhost nova_compute[274651]: 2026-02-01 09:55:34.995 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:34 localhost ovn_controller[152492]: 2026-02-01T09:55:34Z|00175|binding|INFO|Releasing lport 14c342b0-1a14-4b13-af35-047c9630caf1 from this chassis (sb_readonly=0) Feb 1 04:55:34 localhost ovn_controller[152492]: 2026-02-01T09:55:34Z|00176|binding|INFO|Setting lport 14c342b0-1a14-4b13-af35-047c9630caf1 down in Southbound Feb 1 04:55:34 localhost kernel: device tap14c342b0-1a left promiscuous mode Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.014 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.022 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=14c342b0-1a14-4b13-af35-047c9630caf1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.023 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 14c342b0-1a14-4b13-af35-047c9630caf1 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.025 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.026 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac4ab6d-e60d-48b8-a5ed-cf159a353e09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.063 259320 INFO neutron.agent.dhcp.agent [None req-3096a038-785f-470a-aa65-e68d4097ec3e - - - - - -] DHCP configuration for ports {'11b8ced3-4392-4a58-8e0a-3064a39a3e33'} is completed#033[00m Feb 1 04:55:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:55:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:55:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:55:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:55:35 localhost dnsmasq[310390]: read 
/var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:35 localhost podman[310468]: 2026-02-01 09:55:35.160818843 +0000 UTC m=+0.050375340 container kill 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 1 04:55:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:55:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent [None req-b035cd47-85c6-44e3-a75a-da2527ed52e0 - - - - - -] Unable to reload_allocations dhcp for cba39058-6a05-4f77-add1-57334b728a66.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap14c342b0-1a not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66. Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR 
neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent return fut.result() Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent raise self._exception Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 1 04:55:35 
localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap14c342b0-1a not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66. Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.187 259320 ERROR neutron.agent.dhcp.agent Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.194 259320 INFO neutron.agent.dhcp.agent [-] Synchronizing state Feb 1 04:55:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.408 259320 INFO neutron.agent.dhcp.agent [None req-8aa551ff-bd0c-45a3-a55c-41cb8d378902 - - - - - -] All active networks have been fetched through RPC. Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.419 259320 INFO neutron.agent.dhcp.agent [-] Starting network 2e1250bd-5beb-49f3-a522-fdc3f21d998a dhcp configuration Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.419 259320 INFO neutron.agent.dhcp.agent [-] Finished network 2e1250bd-5beb-49f3-a522-fdc3f21d998a dhcp configuration Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.419 259320 INFO neutron.agent.dhcp.agent [-] Starting network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration Feb 1 04:55:35 localhost podman[310536]: 2026-02-01 09:55:35.576472668 +0000 UTC m=+0.054696433 container kill 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:35 localhost dnsmasq[310390]: exiting on receipt of SIGTERM Feb 1 04:55:35 localhost systemd[1]: libpod-372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3.scope: Deactivated successfully.
Feb 1 04:55:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:35 localhost podman[310551]: 2026-02-01 09:55:35.660706859 +0000 UTC m=+0.067081404 container died 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 1 04:55:35 localhost systemd[1]: var-lib-containers-storage-overlay-c07b94d910b4153a9838359c7c1f09e3bc350efe926327e377e81249672b257e-merged.mount: Deactivated successfully. Feb 1 04:55:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:35 localhost podman[310551]: 2026-02-01 09:55:35.687805833 +0000 UTC m=+0.094180348 container cleanup 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:35 localhost systemd[1]: libpod-conmon-372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3.scope: Deactivated successfully. 
Feb 1 04:55:35 localhost podman[310552]: 2026-02-01 09:55:35.732964202 +0000 UTC m=+0.137115959 container remove 372d328bff9f95526bb098c6a115c03a120565c042279c20498e2fdb3dcde8f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 1 04:55:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:35.785 259320 INFO neutron.agent.linux.ip_lib [-] Device tap14c342b0-1a cannot be used as it has no MAC address Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.841 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:35 localhost kernel: device tap14c342b0-1a entered promiscuous mode Feb 1 04:55:35 localhost NetworkManager[5964]: <info>  [1769939735.8513] manager: (tap14c342b0-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Feb 1 04:55:35 localhost systemd-udevd[310267]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:55:35 localhost ovn_controller[152492]: 2026-02-01T09:55:35Z|00177|binding|INFO|Claiming lport 14c342b0-1a14-4b13-af35-047c9630caf1 for this chassis. Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.852 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:35 localhost ovn_controller[152492]: 2026-02-01T09:55:35Z|00178|binding|INFO|14c342b0-1a14-4b13-af35-047c9630caf1: Claiming unknown Feb 1 04:55:35 localhost ovn_controller[152492]: 2026-02-01T09:55:35Z|00179|binding|INFO|Setting lport 14c342b0-1a14-4b13-af35-047c9630caf1 ovn-installed in OVS Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:35 localhost ovn_controller[152492]: 2026-02-01T09:55:35Z|00180|binding|INFO|Setting lport 14c342b0-1a14-4b13-af35-047c9630caf1 up in Southbound Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.866 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[],
additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=14c342b0-1a14-4b13-af35-047c9630caf1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43 Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.867 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 14c342b0-1a14-4b13-af35-047c9630caf1 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.868 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599 Feb 1 04:55:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:35.869 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[be4bb240-3c7d-42ca-98ca-d5af5162689e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501 Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.888 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:35 localhost nova_compute[274651]: 2026-02-01 09:55:35.948 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:55:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:36 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:36.304 2 INFO neutron.agent.securitygroups_rpc [None req-0cc22e1e-f882-4556-8673-0cf2b002c102 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da'] Feb 1 04:55:36 localhost nova_compute[274651]: 2026-02-01 09:55:36.494 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e128 do_prune osdmap full prune enabled Feb 1 04:55:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e129 e129: 6 total, 6 up, 6 in Feb 1 04:55:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in Feb 1 04:55:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:55:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:36 localhost ceph-mon[286721]: from='mgr.34541
172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:55:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:55:36 localhost podman[310684]: Feb 1 04:55:36 localhost podman[310684]: 2026-02-01 09:55:36.664697049 +0000 UTC m=+0.092258949 container create a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:36 localhost systemd[1]: Started libpod-conmon-a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7.scope. Feb 1 04:55:36 localhost systemd[1]: tmp-crun.ngeUYX.mount: Deactivated successfully. Feb 1 04:55:36 localhost podman[310684]: 2026-02-01 09:55:36.624580665 +0000 UTC m=+0.052142615 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:36 localhost systemd[1]: Started libcrun container. Feb 1 04:55:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ecf4ae779f4e79740f5a9ffff09737eaee8137b5f9f5e300f85460f598e5d81d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:36 localhost podman[310684]: 2026-02-01 09:55:36.740940574 +0000 UTC m=+0.168502474 container init a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:55:36 localhost podman[310684]: 2026-02-01 09:55:36.749197817 +0000 UTC m=+0.176759747 container start a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:55:36 localhost dnsmasq[310702]: started, version 2.85 cachesize 150 Feb 1 04:55:36 localhost dnsmasq[310702]: DNS service limited to local subnets Feb 1 04:55:36 localhost dnsmasq[310702]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:36 localhost dnsmasq[310702]: warning: no upstream servers configured Feb 1 04:55:36 localhost dnsmasq[310702]: read 
/var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:36.806 259320 INFO neutron.agent.dhcp.agent [-] Finished network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration Feb 1 04:55:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:36.806 259320 INFO neutron.agent.dhcp.agent [None req-8aa551ff-bd0c-45a3-a55c-41cb8d378902 - - - - - -] Synchronizing state complete Feb 1 04:55:37 localhost dnsmasq[310702]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:37 localhost podman[310721]: 2026-02-01 09:55:37.176341006 +0000 UTC m=+0.069945933 container kill a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 1 04:55:37 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:37.179 259320 INFO neutron.agent.dhcp.agent [None req-29b95b2a-3aa5-4678-a265-8f6f8a16d7a9 - - - - - -] DHCP configuration for ports {'14c342b0-1a14-4b13-af35-047c9630caf1', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed Feb 1 04:55:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:37.281 2 INFO neutron.agent.securitygroups_rpc [None req-9031f569-eff7-411d-8454-e1e2bf358206 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da'] Feb 1 04:55:37 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:37.624 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89 Feb 1 04:55:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:37.643 2 INFO neutron.agent.securitygroups_rpc [None req-c13e4698-12de-4fa4-84de-f7194e33c853 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf'] Feb 1 04:55:37 localhost dnsmasq[310702]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:37 localhost podman[310761]: 2026-02-01 09:55:37.648934181 +0000 UTC m=+0.052970030 container kill a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:37 localhost neutron_dhcp_agent[259316]: 2026-02-01
09:55:37.652 259320 INFO neutron.agent.dhcp.agent [None req-ea005cf0-ec44-4a13-ae20-735ca258946d - - - - - -] DHCP configuration for ports {'14c342b0-1a14-4b13-af35-047c9630caf1', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed Feb 1 04:55:37 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:37.773 259320 INFO neutron.agent.dhcp.agent [None req-914fe561-a942-45e3-89fd-bea5a3ccd42f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:37Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a70b9faf-a222-4a2a-89be-39c6f05be50a, ip_allocation=immediate, mac_address=fa:16:3e:ec:3a:a5, name=tempest-NetworksTestDHCPv6-1055664353, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['1fbd45ca-f82c-46a0-99cd-4a619e55325d'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:36Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1696, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:37Z on network cba39058-6a05-4f77-add1-57334b728a66 Feb 1 04:55:37 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:37.887 259320 INFO neutron.agent.linux.ip_lib [None req-012477e6-45b6-454c-a9d5-7bc21e85f33c - - - - - -] Device tapf733571a-0f cannot be used as it has no MAC address Feb 1 04:55:37 localhost nova_compute[274651]: 2026-02-01 09:55:37.912 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:37 localhost kernel: device tapf733571a-0f entered promiscuous mode Feb 1 04:55:37 localhost NetworkManager[5964]: <info>  [1769939737.9194] manager: (tapf733571a-0f): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Feb 1 04:55:37 localhost ovn_controller[152492]: 2026-02-01T09:55:37Z|00181|binding|INFO|Claiming lport f733571a-0fa4-4847-b42a-2c5a85bd0fed for this chassis.
Feb 1 04:55:37 localhost nova_compute[274651]: 2026-02-01 09:55:37.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:37 localhost ovn_controller[152492]: 2026-02-01T09:55:37Z|00182|binding|INFO|f733571a-0fa4-4847-b42a-2c5a85bd0fed: Claiming unknown Feb 1 04:55:37 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:37.938 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-2e1250bd-5beb-49f3-a522-fdc3f21d998a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e1250bd-5beb-49f3-a522-fdc3f21d998a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6419fd8b712b467ea6e03df22d411fcf', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a14101d-16e6-4045-94e9-474da6ab8d50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f733571a-0fa4-4847-b42a-2c5a85bd0fed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43 Feb 1 04:55:37 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:37.940 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f733571a-0fa4-4847-b42a-2c5a85bd0fed in datapath 2e1250bd-5beb-49f3-a522-fdc3f21d998a bound to our chassis Feb 1 04:55:37 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:37.941 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2e1250bd-5beb-49f3-a522-fdc3f21d998a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599 Feb 1 04:55:37 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:37.942 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc5fb24-8ed5-4928-9ff6-f7922c94889b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501 Feb 1 04:55:37 localhost nova_compute[274651]: 2026-02-01 09:55:37.944 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:55:37 localhost ovn_controller[152492]: 2026-02-01T09:55:37Z|00183|binding|INFO|Setting lport f733571a-0fa4-4847-b42a-2c5a85bd0fed ovn-installed in OVS Feb 1 04:55:37 localhost ovn_controller[152492]: 2026-02-01T09:55:37Z|00184|binding|INFO|Setting lport f733571a-0fa4-4847-b42a-2c5a85bd0fed up in Southbound Feb 1 04:55:37 localhost nova_compute[274651]: 2026-02-01 09:55:37.970 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:38 localhost dnsmasq[310702]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:55:38 localhost nova_compute[274651]: 2026-02-01 09:55:38.004 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:38 localhost podman[310810]: 2026-02-01 09:55:38.006237771 +0000 UTC m=+0.089187185 container kill a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 1 04:55:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:38.028 259320 INFO neutron.agent.dhcp.agent [None req-aae0f171-2cbb-423a-b018-7d939a08d6e4 - - - - - -] DHCP configuration for ports {'14c342b0-1a14-4b13-af35-047c9630caf1', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed Feb 1 04:55:38 localhost nova_compute[274651]: 2026-02-01 09:55:38.040 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:38 localhost podman[310822]: 2026-02-01 09:55:38.057644902 +0000 UTC m=+0.092237048 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter,
container_name=node_exporter) Feb 1 04:55:38 localhost podman[310822]: 2026-02-01 09:55:38.09335591 +0000 UTC m=+0.127948046 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:55:38 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:55:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:38.267 259320 INFO neutron.agent.dhcp.agent [None req-28296cbc-e765-446b-a4bd-9b82eb05b5c5 - - - - - -] DHCP configuration for ports {'a70b9faf-a222-4a2a-89be-39c6f05be50a'} is completed Feb 1 04:55:38 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:38.380 2 INFO neutron.agent.securitygroups_rpc [None req-c43d2a56-e16d-478a-abbd-9e2bb456c208 d96cff636365480c93dc8d1f3e16c531 272972c8d99e4a5c99e73e4bdb72346d - - default default] Security group rule updated ['56a3691b-0dfa-477a-aaac-6fc6d2066735'] Feb 1 04:55:38 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:38.532 2 INFO neutron.agent.securitygroups_rpc [None req-e32d5a82-c050-454c-9bc9-fb7e97cffc23 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da'] Feb 1 04:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:55:38 localhost systemd[1]: tmp-crun.wbVurh.mount: Deactivated successfully.
Feb 1 04:55:38 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:38.671 2 INFO neutron.agent.securitygroups_rpc [None req-b42ac328-eb3f-4f32-8c68-ad479176a68e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf'] Feb 1 04:55:38 localhost podman[310888]: 2026-02-01 09:55:38.702974631 +0000 UTC m=+0.066201548 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:55:38 localhost podman[310888]: 2026-02-01 09:55:38.786611363 +0000 UTC m=+0.149838240 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 1 04:55:38 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service:
Deactivated successfully. Feb 1 04:55:38 localhost podman[310949]: Feb 1 04:55:38 localhost podman[310949]: 2026-02-01 09:55:38.830608946 +0000 UTC m=+0.062275586 container create 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:38 localhost systemd[1]: Started libpod-conmon-023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09.scope. Feb 1 04:55:38 localhost systemd[1]: Started libcrun container. Feb 1 04:55:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e60b2c502ae3d187eeefe6d6b689d8381d3526a09dd20e18ffc2955248f020ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:38 localhost podman[310949]: 2026-02-01 09:55:38.887687492 +0000 UTC m=+0.119354122 container init 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:38 localhost dnsmasq[310702]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:38 localhost podman[310960]: 2026-02-01 09:55:38.89187192 +0000 UTC m=+0.098233812 container kill a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:55:38 localhost podman[310949]: 2026-02-01 09:55:38.898505485 +0000 UTC m=+0.130172125 container start 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:38 localhost podman[310949]: 2026-02-01 09:55:38.803133762 +0000 UTC m=+0.034800442 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:38 localhost dnsmasq[310983]: started, version 2.85 cachesize 150 Feb 1 04:55:38 
localhost dnsmasq[310983]: DNS service limited to local subnets Feb 1 04:55:38 localhost dnsmasq[310983]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:38 localhost dnsmasq[310983]: warning: no upstream servers configured Feb 1 04:55:38 localhost dnsmasq-dhcp[310983]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:55:38 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 1 addresses Feb 1 04:55:38 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:38 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:39.258 259320 INFO neutron.agent.dhcp.agent [None req-c9216e7b-3960-4fbb-b145-5452f7947a89 - - - - - -] DHCP configuration for ports {'159944a0-107a-44b5-9f7a-6cd59423b2be', '748f38c2-de90-4d69-a1fa-be9fbb5e49a8'} is completed Feb 1 04:55:39 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 1 addresses Feb 1 04:55:39 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:39 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:39 localhost podman[311010]: 2026-02-01 09:55:39.456555449 +0000 UTC m=+0.054683344 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:39 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:39.495 2 INFO neutron.agent.securitygroups_rpc [None req-89d33254-82ae-4671-bea8-643bd9d50212 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da'] Feb 1 04:55:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:39.582 259320 INFO neutron.agent.dhcp.agent [None req-05054ea8-475b-46e9-902d-7c8f2e4ad9c9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:36Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=748f38c2-de90-4d69-a1fa-be9fbb5e49a8, ip_allocation=immediate, mac_address=fa:16:3e:33:88:81, name=tempest-AllowedAddressPairIpV6TestJSON-1557609490, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:34Z, description=, dns_domain=, id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-584258435, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf,
provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1666, status=ACTIVE, subnets=['79a98b59-9cef-4a21-ade2-0338fd52089f'], tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:35Z, vlan_transparent=None, network_id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4a3b0332-f824-4e4c-b1eb-cf09581851da'], standard_attr_id=1678, status=DOWN, tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:36Z on network 2e1250bd-5beb-49f3-a522-fdc3f21d998a Feb 1 04:55:39 localhost dnsmasq[310702]: exiting on receipt of SIGTERM Feb 1 04:55:39 localhost podman[311057]: 2026-02-01 09:55:39.724786929 +0000 UTC m=+0.074313157 container kill a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:39 localhost systemd[1]: tmp-crun.b40xya.mount: Deactivated successfully. Feb 1 04:55:39 localhost systemd[1]: libpod-a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7.scope: Deactivated successfully. Feb 1 04:55:39 localhost nova_compute[274651]: 2026-02-01 09:55:39.771 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:39 localhost podman[311084]: 2026-02-01 09:55:39.789567791 +0000 UTC m=+0.051872817 container died a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:39.810 259320 INFO neutron.agent.dhcp.agent [None req-b840133a-1fc0-4fa8-8987-9ae32743ec64 - - - - - -] DHCP configuration for ports {'f733571a-0fa4-4847-b42a-2c5a85bd0fed', '159944a0-107a-44b5-9f7a-6cd59423b2be', '748f38c2-de90-4d69-a1fa-be9fbb5e49a8'} is completed Feb 1 04:55:39 localhost podman[311084]: 2026-02-01 09:55:39.82464782 +0000 UTC m=+0.086952846 container cleanup a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3,
org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:39 localhost systemd[1]: libpod-conmon-a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7.scope: Deactivated successfully. Feb 1 04:55:39 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 1 addresses Feb 1 04:55:39 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:39 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:39 localhost podman[311075]: 2026-02-01 09:55:39.853845958 +0000 UTC m=+0.141898135 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:39 localhost podman[311089]: 2026-02-01 09:55:39.914327398 +0000 UTC m=+0.166750169 container remove a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 1 04:55:39 localhost ovn_controller[152492]: 2026-02-01T09:55:39Z|00185|binding|INFO|Releasing lport 14c342b0-1a14-4b13-af35-047c9630caf1 from this chassis (sb_readonly=0) Feb 1 04:55:39 localhost kernel: device tap14c342b0-1a left promiscuous mode Feb 1 04:55:39 localhost ovn_controller[152492]: 2026-02-01T09:55:39Z|00186|binding|INFO|Setting lport 14c342b0-1a14-4b13-af35-047c9630caf1 down in Southbound Feb 1 04:55:39 localhost nova_compute[274651]: 2026-02-01 09:55:39.927 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:39 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:39.933 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '',
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=14c342b0-1a14-4b13-af35-047c9630caf1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43 Feb 1 04:55:39 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:39.935 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 14c342b0-1a14-4b13-af35-047c9630caf1 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis Feb 1 04:55:39 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:39.936 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599 Feb 1 04:55:39 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:39.936 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[5e39a07e-2eaf-43d4-aea3-bb5294580ed8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501 Feb 1 04:55:39 localhost nova_compute[274651]: 2026-02-01 09:55:39.942 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Feb 1 04:55:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:39.972 259320 INFO neutron.agent.dhcp.agent [None req-05054ea8-475b-46e9-902d-7c8f2e4ad9c9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:36Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=0a2b2ca2-2b40-4601-8b8e-2c3d1a395f03, ip_allocation=immediate, mac_address=fa:16:3e:46:ce:40, name=tempest-AllowedAddressPairIpV6TestJSON-1762914005, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:34Z, description=, dns_domain=, id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-584258435, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1666, status=ACTIVE, subnets=['79a98b59-9cef-4a21-ade2-0338fd52089f'], tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:35Z, vlan_transparent=None, network_id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4a3b0332-f824-4e4c-b1eb-cf09581851da'], standard_attr_id=1693, status=DOWN, tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:37Z on network 2e1250bd-5beb-49f3-a522-fdc3f21d998a Feb 1 04:55:40 localhost neutron_dhcp_agent[259316]: 2026-02-01
09:55:40.111 259320 INFO neutron.agent.dhcp.agent [None req-a95a2fd9-7758-4b39-8f6b-227120f23474 - - - - - -] DHCP configuration for ports {'748f38c2-de90-4d69-a1fa-be9fbb5e49a8'} is completed#033[00m Feb 1 04:55:40 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 2 addresses Feb 1 04:55:40 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:40 localhost podman[311148]: 2026-02-01 09:55:40.121063757 +0000 UTC m=+0.045325925 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:40 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:40.180 259320 INFO neutron.agent.dhcp.agent [None req-57401273-6169-4d74-a38b-16b1151b26d3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:40.249 259320 INFO neutron.agent.dhcp.agent [None req-05054ea8-475b-46e9-902d-7c8f2e4ad9c9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:39Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=746a9476-ee86-4340-9c00-ad4aa72616e2, ip_allocation=immediate, mac_address=fa:16:3e:6f:20:2f, name=tempest-AllowedAddressPairIpV6TestJSON-1133633182, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:34Z, description=, dns_domain=, id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-584258435, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1666, status=ACTIVE, subnets=['79a98b59-9cef-4a21-ade2-0338fd52089f'], tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:35Z, vlan_transparent=None, network_id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4a3b0332-f824-4e4c-b1eb-cf09581851da'], standard_attr_id=1705, status=DOWN, tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:39Z on network 2e1250bd-5beb-49f3-a522-fdc3f21d998a#033[00m Feb 1 04:55:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:40.332 259320 INFO neutron.agent.dhcp.agent [None req-b7007bb0-91bd-427e-a846-e480bea498d3 - - - - - -] DHCP configuration for 
ports {'0a2b2ca2-2b40-4601-8b8e-2c3d1a395f03'} is completed#033[00m Feb 1 04:55:40 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 3 addresses Feb 1 04:55:40 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:40 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:40 localhost podman[311186]: 2026-02-01 09:55:40.389694789 +0000 UTC m=+0.050632738 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 04:55:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:40.608 259320 INFO neutron.agent.dhcp.agent [None req-c4030e95-9822-4302-b8ee-7351a7565657 - - - - - -] DHCP configuration for ports {'746a9476-ee86-4340-9c00-ad4aa72616e2'} is completed#033[00m Feb 1 04:55:40 localhost systemd[1]: tmp-crun.qzhixN.mount: Deactivated successfully. Feb 1 04:55:40 localhost systemd[1]: var-lib-containers-storage-overlay-ecf4ae779f4e79740f5a9ffff09737eaee8137b5f9f5e300f85460f598e5d81d-merged.mount: Deactivated successfully. Feb 1 04:55:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a77f6dda3befb9c46da003db6edf8e3735515317acde467f15dc5446ba6a2eb7-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:40 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. 
Feb 1 04:55:40 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 2 addresses
Feb 1 04:55:40 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:40 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:40 localhost podman[311226]: 2026-02-01 09:55:40.716163101 +0000 UTC m=+0.064122453 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 1 04:55:40 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:40.951 2 INFO neutron.agent.securitygroups_rpc [None req-f70d179e-59ca-44a3-8018-26f7c96f2b8c 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 1 04:55:41 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 1 addresses
Feb 1 04:55:41 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:41 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:41 localhost podman[311266]: 2026-02-01 09:55:41.175028884 +0000 UTC m=+0.060569124 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:55:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:41.399 259320 INFO neutron.agent.linux.ip_lib [None req-283ecaf1-c7a7-414a-a9a0-f35dafd72f7e - - - - - -] Device tap2ab368ce-ec cannot be used as it has no MAC address
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.457 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost kernel: device tap2ab368ce-ec entered promiscuous mode
Feb 1 04:55:41 localhost NetworkManager[5964]: [1769939741.4656] manager: (tap2ab368ce-ec): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.465 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost ovn_controller[152492]: 2026-02-01T09:55:41Z|00187|binding|INFO|Claiming lport 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 for this chassis.
Feb 1 04:55:41 localhost ovn_controller[152492]: 2026-02-01T09:55:41Z|00188|binding|INFO|2ab368ce-ec7c-4aa2-b6e7-d7cc66062583: Claiming unknown
Feb 1 04:55:41 localhost systemd-udevd[311298]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:55:41 localhost ovn_controller[152492]: 2026-02-01T09:55:41Z|00189|binding|INFO|Setting lport 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 ovn-installed in OVS
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.476 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.481 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost ovn_controller[152492]: 2026-02-01T09:55:41Z|00190|binding|INFO|Setting lport 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 up in Southbound
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.492 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2ab368ce-ec7c-4aa2-b6e7-d7cc66062583) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.494 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.497 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.498 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[37fc53f2-37bd-4d5a-a8e1-9e7b4f5ff73b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.503 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost journal[217584]: ethtool ioctl error on tap2ab368ce-ec: No such device
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.534 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost nova_compute[274651]: 2026-02-01 09:55:41.561 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:55:41 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:41.637 2 INFO neutron.agent.securitygroups_rpc [None req-3276d6b5-a480-4f28-8680-1993dd5ca124 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.717 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.718 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:55:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:41.718 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:55:41 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:41.851 2 INFO neutron.agent.securitygroups_rpc [None req-739b236d-6306-45a2-92f5-3e504f993767 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 1 04:55:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:41.908 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:41Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=982acfed-c4b5-4d8c-8eb2-71219ee772f3, ip_allocation=immediate, mac_address=fa:16:3e:42:c8:52, name=tempest-AllowedAddressPairIpV6TestJSON-1462258018, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:34Z, description=, dns_domain=, id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-584258435, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1666, status=ACTIVE, subnets=['79a98b59-9cef-4a21-ade2-0338fd52089f'], tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:35Z, vlan_transparent=None, network_id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4a3b0332-f824-4e4c-b1eb-cf09581851da'], standard_attr_id=1723, status=DOWN, tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:41Z on network 2e1250bd-5beb-49f3-a522-fdc3f21d998a
Feb 1 04:55:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:42.038 2 INFO neutron.agent.securitygroups_rpc [None req-3c4d812b-8cb5-4a19-9709-fc562d1570e9 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:55:42 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 2 addresses
Feb 1 04:55:42 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:42 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:42 localhost podman[311365]: 2026-02-01 09:55:42.09305566 +0000 UTC m=+0.044209601 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:55:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:42.284 259320 INFO neutron.agent.dhcp.agent [None req-a14f4d57-d3e2-4775-b383-3d63710c26d4 - - - - - -] DHCP configuration for ports {'982acfed-c4b5-4d8c-8eb2-71219ee772f3'} is completed
Feb 1 04:55:42 localhost podman[311409]:
Feb 1 04:55:42 localhost podman[311409]: 2026-02-01 09:55:42.371412351 +0000 UTC m=+0.079497216 container create 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 1 04:55:42 localhost systemd[1]: Started libpod-conmon-8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f.scope.
Feb 1 04:55:42 localhost podman[311409]: 2026-02-01 09:55:42.327540052 +0000 UTC m=+0.035624947 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:55:42 localhost systemd[1]: Started libcrun container.
Feb 1 04:55:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cc7d4692da300dcd61325f394a7c975e37bab2857164601bf300244d2838a203/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:55:42 localhost podman[311409]: 2026-02-01 09:55:42.449131612 +0000 UTC m=+0.157216447 container init 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:55:42 localhost podman[311409]: 2026-02-01 09:55:42.455495947 +0000 UTC m=+0.163580792 container start 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:55:42 localhost dnsmasq[311428]: started, version 2.85 cachesize 150
Feb 1 04:55:42 localhost dnsmasq[311428]: DNS service limited to local subnets
Feb 1 04:55:42 localhost dnsmasq[311428]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:55:42 localhost dnsmasq[311428]: warning: no upstream servers configured
Feb 1 04:55:42 localhost dnsmasq-dhcp[311428]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:55:42 localhost dnsmasq[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:55:42 localhost dnsmasq-dhcp[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:55:42 localhost dnsmasq-dhcp[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:55:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:42.517 259320 INFO neutron.agent.dhcp.agent [None req-283ecaf1-c7a7-414a-a9a0-f35dafd72f7e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:41Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e17bb609-c438-4096-8753-602545923839, ip_allocation=immediate, mac_address=fa:16:3e:5a:97:0c, name=tempest-NetworksTestDHCPv6-1507716077, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['b1041191-b329-489f-a6a2-f5fd8196bd3a'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:40Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1722, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:41Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:55:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:42.633 259320 INFO neutron.agent.dhcp.agent [None req-69d8213b-89f1-439b-a35c-861ed6630a86 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:55:42 localhost dnsmasq[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:55:42 localhost podman[311445]: 2026-02-01 09:55:42.668797729 +0000 UTC m=+0.054034384 container kill 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:55:42 localhost dnsmasq-dhcp[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:55:42 localhost dnsmasq-dhcp[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:55:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:42.887 2 INFO neutron.agent.securitygroups_rpc [None req-201a996d-ad5d-4320-ba39-e8954e21d5dc 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 1 04:55:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:42.914 259320 INFO neutron.agent.dhcp.agent [None req-8c6366e2-5017-4965-94e0-57c42358175d - - - - - -] DHCP configuration for ports {'e17bb609-c438-4096-8753-602545923839'} is completed
Feb 1 04:55:43 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:43.379 2 INFO neutron.agent.securitygroups_rpc [None req-37d62ac1-fed3-4baa-9fc5-73880f6c1760 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:55:43 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:43.443 2 INFO neutron.agent.securitygroups_rpc [None req-3fe18ee3-04ff-4a52-96de-1f4dd720f476 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 1 04:55:43 localhost systemd[1]: tmp-crun.fh9o9b.mount: Deactivated successfully.
Feb 1 04:55:43 localhost dnsmasq[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:55:43 localhost podman[311488]: 2026-02-01 09:55:43.649483952 +0000 UTC m=+0.119649602 container kill 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 1 04:55:43 localhost dnsmasq-dhcp[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:55:43 localhost dnsmasq-dhcp[311428]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:55:43 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 1 addresses
Feb 1 04:55:43 localhost podman[311513]: 2026-02-01 09:55:43.671787278 +0000 UTC m=+0.055223380 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 1 04:55:43 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:43 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:43 localhost nova_compute[274651]: 2026-02-01 09:55:43.788 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:43 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:43.906 2 INFO neutron.agent.securitygroups_rpc [None req-14278f14-d261-4185-9918-5bcd6670f17f 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 1 04:55:44 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:44.165 2 INFO neutron.agent.securitygroups_rpc [None req-edf8cab4-6686-4ad5-95d6-283face1fc91 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 1 04:55:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:44.237 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:43Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=36fb629b-d3e3-48dc-9089-93aa1cd4599c, ip_allocation=immediate, mac_address=fa:16:3e:f2:37:11, name=tempest-AllowedAddressPairIpV6TestJSON-479862114, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:34Z, description=, dns_domain=, id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-584258435, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1666, status=ACTIVE, subnets=['79a98b59-9cef-4a21-ade2-0338fd52089f'], tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:35Z, vlan_transparent=None, network_id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4a3b0332-f824-4e4c-b1eb-cf09581851da'], standard_attr_id=1746, status=DOWN, tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:43Z on network 2e1250bd-5beb-49f3-a522-fdc3f21d998a
Feb 1 04:55:44 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 2 addresses
Feb 1 04:55:44 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:44 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:44 localhost podman[311578]: 2026-02-01 09:55:44.400353736 +0000 UTC m=+0.040826387 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:55:44 localhost podman[311587]: 2026-02-01 09:55:44.414757909 +0000 UTC m=+0.044342375 container kill 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:55:44 localhost dnsmasq[311428]: exiting on receipt of SIGTERM
Feb 1 04:55:44 localhost systemd[1]: tmp-crun.rYyGVB.mount: Deactivated successfully.
Feb 1 04:55:44 localhost systemd[1]: libpod-8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f.scope: Deactivated successfully.
Feb 1 04:55:44 localhost podman[311610]: 2026-02-01 09:55:44.468269555 +0000 UTC m=+0.035556345 container died 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:55:44 localhost podman[311610]: 2026-02-01 09:55:44.501370383 +0000 UTC m=+0.068657123 container remove 8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:55:44 localhost ovn_controller[152492]: 2026-02-01T09:55:44Z|00191|binding|INFO|Releasing lport 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 from this chassis (sb_readonly=0)
Feb 1 04:55:44 localhost ovn_controller[152492]: 2026-02-01T09:55:44Z|00192|binding|INFO|Setting lport 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 down in Southbound
Feb 1 04:55:44 localhost kernel: device tap2ab368ce-ec left promiscuous mode
Feb 1 04:55:44 localhost nova_compute[274651]: 2026-02-01 09:55:44.565 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:44 localhost systemd[1]: libpod-conmon-8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f.scope: Deactivated successfully.
Feb 1 04:55:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:44.573 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2ab368ce-ec7c-4aa2-b6e7-d7cc66062583) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:55:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:44.574 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 2ab368ce-ec7c-4aa2-b6e7-d7cc66062583 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis
Feb 1 04:55:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:44.575 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:55:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:44.575 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[726e853a-d0e6-45af-8cd4-849b4bae0222]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:55:44 localhost nova_compute[274651]: 2026-02-01 09:55:44.580 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:44.660 259320 INFO neutron.agent.dhcp.agent [None req-70ebb863-4067-47ed-9813-137cea986deb - - - - - -] DHCP configuration for ports {'36fb629b-d3e3-48dc-9089-93aa1cd4599c'} is completed
Feb 1 04:55:44 localhost nova_compute[274651]: 2026-02-01 09:55:44.775 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:44 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:44.806 2 INFO neutron.agent.securitygroups_rpc [None req-6cb6b6f1-5b7d-4172-a864-3d0396afa917 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 1 04:55:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:44.865 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:44Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=314a620c-df7f-4896-be5b-48add962ea75, ip_allocation=immediate, mac_address=fa:16:3e:91:43:15, name=tempest-AllowedAddressPairIpV6TestJSON-133938525, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:34Z, description=, dns_domain=, id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairIpV6TestJSON-test-network-584258435, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1666, status=ACTIVE, subnets=['79a98b59-9cef-4a21-ade2-0338fd52089f'], tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:35Z, vlan_transparent=None, network_id=2e1250bd-5beb-49f3-a522-fdc3f21d998a, port_security_enabled=True, project_id=6419fd8b712b467ea6e03df22d411fcf, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4a3b0332-f824-4e4c-b1eb-cf09581851da'], standard_attr_id=1750, status=DOWN, tags=[], tenant_id=6419fd8b712b467ea6e03df22d411fcf, updated_at=2026-02-01T09:55:44Z on network 2e1250bd-5beb-49f3-a522-fdc3f21d998a
Feb 1 04:55:45 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:45.040 2 INFO neutron.agent.securitygroups_rpc [None req-889331f7-1b8f-45c7-9568-2acc0f065d63 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 1 04:55:45 localhost podman[311657]: 2026-02-01 09:55:45.05951315 +0000 UTC m=+0.057129378 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:55:45 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 3 addresses
Feb 1 04:55:45 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:45 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:45 localhost systemd[1]: var-lib-containers-storage-overlay-cc7d4692da300dcd61325f394a7c975e37bab2857164601bf300244d2838a203-merged.mount: Deactivated successfully.
Feb 1 04:55:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8effbee8e603ead8f7f712b87e548da5aa9dd775c3ca0b3a9e77df19a060fa4f-userdata-shm.mount: Deactivated successfully.
Feb 1 04:55:45 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully.
Feb 1 04:55:45 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:45.339 259320 INFO neutron.agent.dhcp.agent [None req-3ac45da9-48b7-43dd-9012-a34de29c177a - - - - - -] DHCP configuration for ports {'314a620c-df7f-4896-be5b-48add962ea75'} is completed
Feb 1 04:55:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:55:45 localhost podman[311679]: 2026-02-01 09:55:45.906337846 +0000 UTC m=+0.048728240 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter)
Feb 1 04:55:45 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:45.914 259320 INFO neutron.agent.linux.ip_lib [None req-86bcecf3-c7d0-4598-b417-93207d650b4d - - - - - -] Device tap7b25a07f-13 cannot be used as it has no MAC address
Feb 1 04:55:45 localhost podman[311679]: 2026-02-01 09:55:45.942336633 +0000 UTC m=+0.084727017 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 1 04:55:45 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:55:45 localhost nova_compute[274651]: 2026-02-01 09:55:45.966 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:45 localhost kernel: device tap7b25a07f-13 entered promiscuous mode
Feb 1 04:55:45 localhost NetworkManager[5964]: [1769939745.9726] manager: (tap7b25a07f-13): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Feb 1 04:55:45 localhost ovn_controller[152492]: 2026-02-01T09:55:45Z|00193|binding|INFO|Claiming lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e for this chassis.
Feb 1 04:55:45 localhost nova_compute[274651]: 2026-02-01 09:55:45.973 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:45 localhost ovn_controller[152492]: 2026-02-01T09:55:45Z|00194|binding|INFO|7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e: Claiming unknown
Feb 1 04:55:45 localhost systemd-udevd[311707]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:55:45 localhost ovn_controller[152492]: 2026-02-01T09:55:45Z|00195|binding|INFO|Setting lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e ovn-installed in OVS
Feb 1 04:55:45 localhost nova_compute[274651]: 2026-02-01 09:55:45.979 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:45 localhost nova_compute[274651]: 2026-02-01 09:55:45.980 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:45 localhost ovn_controller[152492]: 2026-02-01T09:55:45Z|00196|binding|INFO|Setting lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e up in Southbound
Feb 1 04:55:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:45.995 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:55:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:45.996 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis
Feb 1 04:55:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:45.997 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:55:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:45.998 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[1d9d6870-99c6-42e9-96b8-22c43dad8acc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:55:46 localhost nova_compute[274651]: 2026-02-01 09:55:46.011 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:46 localhost nova_compute[274651]: 2026-02-01 09:55:46.042 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:46 localhost nova_compute[274651]: 2026-02-01 09:55:46.067 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:55:46 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:46.161 2 INFO neutron.agent.securitygroups_rpc [None req-04f416d8-8fa2-4799-adf0-ff612b0eb9e5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:55:46 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:46.311 2 INFO neutron.agent.securitygroups_rpc [None req-31a351f9-c21a-4778-8d23-ceef24052c50 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 1 04:55:46 localhost systemd[1]: tmp-crun.ATbVSD.mount: Deactivated successfully.
Feb 1 04:55:46 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 2 addresses
Feb 1 04:55:46 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host
Feb 1 04:55:46 localhost podman[311758]: 2026-02-01 09:55:46.515604646 +0000 UTC m=+0.064993920 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:55:46 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts
Feb 1 04:55:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:55:46 localhost podman[311800]:
Feb 1 04:55:46 localhost podman[311800]: 2026-02-01 09:55:46.821558696 +0000 UTC m=+0.093955922 container create 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:55:46 localhost systemd[1]: Started libpod-conmon-8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7.scope.
Feb 1 04:55:46 localhost systemd[1]: Started libcrun container.
Feb 1 04:55:46 localhost podman[311800]: 2026-02-01 09:55:46.775700385 +0000 UTC m=+0.048097641 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9742d3985a4d1f674b0f9589c3a068cdd0a0c4f9f8add8334b09ab818ed5ffd4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:46 localhost podman[311800]: 2026-02-01 09:55:46.884852073 +0000 UTC m=+0.157249279 container init 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:55:46 localhost podman[311800]: 2026-02-01 09:55:46.894305533 +0000 UTC m=+0.166702739 container start 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:46 localhost dnsmasq[311818]: started, version 2.85 cachesize 150 Feb 1 04:55:46 localhost dnsmasq[311818]: DNS service limited to local subnets Feb 1 04:55:46 localhost dnsmasq[311818]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:46 localhost dnsmasq[311818]: warning: no upstream servers configured Feb 1 04:55:46 localhost dnsmasq[311818]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:46.951 259320 INFO neutron.agent.dhcp.agent [None req-86bcecf3-c7d0-4598-b417-93207d650b4d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:45Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4577ece8-1120-4dba-b725-aae07424f6ba, ip_allocation=immediate, mac_address=fa:16:3e:e1:45:ea, name=tempest-NetworksTestDHCPv6-719312461, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=10, router:external=False, shared=False, 
standard_attr_id=1596, status=ACTIVE, subnets=['d5b57ba4-0ee0-43ab-b2fb-a0148849996c'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:44Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1753, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:45Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:55:47 localhost dnsmasq[311818]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:55:47 localhost podman[311836]: 2026-02-01 09:55:47.108884783 +0000 UTC m=+0.056644994 container kill 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:47 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:47.129 259320 INFO neutron.agent.dhcp.agent [None req-22276e37-29a6-4570-b776-55f97013c13e - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:47 localhost ovn_controller[152492]: 2026-02-01T09:55:47Z|00197|binding|INFO|Releasing lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e from this chassis (sb_readonly=0) Feb 1 04:55:47 localhost nova_compute[274651]: 2026-02-01 09:55:47.396 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:47 localhost kernel: device tap7b25a07f-13 left promiscuous mode Feb 1 04:55:47 localhost ovn_controller[152492]: 2026-02-01T09:55:47Z|00198|binding|INFO|Setting lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e down in Southbound Feb 1 04:55:47 localhost nova_compute[274651]: 2026-02-01 09:55:47.416 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:47.416 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:47.417 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:55:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:47.418 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:47.418 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d9982a53-6dc5-440e-850e-e24b76c959ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:47 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:47.482 259320 INFO neutron.agent.dhcp.agent [None req-80f49945-e278-48f2-a257-e048eaa1075a - - - - - -] DHCP configuration for ports {'4577ece8-1120-4dba-b725-aae07424f6ba'} is completed#033[00m Feb 1 04:55:47 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:47.563 2 INFO neutron.agent.securitygroups_rpc [None req-33ff6921-4704-427a-80ac-43e95e4fc8cf 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']#033[00m Feb 1 04:55:47 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 1 addresses Feb 1 04:55:47 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:47 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:47 localhost podman[311877]: 2026-02-01 09:55:47.754261323 +0000 UTC m=+0.057305194 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:47 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:47.855 2 INFO neutron.agent.securitygroups_rpc [None req-d0c90d7d-7396-4a27-b056-110260352268 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:48 localhost podman[311914]: 2026-02-01 09:55:48.045024576 +0000 UTC m=+0.058512741 container kill 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 1 04:55:48 localhost dnsmasq[311818]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for cba39058-6a05-4f77-add1-57334b728a66.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap7b25a07f-13 not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66. Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent return fut.result() Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent raise self._exception Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent 
neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap7b25a07f-13 not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66. Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.069 259320 ERROR neutron.agent.dhcp.agent #033[00m Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.073 259320 INFO neutron.agent.dhcp.agent [None req-8aa551ff-bd0c-45a3-a55c-41cb8d378902 - - - - - -] Synchronizing state#033[00m Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.540 259320 INFO neutron.agent.dhcp.agent [None req-2f6adfb1-1bda-422c-9393-1cb6112cd881 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.541 259320 INFO neutron.agent.dhcp.agent [-] Starting network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:55:48 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:48.666 2 INFO neutron.agent.securitygroups_rpc [None req-08f39597-7736-4b6c-bf06-18c33436307c 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']#033[00m Feb 1 04:55:48 localhost dnsmasq[311818]: exiting on receipt of SIGTERM Feb 1 04:55:48 localhost systemd[1]: libpod-8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7.scope: Deactivated successfully. Feb 1 04:55:48 localhost podman[311943]: 2026-02-01 09:55:48.713062273 +0000 UTC m=+0.060484121 container kill 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:48 localhost podman[311956]: 2026-02-01 09:55:48.778582719 +0000 UTC m=+0.047193223 container died 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:48 localhost systemd[1]: tmp-crun.igs1SE.mount: Deactivated successfully. 
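[Editor's note] The traceback above captures a race: tap7b25a07f-13 was deleted from namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66 between the port rebind and the route lookup, so the privileged list_ip_routes call — which, per the traceback, runs under a tenacity retry wrapper and an oslo.privsep remote call — ultimately re-raises NetworkInterfaceNotFound to the agent, which then falls back to a full state resync ("Synchronizing state" immediately after). Below is a minimal sketch of that retry/re-raise pattern; the exception class and fetch_routes function are simplified stand-ins, not neutron's real implementation.

```python
# Minimal sketch of the tenacity retry/re-raise pattern visible in the
# traceback. fetch_routes stands in for neutron's privileged list_ip_routes;
# here the device never reappears, so every attempt fails and tenacity
# re-raises the last exception to the caller.
from tenacity import (retry, retry_if_exception_type,
                      stop_after_attempt, wait_fixed)

class NetworkInterfaceNotFound(RuntimeError):
    pass

@retry(retry=retry_if_exception_type(NetworkInterfaceNotFound),
       stop=stop_after_attempt(3), wait=wait_fixed(0.2), reraise=True)
def fetch_routes(namespace, device):
    # The tap device was removed concurrently, so the lookup always fails.
    raise NetworkInterfaceNotFound(
        f"Network interface {device} not found in namespace {namespace}")

try:
    fetch_routes("qdhcp-cba39058-6a05-4f77-add1-57334b728a66",
                 "tap7b25a07f-13")
except NetworkInterfaceNotFound as exc:
    # In the agent this surfaces as the 'Unable to reload_allocations'
    # error above, followed by a full resynchronization.
    print(exc)
```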
Feb 1 04:55:48 localhost podman[311956]: 2026-02-01 09:55:48.830259088 +0000 UTC m=+0.098869602 container remove 8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:55:48 localhost systemd[1]: libpod-conmon-8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7.scope: Deactivated successfully. Feb 1 04:55:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:48.897 259320 INFO neutron.agent.linux.ip_lib [-] Device tap7b25a07f-13 cannot be used as it has no MAC address#033[00m Feb 1 04:55:48 localhost nova_compute[274651]: 2026-02-01 09:55:48.920 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:48 localhost kernel: device tap7b25a07f-13 entered promiscuous mode Feb 1 04:55:48 localhost ovn_controller[152492]: 2026-02-01T09:55:48Z|00199|binding|INFO|Claiming lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e for this chassis. Feb 1 04:55:48 localhost ovn_controller[152492]: 2026-02-01T09:55:48Z|00200|binding|INFO|7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e: Claiming unknown Feb 1 04:55:48 localhost systemd-udevd[311709]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:55:48 localhost NetworkManager[5964]: [1769939748.9328] manager: (tap7b25a07f-13): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Feb 1 04:55:48 localhost nova_compute[274651]: 2026-02-01 09:55:48.933 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:48.943 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:48 localhost ovn_controller[152492]: 
2026-02-01T09:55:48Z|00201|binding|INFO|Setting lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e up in Southbound Feb 1 04:55:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:48.945 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:55:48 localhost nova_compute[274651]: 2026-02-01 09:55:48.946 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:48 localhost ovn_controller[152492]: 2026-02-01T09:55:48Z|00202|binding|INFO|Setting lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e ovn-installed in OVS Feb 1 04:55:48 localhost nova_compute[274651]: 2026-02-01 09:55:48.948 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:48.948 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:48 localhost nova_compute[274651]: 2026-02-01 09:55:48.951 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:48.950 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[92e06609-8955-462f-a160-8bb26ef998c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:48 localhost nova_compute[274651]: 2026-02-01 09:55:48.969 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:49 localhost nova_compute[274651]: 2026-02-01 09:55:49.009 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:49 localhost nova_compute[274651]: 2026-02-01 09:55:49.029 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:49 localhost ovn_controller[152492]: 2026-02-01T09:55:49Z|00203|binding|INFO|Removing iface tapf733571a-0f ovn-installed in OVS Feb 1 04:55:49 localhost ovn_controller[152492]: 2026-02-01T09:55:49Z|00204|binding|INFO|Removing lport f733571a-0fa4-4847-b42a-2c5a85bd0fed ovn-installed in OVS Feb 1 04:55:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:49.663 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port e3184231-aec9-4146-a8d3-1dfa3b8bd59a with type ""#033[00m Feb 1 04:55:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:49.664 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 
'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-2e1250bd-5beb-49f3-a522-fdc3f21d998a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e1250bd-5beb-49f3-a522-fdc3f21d998a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6419fd8b712b467ea6e03df22d411fcf', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a14101d-16e6-4045-94e9-474da6ab8d50, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f733571a-0fa4-4847-b42a-2c5a85bd0fed) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:49.665 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f733571a-0fa4-4847-b42a-2c5a85bd0fed in datapath 2e1250bd-5beb-49f3-a522-fdc3f21d998a unbound from our chassis#033[00m Feb 1 04:55:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:49.665 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2e1250bd-5beb-49f3-a522-fdc3f21d998a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:49 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:49.709 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[196cf260-90a3-423f-a894-5d2b9993e52f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:49 localhost systemd[1]: var-lib-containers-storage-overlay-9742d3985a4d1f674b0f9589c3a068cdd0a0c4f9f8add8334b09ab818ed5ffd4-merged.mount: Deactivated successfully. Feb 1 04:55:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8052fd56faa01c3414849734ffd79f87b74b1691ac3a6eac1a8d199e9e0c11d7-userdata-shm.mount: Deactivated successfully. 
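[Editor's note] The "Matched UPDATE/DELETE: PortBinding...Event" lines that ovn_metadata_agent logs come from ovsdbapp's row-event machinery — the file path in each entry (ovsdbapp/backend/ovs_idl/event.py:43) is the matches() hook deciding whether a Port_Binding row change is offered to a handler. A hedged sketch of how such an event class is declared follows; the class body is a simplified stand-in for neutron's real handler, not a copy of it.

```python
# Illustrative sketch of the ovsdbapp row-event pattern behind the
# 'Matched UPDATE: PortBindingUpdatedEvent(...)' entries above.
from ovsdbapp.backend.ovs_idl import event as row_event

class PortBindingUpdatedEvent(row_event.RowEvent):
    def __init__(self):
        # Fire on updates to Southbound Port_Binding rows; conditions=None
        # means every changed row is offered to the match check, as the
        # repr in the log (conditions=None, old_conditions=None) shows.
        super().__init__((self.ROW_UPDATE,), 'Port_Binding', None)

    def run(self, event, row, old):
        # Invoked once a row matches; the real agent provisions or tears
        # down the metadata namespace for the row's datapath here.
        print(f"port {row.logical_port} changed, chassis={row.chassis}")
```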
Feb 1 04:55:49 localhost nova_compute[274651]: 2026-02-01 09:55:49.709 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:49 localhost podman[312037]: Feb 1 04:55:49 localhost podman[312037]: 2026-02-01 09:55:49.767442593 +0000 UTC m=+0.117145924 container create 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:49 localhost nova_compute[274651]: 2026-02-01 09:55:49.778 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:49 localhost nova_compute[274651]: 2026-02-01 09:55:49.782 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:49 localhost systemd[1]: Started libpod-conmon-51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101.scope. Feb 1 04:55:49 localhost podman[312037]: 2026-02-01 09:55:49.725559155 +0000 UTC m=+0.075262526 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:49 localhost systemd[1]: Started libcrun container. Feb 1 04:55:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/271df74e67fa9497d7bc9573e420f8513ab19a6f67e3fbb9e068b46d316802b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:49 localhost podman[312037]: 2026-02-01 09:55:49.839798168 +0000 UTC m=+0.189501509 container init 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:55:49 localhost podman[312037]: 2026-02-01 09:55:49.849715373 +0000 UTC m=+0.199418714 container start 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:49 localhost dnsmasq[312056]: started, version 2.85 cachesize 150 Feb 1 04:55:49 localhost dnsmasq[312056]: DNS service limited to local subnets Feb 1 04:55:49 localhost dnsmasq[312056]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua 
TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:49 localhost dnsmasq[312056]: warning: no upstream servers configured Feb 1 04:55:49 localhost dnsmasq[312056]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:49.918 259320 INFO neutron.agent.dhcp.agent [-] Finished network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:55:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:49.918 259320 INFO neutron.agent.dhcp.agent [None req-2f6adfb1-1bda-422c-9393-1cb6112cd881 - - - - - -] Synchronizing state complete#033[00m Feb 1 04:55:50 localhost dnsmasq[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/addn_hosts - 0 addresses Feb 1 04:55:50 localhost podman[312074]: 2026-02-01 09:55:50.080569673 +0000 UTC m=+0.047852152 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:50 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/host Feb 1 04:55:50 localhost dnsmasq-dhcp[310983]: read /var/lib/neutron/dhcp/2e1250bd-5beb-49f3-a522-fdc3f21d998a/opts Feb 1 04:55:50 localhost dnsmasq[312056]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:50 localhost podman[312107]: 2026-02-01 09:55:50.190011009 +0000 UTC m=+0.036233965 container kill 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:55:50 localhost ovn_controller[152492]: 2026-02-01T09:55:50Z|00205|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:55:50 localhost nova_compute[274651]: 2026-02-01 09:55:50.282 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.360 259320 INFO neutron.agent.dhcp.agent [None req-e7a0f219-cd4e-49ac-ba22-0fbf96515a85 - - - - - -] DHCP configuration for ports {'7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:50 localhost dnsmasq[310983]: exiting on receipt of SIGTERM Feb 1 04:55:50 localhost podman[312149]: 2026-02-01 09:55:50.408235222 +0000 UTC m=+0.041867439 container kill 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:50 localhost systemd[1]: libpod-023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09.scope: Deactivated successfully. Feb 1 04:55:50 localhost podman[312165]: 2026-02-01 09:55:50.474523761 +0000 UTC m=+0.048986398 container died 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:50 localhost podman[312165]: 2026-02-01 09:55:50.511981062 +0000 UTC m=+0.086443659 container remove 023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e1250bd-5beb-49f3-a522-fdc3f21d998a, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:55:50 localhost nova_compute[274651]: 2026-02-01 09:55:50.522 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:50 localhost kernel: device tapf733571a-0f left promiscuous mode Feb 1 04:55:50 localhost systemd[1]: libpod-conmon-023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09.scope: Deactivated successfully. 
Feb 1 04:55:50 localhost nova_compute[274651]: 2026-02-01 09:55:50.535 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.585 259320 INFO neutron.agent.dhcp.agent [None req-16ac184d-40b9-4228-a229-b48a63a6083e - - - - - -] DHCP configuration for ports {'7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:50 localhost dnsmasq[312056]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:50 localhost podman[312209]: 2026-02-01 09:55:50.640522956 +0000 UTC m=+0.060937525 container kill 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:55:50 localhost systemd[1]: tmp-crun.XtvPcp.mount: Deactivated successfully. Feb 1 04:55:50 localhost systemd[1]: var-lib-containers-storage-overlay-e60b2c502ae3d187eeefe6d6b689d8381d3526a09dd20e18ffc2955248f020ad-merged.mount: Deactivated successfully. Feb 1 04:55:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-023791efbbd9645c0f47dccf41d79e3e7281731f49c5ce5202e887f4940b2f09-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:50 localhost systemd[1]: run-netns-qdhcp\x2d2e1250bd\x2d5beb\x2d49f3\x2da522\x2dfdc3f21d998a.mount: Deactivated successfully. 
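[Editor's note] The run-netns-qdhcp\x2d... mount deactivation above marks the DHCP namespace for network 2e1250bd-5beb-49f3-a522-fdc3f21d998a being torn down. When chasing the NetworkInterfaceNotFound race seen earlier, it can help to check directly whether a tap device still exists inside its qdhcp namespace. A debugging sketch follows; it assumes iproute2 and root privileges on the host, and the namespace/device names are taken from the log above purely as examples.

```python
# Debugging sketch: verify whether a DHCP tap device is still present in its
# qdhcp namespace, mirroring the lookup that raised NetworkInterfaceNotFound
# in the traceback earlier. Requires iproute2 and root.
import subprocess

ns = "qdhcp-cba39058-6a05-4f77-add1-57334b728a66"
dev = "tap7b25a07f-13"

res = subprocess.run(
    ["ip", "netns", "exec", ns, "ip", "-o", "link", "show", "dev", dev],
    capture_output=True, text=True)
if res.returncode == 0:
    print(res.stdout.strip())   # device is present in the namespace
else:
    print(f"{dev} missing from {ns}: {res.stderr.strip()}")
```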
Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.775 259320 INFO neutron.agent.dhcp.agent [None req-2f6adfb1-1bda-422c-9393-1cb6112cd881 - - - - - -] Synchronizing state#033[00m Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.954 259320 INFO neutron.agent.dhcp.agent [None req-0da50a14-ab88-4b6b-8548-b9d4a73fa8d2 - - - - - -] DHCP configuration for ports {'7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.979 259320 INFO neutron.agent.dhcp.agent [None req-571afd74-fc0e-4393-b668-cf67d7904cb6 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.994 259320 INFO neutron.agent.dhcp.agent [None req-f47f7f16-2db8-44ac-9a66-9f5941ce981c - - - - - -] Synchronizing state complete#033[00m Feb 1 04:55:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:50.995 259320 INFO neutron.agent.dhcp.agent [None req-84596f07-effa-4ed4-9a9a-4445e6740f09 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:55:51 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:51.310 2 INFO neutron.agent.securitygroups_rpc [None req-a98fbbc3-cb77-458d-bae0-4950f59446e4 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.372 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.372 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.372 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:55:51 localhost nova_compute[274651]: 2026-02-01 09:55:51.372 274655 DEBUG nova.objects.instance 
[None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:55:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:51.376 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:50Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=94fbcd05-64d6-411b-936b-68be7b0ec7d5, ip_allocation=immediate, mac_address=fa:16:3e:ef:88:2c, name=tempest-NetworksTestDHCPv6-1624331132, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['7f2caf1c-724c-483b-bea9-15ffa260cf68'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:49Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1783, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:51Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:55:51 localhost podman[312248]: 2026-02-01 09:55:51.568738016 +0000 UTC m=+0.056516440 container kill 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:55:51 localhost dnsmasq[312056]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:55:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:51.806 259320 INFO neutron.agent.dhcp.agent [None req-fc609270-878a-4398-b41c-a4986900262f - - - - - -] DHCP configuration for ports {'94fbcd05-64d6-411b-936b-68be7b0ec7d5'} is completed#033[00m Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.110 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": 
"09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.128 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.128 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.129 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:52 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:52.134 2 INFO neutron.agent.securitygroups_rpc [None req-0dd8b6b5-2ecb-4256-a677-3e4c95ec3623 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:52 localhost podman[312284]: 2026-02-01 09:55:52.306112265 +0000 UTC m=+0.048680308 container kill 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:55:52 localhost dnsmasq[312056]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:52 localhost ovn_controller[152492]: 2026-02-01T09:55:52Z|00206|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.752 
274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:52.845 259320 INFO neutron.agent.linux.ip_lib [None req-38c129e1-597f-4d30-a4e0-d6cd113a4e48 - - - - - -] Device tapf9a6dcb6-d4 cannot be used as it has no MAC address#033[00m Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.869 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:52 localhost kernel: device tapf9a6dcb6-d4 entered promiscuous mode Feb 1 04:55:52 localhost NetworkManager[5964]: [1769939752.8764] manager: (tapf9a6dcb6-d4): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Feb 1 04:55:52 localhost ovn_controller[152492]: 2026-02-01T09:55:52Z|00207|binding|INFO|Claiming lport f9a6dcb6-d460-4dab-9910-ecc3d47fa694 for this chassis. Feb 1 04:55:52 localhost ovn_controller[152492]: 2026-02-01T09:55:52Z|00208|binding|INFO|f9a6dcb6-d460-4dab-9910-ecc3d47fa694: Claiming unknown Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.879 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:52 localhost systemd-udevd[312323]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:55:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:52.889 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d7ee746-bdf6-4729-b69a-ed9e0d539761, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f9a6dcb6-d460-4dab-9910-ecc3d47fa694) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:52.891 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f9a6dcb6-d460-4dab-9910-ecc3d47fa694 in datapath 1e8b1bba-e8b9-4795-804e-ff4f5e0f095e bound to our chassis#033[00m Feb 1 04:55:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:52.892 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1e8b1bba-e8b9-4795-804e-ff4f5e0f095e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:52 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:55:52.895 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a74848-9643-4bd0-a0d2-b58c319c77c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:52 localhost ovn_controller[152492]: 2026-02-01T09:55:52Z|00209|binding|INFO|Setting lport f9a6dcb6-d460-4dab-9910-ecc3d47fa694 ovn-installed in OVS Feb 1 04:55:52 localhost ovn_controller[152492]: 2026-02-01T09:55:52Z|00210|binding|INFO|Setting lport f9a6dcb6-d460-4dab-9910-ecc3d47fa694 up in Southbound Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.902 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.939 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:52 localhost journal[217584]: ethtool ioctl error on tapf9a6dcb6-d4: No such device Feb 1 04:55:52 localhost nova_compute[274651]: 2026-02-01 09:55:52.970 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:52 localhost dnsmasq[312056]: exiting on receipt of SIGTERM Feb 1 04:55:52 localhost systemd[1]: libpod-51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101.scope: Deactivated successfully. 
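[Editor's note] The nova_compute entries a few lines above ("Running periodic task ComputeManager._heal_instance_info_cache ... run_periodic_tasks") show oslo.service's periodic-task machinery driving the info-cache heal. Below is a hedged sketch of that decorator pattern; HealManager and its spacing value are illustrative, not nova's actual class or configuration.

```python
# Hedged sketch of the oslo.service periodic-task pattern behind the
# '_heal_instance_info_cache' entries above. HealManager is illustrative.
from oslo_config import cfg
from oslo_service import periodic_task

class HealManager(periodic_task.PeriodicTasks):
    @periodic_task.periodic_task(spacing=60, run_immediately=True)
    def _heal_instance_info_cache(self, context):
        # nova picks one instance per pass and refreshes its network info
        # cache from neutron; here we only log the invocation.
        print("refreshing one instance's network info cache")

manager = HealManager(cfg.CONF)
# run_periodic_tasks() executes whichever tasks are due and returns the
# idle time until the next one; a service loop calls it repeatedly.
manager.run_periodic_tasks(context=None)
```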
Feb 1 04:55:52 localhost podman[312354]: 2026-02-01 09:55:52.9971765 +0000 UTC m=+0.056032514 container kill 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:53 localhost podman[312332]: 2026-02-01 09:55:53.049102187 +0000 UTC m=+0.130387681 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:53 localhost podman[312332]: 2026-02-01 09:55:53.057738613 +0000 UTC m=+0.139024107 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 
'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 1 04:55:53 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:55:53 localhost podman[312378]: 2026-02-01 09:55:53.088451578 +0000 UTC m=+0.081948872 container died 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:53 localhost podman[312378]: 2026-02-01 09:55:53.165175147 +0000 UTC m=+0.158672451 container cleanup 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 1 04:55:53 localhost systemd[1]: libpod-conmon-51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101.scope: Deactivated successfully. 
Feb 1 04:55:53 localhost podman[312380]: 2026-02-01 09:55:53.186371609 +0000 UTC m=+0.172472266 container remove 51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 04:55:53 localhost nova_compute[274651]: 2026-02-01 09:55:53.205 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:53 localhost kernel: device tap7b25a07f-13 left promiscuous mode Feb 1 04:55:53 localhost ovn_controller[152492]: 2026-02-01T09:55:53Z|00211|binding|INFO|Releasing lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e from this chassis (sb_readonly=0) Feb 1 04:55:53 localhost ovn_controller[152492]: 2026-02-01T09:55:53Z|00212|binding|INFO|Setting lport 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e down in Southbound Feb 1 04:55:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:53.220 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:53.222 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7b25a07f-1360-41ed-a5e8-5dc2b4ceb09e in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:55:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:53.222 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:53 localhost nova_compute[274651]: 2026-02-01 09:55:53.228 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:53.228 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[6eae6282-a22d-4e0b-bc40-e18b3f930714]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:53 localhost nova_compute[274651]: 2026-02-01 09:55:53.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:53 localhost nova_compute[274651]: 2026-02-01 09:55:53.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:55:53 localhost nova_compute[274651]: 2026-02-01 09:55:53.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:53 localhost systemd[1]: var-lib-containers-storage-overlay-271df74e67fa9497d7bc9573e420f8513ab19a6f67e3fbb9e068b46d316802b9-merged.mount: Deactivated successfully. Feb 1 04:55:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-51c0c52e3e4ed07dbefd2bd850d3338c7055cc9c0004a9aa8259ed388216a101-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:53 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. Feb 1 04:55:53 localhost podman[312461]: Feb 1 04:55:53 localhost podman[312461]: 2026-02-01 09:55:53.823124114 +0000 UTC m=+0.089520895 container create e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:55:53 localhost systemd[1]: Started libpod-conmon-e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8.scope. Feb 1 04:55:53 localhost podman[312461]: 2026-02-01 09:55:53.78071794 +0000 UTC m=+0.047114761 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:53 localhost systemd[1]: Started libcrun container. 
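The pair of lines above — "Running periodic task ComputeManager._reclaim_queued_deletes" followed immediately by "CONF.reclaim_instance_interval <= 0, skipping" — is the standard oslo.service periodic-task guard: the task is always scheduled, but bails out when its config knob disables it. A minimal sketch of that pattern, assuming only what the two log lines show (the real machinery lives in oslo_service.periodic_task):

    # Stand-in for the periodic-task skip guard logged above.
    CONF = {'reclaim_instance_interval': 0}   # <= 0 disables reclaiming

    def _reclaim_queued_deletes():
        if CONF['reclaim_instance_interval'] <= 0:
            print('CONF.reclaim_instance_interval <= 0, skipping...')
            return
        # ...would reclaim SOFT_DELETED instances older than the interval here

    def run_periodic_tasks(tasks):
        for task in tasks:
            print(f'Running periodic task ComputeManager.{task.__name__}')
            task()

    run_periodic_tasks([_reclaim_queued_deletes])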
Feb 1 04:55:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14eeb768cd5f5ee5a15fbfa973cb5a7f11aa2d43bfb880403de94f716b9f01f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:53 localhost podman[312461]: 2026-02-01 09:55:53.897125791 +0000 UTC m=+0.163522572 container init e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 1 04:55:53 localhost podman[312461]: 2026-02-01 09:55:53.906160718 +0000 UTC m=+0.172557499 container start e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:55:53 localhost dnsmasq[312479]: started, version 2.85 cachesize 150 Feb 1 04:55:53 localhost dnsmasq[312479]: DNS service limited to local subnets Feb 1 04:55:53 localhost dnsmasq[312479]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:53 localhost dnsmasq[312479]: warning: no upstream servers configured Feb 1 04:55:53 localhost dnsmasq-dhcp[312479]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:55:53 localhost dnsmasq[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/addn_hosts - 0 addresses Feb 1 04:55:53 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/host Feb 1 04:55:53 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/opts Feb 1 04:55:53 localhost podman[236886]: time="2026-02-01T09:55:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:55:53 localhost podman[236886]: @ - - [01/Feb/2026:09:55:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158354 "" "Go-http-client/1.1" Feb 1 04:55:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:53.998 259320 INFO neutron.agent.dhcp.agent [None req-38c129e1-597f-4d30-a4e0-d6cd113a4e48 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:52Z, description=, device_id=ca4d5fd2-fcc5-4bbf-84e5-6e063f1f23d4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=06015f68-97a5-4ab1-8708-efc4c051bacd, ip_allocation=immediate, mac_address=fa:16:3e:eb:15:18, name=, network=admin_state_up=True, 
availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:49Z, description=, dns_domain=, id=1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1255133873, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55330, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1775, status=ACTIVE, subnets=['a54714e5-eebf-47d1-88e3-6188fd4a96f7'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:51Z, vlan_transparent=None, network_id=1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1801, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:52Z on network 1e8b1bba-e8b9-4795-804e-ff4f5e0f095e#033[00m Feb 1 04:55:54 localhost podman[236886]: @ - - [01/Feb/2026:09:55:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19298 "" "Go-http-client/1.1" Feb 1 04:55:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:54.128 259320 INFO neutron.agent.dhcp.agent [None req-1d8f9b83-e2c5-40bf-ba49-794385e0a4db - - - - - -] DHCP configuration for ports {'ed517eb2-5d78-4d1c-a575-1deae9c2b8cd'} is completed#033[00m Feb 1 04:55:54 localhost dnsmasq[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/addn_hosts - 1 addresses Feb 1 04:55:54 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/host Feb 1 04:55:54 localhost podman[312496]: 2026-02-01 09:55:54.172972444 +0000 UTC m=+0.045072257 container kill e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:55:54 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/opts Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.288 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.289 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.289 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.302 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:54.349 259320 INFO neutron.agent.dhcp.agent [None req-38c129e1-597f-4d30-a4e0-d6cd113a4e48 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:52Z, description=, device_id=ca4d5fd2-fcc5-4bbf-84e5-6e063f1f23d4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=06015f68-97a5-4ab1-8708-efc4c051bacd, ip_allocation=immediate, mac_address=fa:16:3e:eb:15:18, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:49Z, description=, dns_domain=, id=1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1255133873, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55330, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1775, status=ACTIVE, subnets=['a54714e5-eebf-47d1-88e3-6188fd4a96f7'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:51Z, vlan_transparent=None, network_id=1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1801, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:52Z on network 1e8b1bba-e8b9-4795-804e-ff4f5e0f095e#033[00m Feb 1 04:55:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:54.360 259320 INFO neutron.agent.linux.ip_lib [None req-d692d2e0-f872-4cb5-abd7-ae3f2696a68a - - - - - -] Device tapa3b91bd3-29 cannot be used as it has no MAC address#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.383 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost kernel: device tapa3b91bd3-29 entered promiscuous mode Feb 1 04:55:54 localhost NetworkManager[5964]: [1769939754.3911] manager: (tapa3b91bd3-29): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.393 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost ovn_controller[152492]: 2026-02-01T09:55:54Z|00213|binding|INFO|Claiming lport a3b91bd3-29bd-47a8-bcbf-515c30f3539b for this chassis. 
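Each "read /var/lib/neutron/dhcp/<network-id>/{addn_hosts,host,opts}" burst above is dnsmasq re-reading the per-network files the DHCP agent just rewrote; the "container kill" events bracketing each re-read are consistent with the agent signalling the containerized dnsmasq (SIGHUP makes dnsmasq re-read its host files) rather than stopping it. A hedged sketch of that interaction — the file contents below are illustrative of the formats only, not taken from this system:

    import os, signal, pathlib

    net_dir = pathlib.Path('/tmp/dhcp-demo/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e')
    net_dir.mkdir(parents=True, exist_ok=True)

    # Illustrative dnsmasq inputs (made-up lease data, format per dnsmasq docs).
    (net_dir / 'host').write_text(
        'fa:16:3e:eb:15:18,host-2001-db8--2.openstacklocal,[2001:db8::2]\n')
    (net_dir / 'addn_hosts').write_text(
        '2001:db8::2\thost-2001-db8--2.openstacklocal host-2001-db8--2\n')
    (net_dir / 'opts').write_text(
        'tag:subnet-a54714e5-eebf-47d1-88e3-6188fd4a96f7,'
        'option6:domain-search,openstacklocal\n')

    def reload_dnsmasq(pid):
        # SIGHUP tells dnsmasq to clear its cache and re-read host files,
        # producing the "read .../addn_hosts - N addresses" lines above.
        os.kill(pid, signal.SIGHUP)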
Feb 1 04:55:54 localhost ovn_controller[152492]: 2026-02-01T09:55:54Z|00214|binding|INFO|a3b91bd3-29bd-47a8-bcbf-515c30f3539b: Claiming unknown Feb 1 04:55:54 localhost ovn_controller[152492]: 2026-02-01T09:55:54Z|00215|binding|INFO|Setting lport a3b91bd3-29bd-47a8-bcbf-515c30f3539b ovn-installed in OVS Feb 1 04:55:54 localhost ovn_controller[152492]: 2026-02-01T09:55:54Z|00216|binding|INFO|Setting lport a3b91bd3-29bd-47a8-bcbf-515c30f3539b up in Southbound Feb 1 04:55:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:54.402 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a3b91bd3-29bd-47a8-bcbf-515c30f3539b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:54.403 158365 INFO neutron.agent.ovn.metadata.agent [-] Port a3b91bd3-29bd-47a8-bcbf-515c30f3539b in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:55:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:54.405 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:54.406 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[84ee6e10-e93f-4481-be49-309aa1b219b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.428 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:54.436 2 INFO neutron.agent.securitygroups_rpc [None req-e7867791-cd65-411f-8ed6-a1d97d2d0b42 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:54 
localhost nova_compute[274651]: 2026-02-01 09:55:54.459 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:54.475 259320 INFO neutron.agent.dhcp.agent [None req-3d94ed27-c661-48c4-b668-f07e0882227e - - - - - -] DHCP configuration for ports {'06015f68-97a5-4ab1-8708-efc4c051bacd'} is completed#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.483 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:54 localhost systemd[1]: tmp-crun.Y8sCsK.mount: Deactivated successfully. Feb 1 04:55:54 localhost dnsmasq[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/addn_hosts - 1 addresses Feb 1 04:55:54 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/host Feb 1 04:55:54 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/opts Feb 1 04:55:54 localhost podman[312549]: 2026-02-01 09:55:54.50769742 +0000 UTC m=+0.044012205 container kill e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:55:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:54.744 259320 INFO neutron.agent.dhcp.agent [None req-6167b211-a858-4f25-bd79-bc48b9ff9a91 - - - - - -] DHCP configuration for ports {'06015f68-97a5-4ab1-8708-efc4c051bacd'} is completed#033[00m Feb 1 04:55:54 localhost nova_compute[274651]: 2026-02-01 09:55:54.783 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:55 localhost nova_compute[274651]: 2026-02-01 09:55:55.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:55 localhost nova_compute[274651]: 2026-02-01 09:55:55.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 1 04:55:55 localhost podman[312618]: Feb 1 04:55:55 localhost podman[312618]: 2026-02-01 09:55:55.28314603 +0000 UTC m=+0.082730575 container create 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:55:55 localhost systemd[1]: Started libpod-conmon-8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013.scope. Feb 1 04:55:55 localhost systemd[1]: Started libcrun container. Feb 1 04:55:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3736d80e60e19cc69bfccbfe8202539fbe2084ec925198688c813931732e54a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:55 localhost podman[312618]: 2026-02-01 09:55:55.247223016 +0000 UTC m=+0.046807581 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:55 localhost podman[312618]: 2026-02-01 09:55:55.350859133 +0000 UTC m=+0.150443708 container init 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:55 localhost podman[312618]: 2026-02-01 09:55:55.365672389 +0000 UTC m=+0.165256934 container start 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:55:55 localhost dnsmasq[312636]: started, version 2.85 cachesize 150 Feb 1 04:55:55 localhost dnsmasq[312636]: DNS service limited to local subnets Feb 1 04:55:55 localhost dnsmasq[312636]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:55 localhost dnsmasq[312636]: warning: no upstream servers configured Feb 1 04:55:55 localhost dnsmasq-dhcp[312636]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:55:55 localhost dnsmasq[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:55 localhost dnsmasq-dhcp[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:55:55 localhost dnsmasq-dhcp[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:55:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:55.414 259320 INFO neutron.agent.dhcp.agent [None req-d692d2e0-f872-4cb5-abd7-ae3f2696a68a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:54Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=36530069-462b-48f1-9c4c-2bd0245cea99, ip_allocation=immediate, mac_address=fa:16:3e:45:e3:34, 
name=tempest-NetworksTestDHCPv6-590160003, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['c0958c23-c4b1-4460-94cf-0bc67d1a6fe9'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:53Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1808, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:54Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:55:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:55.573 259320 INFO neutron.agent.dhcp.agent [None req-4ee8e108-e79f-48c7-9db0-f68e510a8f8e - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:55 localhost dnsmasq[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:55:55 localhost dnsmasq-dhcp[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:55:55 localhost podman[312655]: 2026-02-01 09:55:55.60727586 +0000 UTC m=+0.065517836 container kill 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:55 localhost dnsmasq-dhcp[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:55:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:55.829 259320 INFO neutron.agent.dhcp.agent [None req-5b9d2ef4-5761-4ec5-b878-aca68dadf895 - - - - - -] DHCP configuration for ports {'36530069-462b-48f1-9c4c-2bd0245cea99'} is completed#033[00m Feb 1 04:55:55 localhost nova_compute[274651]: 2026-02-01 09:55:55.829 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:55 localhost kernel: device tapa3b91bd3-29 left promiscuous mode Feb 1 04:55:55 localhost ovn_controller[152492]: 2026-02-01T09:55:55Z|00217|binding|INFO|Releasing lport a3b91bd3-29bd-47a8-bcbf-515c30f3539b from this chassis (sb_readonly=0) Feb 1 04:55:55 localhost ovn_controller[152492]: 2026-02-01T09:55:55Z|00218|binding|INFO|Setting lport a3b91bd3-29bd-47a8-bcbf-515c30f3539b down in Southbound Feb 1 04:55:55 localhost neutron_sriov_agent[252126]: 2026-02-01 09:55:55.831 2 INFO neutron.agent.securitygroups_rpc [None 
req-58b746b2-1860-41ad-b399-3d8dcfe6ba21 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:55:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:55.842 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a3b91bd3-29bd-47a8-bcbf-515c30f3539b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:55.844 158365 INFO neutron.agent.ovn.metadata.agent [-] Port a3b91bd3-29bd-47a8-bcbf-515c30f3539b in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:55:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:55.845 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:55.845 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ed6a26-33d5-4afb-b691-451536f1eb81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:55 localhost nova_compute[274651]: 2026-02-01 09:55:55.851 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:56 localhost dnsmasq[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:55:56 localhost dnsmasq-dhcp[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:55:56 localhost dnsmasq-dhcp[312636]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:55:56 localhost podman[312695]: 2026-02-01 09:55:56.016611159 +0000 UTC m=+0.057828878 container kill 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent [None req-d692d2e0-f872-4cb5-abd7-ae3f2696a68a - - - - - -] Unable to reload_allocations dhcp for cba39058-6a05-4f77-add1-57334b728a66.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa3b91bd3-29 not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66. Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 1 04:55:56 localhost 
neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent return fut.result() Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent raise self._exception Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa3b91bd3-29 not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66. 
Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.034 259320 ERROR neutron.agent.dhcp.agent #033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.290 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.311 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.312 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.313 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.313 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.314 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:55:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:55:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:56.666 259320 INFO neutron.agent.linux.ip_lib [None req-728e3bd3-19c8-446c-b6d4-bdd268a420eb - - - - - -] Device tap28590e08-17 cannot be used as it has no MAC address#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.687 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:56 localhost kernel: device tap28590e08-17 entered promiscuous mode Feb 1 04:55:56 localhost NetworkManager[5964]: [1769939756.6924] manager: (tap28590e08-17): new Generic device (/org/freedesktop/NetworkManager/Devices/40) Feb 1 04:55:56 localhost ovn_controller[152492]: 2026-02-01T09:55:56Z|00219|binding|INFO|Claiming lport 28590e08-17c9-4156-a0ba-f3d9e4f54942 for this chassis. 
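The traceback that closes above is a race, not a code bug: tapa3b91bd3-29 was unplugged moments earlier ("device tapa3b91bd3-29 left promiscuous mode", lport released), so by the time reload_allocations asked the privsep daemon for routes in qdhcp-cba39058-..., the interface was gone and NetworkInterfaceNotFound came back through the tenacity wrapper unchanged. A minimal reproduction of that failure shape, assuming only that the retry predicate does not cover this exception class (TransientNetlinkError below is invented for illustration; NetworkInterfaceNotFound mirrors the name in the log):

    from tenacity import (retry, retry_if_exception_type,
                          stop_after_attempt, wait_fixed)

    class NetworkInterfaceNotFound(RuntimeError):
        pass

    class TransientNetlinkError(RuntimeError):
        pass

    @retry(retry=retry_if_exception_type(TransientNetlinkError),
           stop=stop_after_attempt(3), wait=wait_fixed(0.1), reraise=True)
    def list_ip_routes(namespace, device):
        # The DHCP port was deleted between the reload_allocations trigger
        # and this lookup, so the device is simply absent from the namespace.
        devices_in_ns = {}   # tapa3b91bd3-29 no longer present
        if device not in devices_in_ns:
            raise NetworkInterfaceNotFound(
                f'Network interface {device} not found '
                f'in namespace {namespace}.')
        return devices_in_ns[device]

    try:
        list_ip_routes('qdhcp-cba39058-6a05-4f77-add1-57334b728a66',
                       'tapa3b91bd3-29')
    except NetworkInterfaceNotFound as exc:
        # tenacity re-raises immediately: the exception type is outside
        # its retry predicate, exactly as in the agent traceback above.
        print(f'ERROR neutron.agent.dhcp.agent {exc}')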
Feb 1 04:55:56 localhost ovn_controller[152492]: 2026-02-01T09:55:56Z|00220|binding|INFO|28590e08-17c9-4156-a0ba-f3d9e4f54942: Claiming unknown Feb 1 04:55:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:56.709 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-73d40e5d-eb5c-4a0d-bdff-f74d931fe379', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73d40e5d-eb5c-4a0d-bdff-f74d931fe379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80942481-52b4-4969-ab38-623a6bc77eb5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=28590e08-17c9-4156-a0ba-f3d9e4f54942) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:56.716 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 28590e08-17c9-4156-a0ba-f3d9e4f54942 in datapath 73d40e5d-eb5c-4a0d-bdff-f74d931fe379 bound to our chassis#033[00m Feb 1 04:55:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:56.717 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 73d40e5d-eb5c-4a0d-bdff-f74d931fe379 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:56.720 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[b84f1926-a7f8-4fa7-ab8c-9906dc704d9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:56 localhost ovn_controller[152492]: 2026-02-01T09:55:56Z|00221|binding|INFO|Setting lport 28590e08-17c9-4156-a0ba-f3d9e4f54942 ovn-installed in OVS Feb 1 04:55:56 localhost ovn_controller[152492]: 2026-02-01T09:55:56Z|00222|binding|INFO|Setting lport 28590e08-17c9-4156-a0ba-f3d9e4f54942 up in Southbound Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.738 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.767 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.795 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 
0) Feb 1 04:55:56 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3319678924' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.861 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.929 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:55:56 localhost nova_compute[274651]: 2026-02-01 09:55:56.929 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 04:55:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:57.079 259320 INFO neutron.agent.linux.ip_lib [None req-a74094f2-90c1-47b4-87a0-09a2abae7051 - - - - - -] Device tape8a7bb12-41 cannot be used as it has no MAC address#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.141 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:57 localhost kernel: device tape8a7bb12-41 entered promiscuous mode Feb 1 04:55:57 localhost NetworkManager[5964]: [1769939757.1503] manager: (tape8a7bb12-41): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Feb 1 04:55:57 localhost ovn_controller[152492]: 2026-02-01T09:55:57Z|00223|binding|INFO|Claiming lport e8a7bb12-41d0-493e-8905-bb091d847a27 for this chassis. 
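The resource audit above shells out to "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" (returned 0 in 0.547s) to size the RBD-backed storage. A sketch of issuing the same command and reading cluster-wide availability; the "stats"/"total_avail_bytes" key names are an assumption about ceph df's JSON layout, not confirmed by this log:

    import json, subprocess

    def ceph_free_gib(conf='/etc/ceph/ceph.conf', client='openstack'):
        # Same command the resource tracker logs above; --format=json
        # makes the output machine-readable.
        out = subprocess.run(
            ['ceph', 'df', '--format=json', '--id', client, '--conf', conf],
            check=True, capture_output=True, text=True).stdout
        stats = json.loads(out)['stats']   # assumed key layout
        return stats['total_avail_bytes'] / 1024 ** 3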
Feb 1 04:55:57 localhost ovn_controller[152492]: 2026-02-01T09:55:57Z|00224|binding|INFO|e8a7bb12-41d0-493e-8905-bb091d847a27: Claiming unknown Feb 1 04:55:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:57.169 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aae0906c-8ae9-45a5-9457-947ab24ff57f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e8a7bb12-41d0-493e-8905-bb091d847a27) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:57.171 158365 INFO neutron.agent.ovn.metadata.agent [-] Port e8a7bb12-41d0-493e-8905-bb091d847a27 in datapath 2f9efe12-9ed6-4df5-ad0b-3a29f13baff1 bound to our chassis#033[00m Feb 1 04:55:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:57.173 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2f9efe12-9ed6-4df5-ad0b-3a29f13baff1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:57.174 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[3e7a5b55-0850-4e01-9f68-ea6d49b159e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:57 localhost ovn_controller[152492]: 2026-02-01T09:55:57Z|00225|binding|INFO|Setting lport e8a7bb12-41d0-493e-8905-bb091d847a27 ovn-installed in OVS Feb 1 04:55:57 localhost ovn_controller[152492]: 2026-02-01T09:55:57Z|00226|binding|INFO|Setting lport e8a7bb12-41d0-493e-8905-bb091d847a27 up in Southbound Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.182 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.205 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.207 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11289MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.208 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.208 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.226 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.250 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.481 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.482 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.483 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.557 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.635 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.636 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.670 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 1 04:55:57 localhost podman[312830]: Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.695 274655 DEBUG 
nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 1 04:55:57 localhost podman[312830]: 2026-02-01 09:55:57.697885721 +0000 UTC m=+0.090703171 container create c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:55:57 localhost systemd[1]: Started libpod-conmon-c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf.scope. Feb 1 04:55:57 localhost nova_compute[274651]: 2026-02-01 09:55:57.746 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:55:57 localhost systemd[1]: Started libcrun container. 
Feb 1 04:55:57 localhost podman[312830]: 2026-02-01 09:55:57.652937978 +0000 UTC m=+0.045755458 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e91a7f6b4f1a742de3bcc67c82eeef8a0a1edeaa5481c47c75b4b71995a59e32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:57 localhost podman[312830]: 2026-02-01 09:55:57.78041985 +0000 UTC m=+0.173237300 container init c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 1 04:55:57 localhost dnsmasq[312854]: started, version 2.85 cachesize 150 Feb 1 04:55:57 localhost dnsmasq[312854]: DNS service limited to local subnets Feb 1 04:55:57 localhost podman[312830]: 2026-02-01 09:55:57.790176679 +0000 UTC m=+0.182994129 container start c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:57 localhost dnsmasq[312854]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:57 localhost dnsmasq[312854]: warning: no upstream servers configured Feb 1 04:55:57 localhost dnsmasq-dhcp[312854]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Feb 1 04:55:57 localhost dnsmasq[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/addn_hosts - 0 addresses Feb 1 04:55:57 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/host Feb 1 04:55:57 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/opts Feb 1 04:55:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:57.918 259320 INFO neutron.agent.dhcp.agent [None req-0d48db2a-7c27-4926-afd1-19270254e012 - - - - - -] DHCP configuration for ports {'1b2afc8f-3d84-4a0d-9219-5c83bf4d4ef2'} is completed#033[00m Feb 1 04:55:58 localhost podman[312896]: Feb 1 04:55:58 localhost podman[312896]: 2026-02-01 09:55:58.066422046 +0000 UTC m=+0.083483609 container create f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:55:58 localhost systemd[1]: Started libpod-conmon-f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af.scope. Feb 1 04:55:58 localhost systemd[1]: Started libcrun container. Feb 1 04:55:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8cf761e94bda6ef624d1c28bf6d850c2a4322c80dbee550c06047f4914a305/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:55:58 localhost podman[312896]: 2026-02-01 09:55:58.122656826 +0000 UTC m=+0.139718419 container init f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:55:58 localhost podman[312896]: 2026-02-01 09:55:58.028747448 +0000 UTC m=+0.045809021 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:55:58 localhost podman[312896]: 2026-02-01 09:55:58.132379656 +0000 UTC m=+0.149441249 container start f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:55:58 localhost dnsmasq[312915]: started, version 2.85 cachesize 150 Feb 1 04:55:58 localhost dnsmasq[312915]: DNS service limited to local subnets Feb 1 04:55:58 localhost dnsmasq[312915]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:55:58 localhost dnsmasq[312915]: warning: no upstream servers configured Feb 1 04:55:58 localhost dnsmasq-dhcp[312915]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:55:58 localhost dnsmasq[312915]: read /var/lib/neutron/dhcp/2f9efe12-9ed6-4df5-ad0b-3a29f13baff1/addn_hosts - 0 addresses Feb 1 04:55:58 localhost dnsmasq-dhcp[312915]: read /var/lib/neutron/dhcp/2f9efe12-9ed6-4df5-ad0b-3a29f13baff1/host Feb 1 04:55:58 localhost dnsmasq-dhcp[312915]: read /var/lib/neutron/dhcp/2f9efe12-9ed6-4df5-ad0b-3a29f13baff1/opts Feb 1 04:55:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:55:58 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3775181733' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:55:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:58.195 259320 INFO neutron.agent.dhcp.agent [None req-f47f7f16-2db8-44ac-9a66-9f5941ce981c - - - - - -] Synchronizing state#033[00m Feb 1 04:55:58 localhost nova_compute[274651]: 2026-02-01 09:55:58.205 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:55:58 localhost nova_compute[274651]: 2026-02-01 09:55:58.212 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:55:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:58.244 259320 INFO neutron.agent.dhcp.agent [None req-c9b0018e-922f-476d-b3e0-afa472f55f07 - - - - - -] DHCP configuration for ports {'a5ccdbe3-fbb6-46c0-a664-321ed90db2b9'} is completed#033[00m Feb 1 04:55:58 localhost nova_compute[274651]: 2026-02-01 09:55:58.246 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:55:58 localhost nova_compute[274651]: 2026-02-01 09:55:58.267 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:55:58 localhost nova_compute[274651]: 2026-02-01 09:55:58.267 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:55:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:58.681 259320 INFO neutron.agent.dhcp.agent [None req-9fc753e0-d196-48fe-8834-73acbf586808 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:55:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:58.683 259320 INFO neutron.agent.dhcp.agent [-] Starting network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:55:58 localhost dnsmasq[312636]: exiting on receipt of SIGTERM Feb 1 04:55:58 localhost podman[312935]: 2026-02-01 09:55:58.84555606 +0000 UTC m=+0.050173104 container kill 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:58 localhost systemd[1]: libpod-8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013.scope: Deactivated successfully. Feb 1 04:55:58 localhost podman[312950]: 2026-02-01 09:55:58.900197171 +0000 UTC m=+0.036745772 container died 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:55:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:58 localhost systemd[1]: var-lib-containers-storage-overlay-a3736d80e60e19cc69bfccbfe8202539fbe2084ec925198688c813931732e54a-merged.mount: Deactivated successfully. Feb 1 04:55:58 localhost podman[312950]: 2026-02-01 09:55:58.939648225 +0000 UTC m=+0.076196816 container remove 8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:55:58 localhost systemd[1]: libpod-conmon-8bf0f9eb4a3d4bb3e34b61d73f26ffd1961b4875fa982b7878b4637592114013.scope: Deactivated successfully. 
Feb 1 04:55:59 localhost ovn_controller[152492]: 2026-02-01T09:55:59Z|00227|binding|INFO|Removing iface tape8a7bb12-41 ovn-installed in OVS Feb 1 04:55:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:59.082 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f730dcd9-9d67-4365-9935-03d4f855d591 with type ""#033[00m Feb 1 04:55:59 localhost ovn_controller[152492]: 2026-02-01T09:55:59Z|00228|binding|INFO|Removing lport e8a7bb12-41d0-493e-8905-bb091d847a27 ovn-installed in OVS Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.083 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:59.084 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aae0906c-8ae9-45a5-9457-947ab24ff57f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e8a7bb12-41d0-493e-8905-bb091d847a27) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:55:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:59.086 158365 INFO neutron.agent.ovn.metadata.agent [-] Port e8a7bb12-41d0-493e-8905-bb091d847a27 in datapath 2f9efe12-9ed6-4df5-ad0b-3a29f13baff1 unbound from our chassis#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.087 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:59.088 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2f9efe12-9ed6-4df5-ad0b-3a29f13baff1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:55:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:55:59.089 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c5df81e3-1858-4a7d-b28c-ab0d50a6b83f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.164 259320 INFO neutron.agent.dhcp.agent [None req-ecd09607-2cf7-49ae-900e-fac9bab890df - - - - - -] Finished network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 
2026-02-01 09:55:59.165 259320 INFO neutron.agent.dhcp.agent [None req-9fc753e0-d196-48fe-8834-73acbf586808 - - - - - -] Synchronizing state complete#033[00m Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.167 259320 INFO neutron.agent.dhcp.agent [None req-728e3bd3-19c8-446c-b6d4-bdd268a420eb - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:55Z, description=, device_id=ca4d5fd2-fcc5-4bbf-84e5-6e063f1f23d4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ea6da748-118e-4154-b1eb-283580884453, ip_allocation=immediate, mac_address=fa:16:3e:6e:58:0f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:54Z, description=, dns_domain=, id=73d40e5d-eb5c-4a0d-bdff-f74d931fe379, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1089915203, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61701, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1822, status=ACTIVE, subnets=['410cc8f9-7d9f-42b7-b83f-ee85facca073'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:55Z, vlan_transparent=None, network_id=73d40e5d-eb5c-4a0d-bdff-f74d931fe379, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1834, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:55Z on network 73d40e5d-eb5c-4a0d-bdff-f74d931fe379#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.246 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.247 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.265 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.301 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.319 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Triggering sync for uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 _sync_power_states 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.321 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.322 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.326 259320 INFO neutron.agent.dhcp.agent [None req-7a7a270e-c2e8-4b32-838a-5bab3b918693 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.343 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:55:59 localhost dnsmasq[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/addn_hosts - 1 addresses Feb 1 04:55:59 localhost podman[312998]: 2026-02-01 09:55:59.41484452 +0000 UTC m=+0.108441576 container kill c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:55:59 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/host Feb 1 04:55:59 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/opts Feb 1 04:55:59 localhost dnsmasq[312915]: exiting on receipt of SIGTERM Feb 1 04:55:59 localhost podman[313022]: 2026-02-01 09:55:59.443821122 +0000 UTC m=+0.060055299 container kill f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true) Feb 1 04:55:59 localhost systemd[1]: libpod-f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af.scope: Deactivated successfully.
Feb 1 04:55:59 localhost podman[313039]: 2026-02-01 09:55:59.498707189 +0000 UTC m=+0.042833648 container died f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 04:55:59 localhost podman[313039]: 2026-02-01 09:55:59.529888528 +0000 UTC m=+0.074014937 container cleanup f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:55:59 localhost systemd[1]: libpod-conmon-f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af.scope: Deactivated successfully. Feb 1 04:55:59 localhost ovn_controller[152492]: 2026-02-01T09:55:59Z|00229|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.570 259320 INFO neutron.agent.dhcp.agent [None req-728e3bd3-19c8-446c-b6d4-bdd268a420eb - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:55Z, description=, device_id=ca4d5fd2-fcc5-4bbf-84e5-6e063f1f23d4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ea6da748-118e-4154-b1eb-283580884453, ip_allocation=immediate, mac_address=fa:16:3e:6e:58:0f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:54Z, description=, dns_domain=, id=73d40e5d-eb5c-4a0d-bdff-f74d931fe379, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1089915203, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61701, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1822, status=ACTIVE, subnets=['410cc8f9-7d9f-42b7-b83f-ee85facca073'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:55Z, vlan_transparent=None, network_id=73d40e5d-eb5c-4a0d-bdff-f74d931fe379, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1834, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:55Z on network 73d40e5d-eb5c-4a0d-bdff-f74d931fe379#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.634 
274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:59 localhost podman[313041]: 2026-02-01 09:55:59.644314468 +0000 UTC m=+0.177731267 container remove f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f9efe12-9ed6-4df5-ad0b-3a29f13baff1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.654 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:59 localhost kernel: device tape8a7bb12-41 left promiscuous mode Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.698 259320 INFO neutron.agent.dhcp.agent [None req-0bc067be-f389-4ba2-a826-e778dbe73a21 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.699 259320 INFO neutron.agent.dhcp.agent [None req-0bc067be-f389-4ba2-a826-e778dbe73a21 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:55:59 localhost systemd[1]: var-lib-containers-storage-overlay-3c8cf761e94bda6ef624d1c28bf6d850c2a4322c80dbee550c06047f4914a305-merged.mount: Deactivated successfully. Feb 1 04:55:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1fe88f562871e21d19a37b629387bf78bbbeaf969552f5ddced4decc39aa3af-userdata-shm.mount: Deactivated successfully. Feb 1 04:55:59 localhost systemd[1]: run-netns-qdhcp\x2d2f9efe12\x2d9ed6\x2d4df5\x2dad0b\x2d3a29f13baff1.mount: Deactivated successfully. Feb 1 04:55:59 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. 
Feb 1 04:55:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:55:59.738 259320 INFO neutron.agent.dhcp.agent [None req-425d2f18-7630-496c-9d0a-f282cbe26f88 - - - - - -] DHCP configuration for ports {'ea6da748-118e-4154-b1eb-283580884453'} is completed#033[00m Feb 1 04:55:59 localhost nova_compute[274651]: 2026-02-01 09:55:59.786 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:55:59 localhost dnsmasq[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/addn_hosts - 1 addresses Feb 1 04:55:59 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/host Feb 1 04:55:59 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/opts Feb 1 04:55:59 localhost podman[313095]: 2026-02-01 09:55:59.81573314 +0000 UTC m=+0.065908768 container kill c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 04:56:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:00.137 259320 INFO neutron.agent.dhcp.agent [None req-6303518a-a5d4-4177-b553-6668d9fe6db2 - - - - - -] DHCP configuration for ports {'ea6da748-118e-4154-b1eb-283580884453'} is completed#033[00m Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.296 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 1 04:56:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:00.389 259320 INFO neutron.agent.linux.ip_lib [None req-754c9bc1-5bed-4733-b8c0-330ad7bab41e - - - - - -] Device tap5538cc90-87 cannot be used as it has no MAC address#033[00m Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.413 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:00 localhost kernel: device tap5538cc90-87 entered promiscuous mode Feb 1 04:56:00 localhost NetworkManager[5964]: <info>  [1769939760.4192] manager: (tap5538cc90-87): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Feb 1 04:56:00 localhost ovn_controller[152492]: 2026-02-01T09:56:00Z|00230|binding|INFO|Claiming lport 5538cc90-873e-4e0c-8fe6-990549b40e09 for this chassis.
Feb 1 04:56:00 localhost ovn_controller[152492]: 2026-02-01T09:56:00Z|00231|binding|INFO|5538cc90-873e-4e0c-8fe6-990549b40e09: Claiming unknown Feb 1 04:56:00 localhost systemd-udevd[313126]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.426 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:00 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:00.430 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5538cc90-873e-4e0c-8fe6-990549b40e09) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:56:00 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:00.433 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 5538cc90-873e-4e0c-8fe6-990549b40e09 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:56:00 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:00.434 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:56:00 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:00.437 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[15cdcc07-2d50-4eb7-a752-1fe86f95d472]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:56:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 04:56:00 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4057334485' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost ovn_controller[152492]: 2026-02-01T09:56:00Z|00232|binding|INFO|Setting lport 5538cc90-873e-4e0c-8fe6-990549b40e09 ovn-installed in OVS Feb 1 04:56:00 localhost ovn_controller[152492]: 2026-02-01T09:56:00Z|00233|binding|INFO|Setting lport 5538cc90-873e-4e0c-8fe6-990549b40e09 up in Southbound Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost journal[217584]: ethtool ioctl error on tap5538cc90-87: No such device Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.501 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:00 localhost nova_compute[274651]: 2026-02-01 09:56:00.528 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e129 do_prune osdmap full prune enabled Feb 1 04:56:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e130 e130: 6 total, 6 up, 6 in Feb 1 04:56:01 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in Feb 1 04:56:01 localhost podman[313197]: Feb 1 04:56:01 localhost podman[313197]: 2026-02-01 09:56:01.356287333 +0000 UTC m=+0.076876945 container create 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:56:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:56:01 localhost systemd[1]: Started libpod-conmon-0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998.scope. Feb 1 04:56:01 localhost systemd[1]: tmp-crun.bJGVLl.mount: Deactivated successfully. Feb 1 04:56:01 localhost podman[313197]: 2026-02-01 09:56:01.310883527 +0000 UTC m=+0.031473159 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:56:01 localhost systemd[1]: Started libcrun container. 
Feb 1 04:56:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/087bdde299d5fa12c39b546b47c93a07c9dd9b3cbb9530c477b905b0a0482aeb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:56:01 localhost podman[313211]: 2026-02-01 09:56:01.457940871 +0000 UTC m=+0.062000879 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:56:01 localhost podman[313211]: 2026-02-01 09:56:01.465920925 +0000 UTC m=+0.069980933 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:56:01 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 04:56:01 localhost podman[313197]: 2026-02-01 09:56:01.4849268 +0000 UTC m=+0.205516412 container init 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 1 04:56:01 localhost podman[313197]: 2026-02-01 09:56:01.49403036 +0000 UTC m=+0.214619982 container start 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:56:01 localhost dnsmasq[313238]: started, version 2.85 cachesize 150 Feb 1 04:56:01 localhost dnsmasq[313238]: DNS service limited to local subnets Feb 1 04:56:01 localhost openstack_network_exporter[239441]: ERROR 09:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:56:01 localhost dnsmasq[313238]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:56:01 localhost openstack_network_exporter[239441]: Feb 1 04:56:01 localhost dnsmasq[313238]: warning: no upstream servers configured Feb 1 04:56:01 localhost openstack_network_exporter[239441]: ERROR 09:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:56:01 localhost dnsmasq-dhcp[313238]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:56:01 localhost openstack_network_exporter[239441]: Feb 1 04:56:01 localhost dnsmasq[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:56:01 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:56:01 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:56:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e130 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:56:01 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:01.915 259320 INFO neutron.agent.dhcp.agent [None req-e21c1ddc-7805-4efb-b880-58fdfacd99ec - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:56:02 localhost podman[313257]: 2026-02-01 09:56:02.091652992 +0000 UTC m=+0.057772589 container kill 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:56:02 localhost dnsmasq[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:56:02 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:56:02 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:56:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:02.244 259320 INFO neutron.agent.dhcp.agent [None req-df8f7687-9274-45ba-98d5-40e13d03dcdd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:00Z, description=, device_id=c9391d73-20a6-4fca-b34c-059d8996f8be, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cd103450-57e4-4a6b-a92d-98e21be46fb6, ip_allocation=immediate, mac_address=fa:16:3e:f6:a3:4c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['a7cd3b9a-5a95-4d91-8839-cda29357b159'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:59Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=False, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1866, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:00Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:56:02 localhost nova_compute[274651]: 2026-02-01 09:56:02.253 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e130 do_prune osdmap full prune enabled Feb 1 04:56:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e131 e131: 6 total, 6 up, 6 in Feb 1 04:56:02 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in Feb 1 04:56:02 localhost systemd[1]: tmp-crun.QDeCja.mount: Deactivated successfully. Feb 1 04:56:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:02.414 259320 INFO neutron.agent.dhcp.agent [None req-60ebba9b-ba86-4fa4-bd2e-418fbd1689e3 - - - - - -] DHCP configuration for ports {'5538cc90-873e-4e0c-8fe6-990549b40e09', 'cd103450-57e4-4a6b-a92d-98e21be46fb6', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:56:02 localhost systemd[1]: tmp-crun.b7Q3Nc.mount: Deactivated successfully. 
Feb 1 04:56:02 localhost dnsmasq[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:56:02 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:02 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:02 localhost podman[313295]: 2026-02-01 09:56:02.447780735 +0000 UTC m=+0.071489500 container kill 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:02.581 259320 INFO neutron.agent.dhcp.agent [None req-df8f7687-9274-45ba-98d5-40e13d03dcdd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:00Z, description=, device_id=c9391d73-20a6-4fca-b34c-059d8996f8be, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cd103450-57e4-4a6b-a92d-98e21be46fb6, ip_allocation=immediate, mac_address=fa:16:3e:f6:a3:4c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['a7cd3b9a-5a95-4d91-8839-cda29357b159'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:55:59Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=False, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1866, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:00Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m
Feb 1 04:56:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:02.714 259320 INFO neutron.agent.dhcp.agent [None req-91ff9a71-ff9a-48c8-a4bf-a5df4b9739c1 - - - - - -] DHCP configuration for ports {'cd103450-57e4-4a6b-a92d-98e21be46fb6'} is completed#033[00m
Feb 1 04:56:02 localhost podman[313332]: 2026-02-01 09:56:02.762768473 +0000 UTC m=+0.047445131 container kill 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:56:02 localhost dnsmasq[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:56:02 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:02 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:03.017 259320 INFO neutron.agent.dhcp.agent [None req-eb7b89e7-9dec-4d92-bf91-bec8daa94d99 - - - - - -] DHCP configuration for ports {'cd103450-57e4-4a6b-a92d-98e21be46fb6'} is completed#033[00m
Feb 1 04:56:03 localhost dnsmasq[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:03 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:03 localhost podman[313371]: 2026-02-01 09:56:03.207946275 +0000 UTC m=+0.048700828 container kill 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:03 localhost dnsmasq-dhcp[313238]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e131 do_prune osdmap full prune enabled
Feb 1 04:56:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e132 e132: 6 total, 6 up, 6 in
Feb 1 04:56:03 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e132: 6 total, 6 up, 6 in
Feb 1 04:56:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e132 do_prune osdmap full prune enabled
Feb 1 04:56:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e133 e133: 6 total, 6 up, 6 in
Feb 1 04:56:04 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in
Feb 1 04:56:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:56:04 localhost podman[313411]: 2026-02-01 09:56:04.681490307 +0000 UTC m=+0.038785494 container kill 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 1 04:56:04 localhost dnsmasq[313238]: exiting on receipt of SIGTERM
Feb 1 04:56:04 localhost systemd[1]: libpod-0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998.scope: Deactivated successfully.
Feb 1 04:56:04 localhost podman[313441]: 2026-02-01 09:56:04.728689199 +0000 UTC m=+0.032055586 container died 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:56:04 localhost nova_compute[274651]: 2026-02-01 09:56:04.822 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:04 localhost systemd[1]: var-lib-containers-storage-overlay-087bdde299d5fa12c39b546b47c93a07c9dd9b3cbb9530c477b905b0a0482aeb-merged.mount: Deactivated successfully.
Feb 1 04:56:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:04 localhost podman[313441]: 2026-02-01 09:56:04.870502121 +0000 UTC m=+0.173868508 container remove 0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:04 localhost systemd[1]: libpod-conmon-0ea40e1e80e23607a1d54d9e7ac2e69010b5b8c89bc075b807e1677e1d454998.scope: Deactivated successfully.
Feb 1 04:56:04 localhost kernel: device tap5538cc90-87 left promiscuous mode
Feb 1 04:56:04 localhost nova_compute[274651]: 2026-02-01 09:56:04.881 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:04 localhost podman[313408]: 2026-02-01 09:56:04.831167312 +0000 UTC m=+0.189278624 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 1 04:56:04 localhost ovn_controller[152492]: 2026-02-01T09:56:04Z|00234|binding|INFO|Releasing lport 5538cc90-873e-4e0c-8fe6-990549b40e09 from this chassis (sb_readonly=0)
Feb 1 04:56:04 localhost ovn_controller[152492]: 2026-02-01T09:56:04Z|00235|binding|INFO|Setting lport 5538cc90-873e-4e0c-8fe6-990549b40e09 down in Southbound
Feb 1 04:56:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:04.892 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5538cc90-873e-4e0c-8fe6-990549b40e09) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:56:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:04.893 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 5538cc90-873e-4e0c-8fe6-990549b40e09 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m
Feb 1 04:56:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:04.894 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:56:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:04.895 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[869b48e6-8607-4c50-a7fc-f8211127ccf0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:56:04 localhost nova_compute[274651]: 2026-02-01 09:56:04.899 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:04 localhost podman[313408]: 2026-02-01 09:56:04.911971407 +0000 UTC m=+0.270082689 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 1 04:56:04 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:56:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:05.132 259320 INFO neutron.agent.dhcp.agent [None req-0047bd59-04e7-4e8d-bd8f-870b6d3ededc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:56:05 localhost ovn_controller[152492]: 2026-02-01T09:56:05Z|00236|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:56:05 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:05.232 2 INFO neutron.agent.securitygroups_rpc [None req-33d595e3-b3a6-4bc9-b70b-e120045130a2 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']#033[00m
Feb 1 04:56:05 localhost nova_compute[274651]: 2026-02-01 09:56:05.246 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:05 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully.
Feb 1 04:56:05 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:05.890 2 INFO neutron.agent.securitygroups_rpc [None req-4789720c-93ff-4d5d-a9e8-dc630a3e4cba 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']#033[00m
Feb 1 04:56:05 localhost nova_compute[274651]: 2026-02-01 09:56:05.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:06.753 259320 INFO neutron.agent.linux.ip_lib [None req-7b502039-07f1-4239-abc8-3a68f70c93d6 - - - - - -] Device tapc9742047-e0 cannot be used as it has no MAC address#033[00m
Feb 1 04:56:06 localhost nova_compute[274651]: 2026-02-01 09:56:06.769 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:06 localhost kernel: device tapc9742047-e0 entered promiscuous mode
Feb 1 04:56:06 localhost NetworkManager[5964]: [1769939766.7759] manager: (tapc9742047-e0): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Feb 1 04:56:06 localhost nova_compute[274651]: 2026-02-01 09:56:06.776 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:06 localhost ovn_controller[152492]: 2026-02-01T09:56:06Z|00237|binding|INFO|Claiming lport c9742047-e02a-4c7a-87fb-f151413b53bc for this chassis.
Feb 1 04:56:06 localhost ovn_controller[152492]: 2026-02-01T09:56:06Z|00238|binding|INFO|c9742047-e02a-4c7a-87fb-f151413b53bc: Claiming unknown
Feb 1 04:56:06 localhost systemd-udevd[313480]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:56:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:06.792 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c9742047-e02a-4c7a-87fb-f151413b53bc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:56:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:06.794 158365 INFO neutron.agent.ovn.metadata.agent [-] Port c9742047-e02a-4c7a-87fb-f151413b53bc in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m
Feb 1 04:56:06 localhost nova_compute[274651]: 2026-02-01 09:56:06.796 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:06.796 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:56:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:06.797 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfb9220-95ce-4376-bcae-56db5953c3cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:56:06 localhost ovn_controller[152492]: 2026-02-01T09:56:06Z|00239|binding|INFO|Setting lport c9742047-e02a-4c7a-87fb-f151413b53bc ovn-installed in OVS
Feb 1 04:56:06 localhost ovn_controller[152492]: 2026-02-01T09:56:06Z|00240|binding|INFO|Setting lport c9742047-e02a-4c7a-87fb-f151413b53bc up in Southbound
Feb 1 04:56:06 localhost nova_compute[274651]: 2026-02-01 09:56:06.816 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:06 localhost nova_compute[274651]: 2026-02-01 09:56:06.848 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:06 localhost nova_compute[274651]: 2026-02-01 09:56:06.876 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:56:07 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3348501364' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:56:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:56:07 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3348501364' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:56:07 localhost podman[313535]:
Feb 1 04:56:07 localhost podman[313535]: 2026-02-01 09:56:07.580503373 +0000 UTC m=+0.078316510 container create 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Feb 1 04:56:07 localhost systemd[1]: Started libpod-conmon-87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788.scope.
Feb 1 04:56:07 localhost podman[313535]: 2026-02-01 09:56:07.539830193 +0000 UTC m=+0.037643360 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:07 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb8d8184e255788a2327a2b1645fdeedd54a65c4d48b16c2ee492acd7f238425/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:07 localhost podman[313535]: 2026-02-01 09:56:07.655769008 +0000 UTC m=+0.153582145 container init 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:07 localhost podman[313535]: 2026-02-01 09:56:07.666979393 +0000 UTC m=+0.164792540 container start 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:07 localhost dnsmasq[313553]: started, version 2.85 cachesize 150
Feb 1 04:56:07 localhost dnsmasq[313553]: DNS service limited to local subnets
Feb 1 04:56:07 localhost dnsmasq[313553]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:07 localhost dnsmasq[313553]: warning: no upstream servers configured
Feb 1 04:56:07 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:07.728 259320 INFO neutron.agent.dhcp.agent [None req-7b502039-07f1-4239-abc8-3a68f70c93d6 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:06Z, description=, device_id=bf471697-b21b-4874-a2b6-0de9eeb6bdf0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f3f3571d-f825-418d-8e94-ae8aa78fde15, ip_allocation=immediate, mac_address=fa:16:3e:fa:de:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['4d1ffd12-2c63-420b-b0ae-d49a096ad8a2'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:05Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=False, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1898, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:06Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m
Feb 1 04:56:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:07.876 259320 INFO neutron.agent.dhcp.agent [None req-9686c7cf-b724-4953-8e92-93b83bac9daa - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m
Feb 1 04:56:07 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:56:07 localhost podman[313572]: 2026-02-01 09:56:07.936023808 +0000 UTC m=+0.067143366 container kill 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:56:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:56:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:08.390 259320 INFO neutron.agent.dhcp.agent [None req-26d2546f-903b-4cf5-8288-28dceb377ed5 - - - - - -] DHCP configuration for ports {'f3f3571d-f825-418d-8e94-ae8aa78fde15'} is completed#033[00m
Feb 1 04:56:08 localhost podman[313592]: 2026-02-01 09:56:08.477217223 +0000 UTC m=+0.085846880 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:56:08 localhost podman[313592]: 2026-02-01 09:56:08.482284459 +0000 UTC m=+0.090914046 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 04:56:08 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:56:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:08.523 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:06Z, description=, device_id=bf471697-b21b-4874-a2b6-0de9eeb6bdf0, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f3f3571d-f825-418d-8e94-ae8aa78fde15, ip_allocation=immediate, mac_address=fa:16:3e:fa:de:77, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['4d1ffd12-2c63-420b-b0ae-d49a096ad8a2'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:05Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=False, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1898, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:06Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m
Feb 1 04:56:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:08.655 259320 INFO neutron.agent.linux.ip_lib [None req-b56f74d0-3ace-4491-8ec6-0390d1c1d41d - - - - - -] Device tap4223e357-90 cannot be used as it has no MAC address#033[00m
Feb 1 04:56:08 localhost nova_compute[274651]: 2026-02-01 09:56:08.686 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:08 localhost kernel: device tap4223e357-90 entered promiscuous mode
Feb 1 04:56:08 localhost NetworkManager[5964]: [1769939768.6952] manager: (tap4223e357-90): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Feb 1 04:56:08 localhost systemd[1]: tmp-crun.ygtiXP.mount: Deactivated successfully.
Feb 1 04:56:08 localhost podman[313636]: 2026-02-01 09:56:08.701048038 +0000 UTC m=+0.054595640 container kill 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 1 04:56:08 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:56:08 localhost nova_compute[274651]: 2026-02-01 09:56:08.701 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:08 localhost ovn_controller[152492]: 2026-02-01T09:56:08Z|00241|binding|INFO|Claiming lport 4223e357-90fd-4ec7-96e8-b798510a78c7 for this chassis.
Feb 1 04:56:08 localhost ovn_controller[152492]: 2026-02-01T09:56:08Z|00242|binding|INFO|4223e357-90fd-4ec7-96e8-b798510a78c7: Claiming unknown
Feb 1 04:56:08 localhost ovn_controller[152492]: 2026-02-01T09:56:08Z|00243|binding|INFO|Setting lport 4223e357-90fd-4ec7-96e8-b798510a78c7 ovn-installed in OVS
Feb 1 04:56:08 localhost nova_compute[274651]: 2026-02-01 09:56:08.739 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:08 localhost ovn_controller[152492]: 2026-02-01T09:56:08Z|00244|binding|INFO|Setting lport 4223e357-90fd-4ec7-96e8-b798510a78c7 up in Southbound
Feb 1 04:56:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:08.758 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-0db0f199-70cd-458e-a4c8-80a105dc0346', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0db0f199-70cd-458e-a4c8-80a105dc0346', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f9a6b351-759b-46ec-8c55-6a201e89959d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4223e357-90fd-4ec7-96e8-b798510a78c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:56:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:08.761 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 4223e357-90fd-4ec7-96e8-b798510a78c7 in datapath 0db0f199-70cd-458e-a4c8-80a105dc0346 bound to our chassis#033[00m
Feb 1 04:56:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:08.763 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0db0f199-70cd-458e-a4c8-80a105dc0346 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:56:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:08.765 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a0467b-6b67-4358-8504-c77b742618a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:56:08 localhost nova_compute[274651]: 2026-02-01 09:56:08.784 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:08 localhost nova_compute[274651]: 2026-02-01 09:56:08.815 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:09.480 259320 INFO neutron.agent.dhcp.agent [None req-b7bfb02f-8774-46be-857b-c73e08f3f088 - - - - - -] DHCP configuration for ports {'f3f3571d-f825-418d-8e94-ae8aa78fde15'} is completed#033[00m
Feb 1 04:56:09 localhost podman[313713]:
Feb 1 04:56:09 localhost podman[313713]: 2026-02-01 09:56:09.590172315 +0000 UTC m=+0.082620292 container create ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:56:09 localhost systemd[1]: Started libpod-conmon-ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8.scope.
Feb 1 04:56:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e133 do_prune osdmap full prune enabled
Feb 1 04:56:09 localhost podman[313713]: 2026-02-01 09:56:09.539711033 +0000 UTC m=+0.032159040 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e134 e134: 6 total, 6 up, 6 in
Feb 1 04:56:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in
Feb 1 04:56:09 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b60a3af8b60d3caa5566265bf37651aeaedb27ab811906ee57bbf70efa68a03d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:09 localhost podman[313713]: 2026-02-01 09:56:09.666269916 +0000 UTC m=+0.158717863 container init ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 1 04:56:09 localhost systemd[1]: tmp-crun.AQRzXv.mount: Deactivated successfully.
Feb 1 04:56:09 localhost podman[313713]: 2026-02-01 09:56:09.676948374 +0000 UTC m=+0.169396321 container start ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:56:09 localhost dnsmasq[313742]: started, version 2.85 cachesize 150
Feb 1 04:56:09 localhost dnsmasq[313742]: DNS service limited to local subnets
Feb 1 04:56:09 localhost dnsmasq[313742]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:09 localhost dnsmasq[313742]: warning: no upstream servers configured
Feb 1 04:56:09 localhost dnsmasq-dhcp[313742]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:56:09 localhost dnsmasq[313742]: read /var/lib/neutron/dhcp/0db0f199-70cd-458e-a4c8-80a105dc0346/addn_hosts - 0 addresses
Feb 1 04:56:09 localhost dnsmasq-dhcp[313742]: read /var/lib/neutron/dhcp/0db0f199-70cd-458e-a4c8-80a105dc0346/host
Feb 1 04:56:09 localhost dnsmasq-dhcp[313742]: read /var/lib/neutron/dhcp/0db0f199-70cd-458e-a4c8-80a105dc0346/opts
Feb 1 04:56:09 localhost podman[313727]: 2026-02-01 09:56:09.733582766 +0000 UTC m=+0.098183471 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:56:09 localhost nova_compute[274651]: 2026-02-01 09:56:09.793 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:09 localhost podman[313727]: 2026-02-01 09:56:09.809345576 +0000 UTC m=+0.173946261 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:56:09 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:56:09 localhost nova_compute[274651]: 2026-02-01 09:56:09.826 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:09.920 259320 INFO neutron.agent.dhcp.agent [None req-892415f3-ecbd-49e1-9f25-e18f7b40680c - - - - - -] DHCP configuration for ports {'78a3bc5a-64f4-4495-8e55-b06b7ed1ce02'} is completed#033[00m
Feb 1 04:56:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:56:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3348857733' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:56:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:56:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3348857733' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:56:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e134 do_prune osdmap full prune enabled
Feb 1 04:56:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e135 e135: 6 total, 6 up, 6 in
Feb 1 04:56:11 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in
Feb 1 04:56:11 localhost dnsmasq[313553]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:11 localhost podman[313774]: 2026-02-01 09:56:11.713841674 +0000 UTC m=+0.067242260 container kill 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:56:11 localhost ovn_controller[152492]: 2026-02-01T09:56:11Z|00245|binding|INFO|Releasing lport c9742047-e02a-4c7a-87fb-f151413b53bc from this chassis (sb_readonly=0)
Feb 1 04:56:11 localhost kernel: device tapc9742047-e0 left promiscuous mode
Feb 1 04:56:11 localhost ovn_controller[152492]: 2026-02-01T09:56:11Z|00246|binding|INFO|Setting lport c9742047-e02a-4c7a-87fb-f151413b53bc down in Southbound
Feb 1 04:56:11 localhost nova_compute[274651]: 2026-02-01 09:56:11.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:11 localhost nova_compute[274651]: 2026-02-01 09:56:11.993 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.029 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c9742047-e02a-4c7a-87fb-f151413b53bc) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.030 158365 INFO neutron.agent.ovn.metadata.agent [-] Port c9742047-e02a-4c7a-87fb-f151413b53bc in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m
Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.031 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cba39058-6a05-4f77-add1-57334b728a66 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.032 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[06073194-7fdf-4f1e-aa8f-566738aa8b7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:56:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:56:12 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/878563889' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:56:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:56:12 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/878563889' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:56:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:12.542 259320 INFO neutron.agent.linux.ip_lib [None req-bc977bb5-cc74-4d4d-bc85-b09b0e375d5b - - - - - -] Device tap261614d7-19 cannot be used as it has no MAC address#033[00m
Feb 1 04:56:12 localhost nova_compute[274651]: 2026-02-01 09:56:12.597 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:12 localhost kernel: device tap261614d7-19 entered promiscuous mode
Feb 1 04:56:12 localhost NetworkManager[5964]: [1769939772.6059] manager: (tap261614d7-19): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Feb 1 04:56:12 localhost ovn_controller[152492]: 2026-02-01T09:56:12Z|00247|binding|INFO|Claiming lport 261614d7-1958-4fae-988b-7647205e6bde for this chassis.
Feb 1 04:56:12 localhost ovn_controller[152492]: 2026-02-01T09:56:12Z|00248|binding|INFO|261614d7-1958-4fae-988b-7647205e6bde: Claiming unknown
Feb 1 04:56:12 localhost nova_compute[274651]: 2026-02-01 09:56:12.607 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:56:12 localhost systemd-udevd[313807]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.621 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cb4a956b-bc5f-4fc2-a846-6bda9e647532', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb4a956b-bc5f-4fc2-a846-6bda9e647532', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63ec16b-f9b0-4104-911e-2b117b15fe5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=261614d7-1958-4fae-988b-7647205e6bde) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.622 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 261614d7-1958-4fae-988b-7647205e6bde in datapath cb4a956b-bc5f-4fc2-a846-6bda9e647532 bound to our chassis#033[00m Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.625 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cb4a956b-bc5f-4fc2-a846-6bda9e647532 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:56:12 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:12.627 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[88847727-ea15-471c-ba66-99cb01aed4b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:56:12 localhost ovn_controller[152492]: 2026-02-01T09:56:12Z|00249|binding|INFO|Setting lport 261614d7-1958-4fae-988b-7647205e6bde ovn-installed in OVS Feb 1 04:56:12 localhost ovn_controller[152492]: 2026-02-01T09:56:12Z|00250|binding|INFO|Setting lport 261614d7-1958-4fae-988b-7647205e6bde up in Southbound Feb 1 04:56:12 localhost nova_compute[274651]: 2026-02-01 09:56:12.654 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:12 localhost nova_compute[274651]: 2026-02-01 09:56:12.693 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:12 localhost nova_compute[274651]: 2026-02-01 09:56:12.724 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:13 localhost ovn_controller[152492]: 2026-02-01T09:56:13Z|00251|binding|INFO|Removing iface tap261614d7-19 ovn-installed in OVS Feb 1 04:56:13 localhost ovn_controller[152492]: 2026-02-01T09:56:13Z|00252|binding|INFO|Removing lport 261614d7-1958-4fae-988b-7647205e6bde 
ovn-installed in OVS Feb 1 04:56:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:13.030 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 769b390b-7ee8-4c1f-ac2b-e88741364b49 with type ""#033[00m Feb 1 04:56:13 localhost nova_compute[274651]: 2026-02-01 09:56:13.031 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:13.032 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cb4a956b-bc5f-4fc2-a846-6bda9e647532', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb4a956b-bc5f-4fc2-a846-6bda9e647532', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b63ec16b-f9b0-4104-911e-2b117b15fe5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=261614d7-1958-4fae-988b-7647205e6bde) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:56:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:13.034 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 261614d7-1958-4fae-988b-7647205e6bde in datapath cb4a956b-bc5f-4fc2-a846-6bda9e647532 unbound from our chassis#033[00m Feb 1 04:56:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:13.035 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cb4a956b-bc5f-4fc2-a846-6bda9e647532 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:56:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:13.035 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ffb65d74-94f0-42e8-9139-16f0df603787]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:56:13 localhost nova_compute[274651]: 2026-02-01 09:56:13.036 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:13 localhost podman[313860]: Feb 1 04:56:13 localhost podman[313860]: 2026-02-01 09:56:13.498903047 +0000 UTC m=+0.068349854 container create 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:56:13 localhost systemd[1]: Started libpod-conmon-299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55.scope. Feb 1 04:56:13 localhost systemd[1]: Started libcrun container. Feb 1 04:56:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13dd5e6e486b37c689fe18318da0d5f981b5c2f310e91b7f177055b67b9d0aa3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:56:13 localhost podman[313860]: 2026-02-01 09:56:13.46030457 +0000 UTC m=+0.029751437 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:56:13 localhost podman[313860]: 2026-02-01 09:56:13.563146133 +0000 UTC m=+0.132592940 container init 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:56:13 localhost podman[313860]: 2026-02-01 09:56:13.574314596 +0000 UTC m=+0.143761373 container start 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 1 04:56:13 localhost dnsmasq[313889]: started, version 2.85 cachesize 150 Feb 1 04:56:13 localhost dnsmasq[313889]: DNS service limited to local subnets Feb 1 04:56:13 localhost dnsmasq[313889]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:56:13 localhost dnsmasq[313889]: warning: no upstream servers configured Feb 1 04:56:13 localhost dnsmasq-dhcp[313889]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:56:13 localhost dnsmasq[313889]: read /var/lib/neutron/dhcp/cb4a956b-bc5f-4fc2-a846-6bda9e647532/addn_hosts - 0 addresses Feb 1 04:56:13 localhost dnsmasq-dhcp[313889]: read /var/lib/neutron/dhcp/cb4a956b-bc5f-4fc2-a846-6bda9e647532/host Feb 1 04:56:13 localhost dnsmasq-dhcp[313889]: read /var/lib/neutron/dhcp/cb4a956b-bc5f-4fc2-a846-6bda9e647532/opts Feb 1 04:56:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:13.679 259320 INFO neutron.agent.dhcp.agent [None req-a997fba1-7bea-4ada-bd14-fdf69c3d839e - - - - - -] DHCP configuration for ports {'4fe03981-e1a2-466c-9dde-ba2fd2714648'} is completed#033[00m Feb 1 04:56:13 localhost nova_compute[274651]: 2026-02-01 09:56:13.690 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:13 localhost kernel: device tap261614d7-19 left promiscuous mode 
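[annotation] The tap names in these records (tap261614d7-19 for lport 261614d7-1958-4fae-988b-7647205e6bde, and later tapfcb21783-99) follow Neutron's convention of a "tap" prefix plus the leading characters of the port UUID, truncated to 14 characters so the name fits the kernel's interface-name limit. A small sketch of that derivation, assumed to mirror neutron's tap-device naming helper rather than copied from it:

    # Python sketch: how a tap device name is derived from a port UUID.
    # Assumption: the helper name is illustrative; the convention
    # (prefix + truncation to 14 chars) is what the log shows.
    def tap_device_name(port_id, prefix='tap', max_len=14):
        # 'tap' + first 11 UUID characters = 14 chars, within IFNAMSIZ.
        return (prefix + port_id)[:max_len]

    assert tap_device_name('261614d7-1958-4fae-988b-7647205e6bde') == 'tap261614d7-19'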
Feb 1 04:56:13 localhost dnsmasq[313553]: exiting on receipt of SIGTERM Feb 1 04:56:13 localhost podman[313896]: 2026-02-01 09:56:13.693205753 +0000 UTC m=+0.079466535 container kill 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:56:13 localhost systemd[1]: libpod-87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788.scope: Deactivated successfully. Feb 1 04:56:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e135 do_prune osdmap full prune enabled Feb 1 04:56:13 localhost nova_compute[274651]: 2026-02-01 09:56:13.705 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:13 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e136 e136: 6 total, 6 up, 6 in Feb 1 04:56:13 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in Feb 1 04:56:13 localhost podman[313912]: 2026-02-01 09:56:13.779696543 +0000 UTC m=+0.071188780 container died 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:56:13 localhost podman[313912]: 2026-02-01 09:56:13.811830942 +0000 UTC m=+0.103323159 container cleanup 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 1 04:56:13 localhost systemd[1]: libpod-conmon-87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788.scope: Deactivated successfully. 
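[annotation] The ceph-mon records in this capture (the audited mon_command({"prefix":"df", ...}) and {"prefix":"osd pool get-quota", ...} from client.openstack earlier, and the osdmap epoch bumps here) are ordinary monitor commands from an external librados client polling pool usage. A hedged sketch of the same "df" call through the python-rados binding; the conffile path and keyring availability are assumptions for illustration:

    # Python sketch: issue the `df` mon command seen in the audit log.
    # Assumption: /etc/ceph/ceph.conf and a client.openstack keyring exist.
    import json
    import rados

    cluster = rados.Rados(conffile='/etc/ceph/ceph.conf',
                          name='client.openstack')
    cluster.connect()
    try:
        cmd = json.dumps({"prefix": "df", "format": "json"})
        ret, outbuf, outs = cluster.mon_command(cmd, b'')
        if ret == 0:
            # `ceph df -f json` returns cluster-wide stats plus per-pool
            # entries, which feed the get-quota follow-up calls above.
            print(json.loads(outbuf)["stats"])
    finally:
        cluster.shutdown()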
Feb 1 04:56:13 localhost podman[313913]: 2026-02-01 09:56:13.852865863 +0000 UTC m=+0.135626792 container remove 87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:56:13 localhost ovn_controller[152492]: 2026-02-01T09:56:13Z|00253|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:56:13 localhost nova_compute[274651]: 2026-02-01 09:56:13.985 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:14 localhost dnsmasq[313889]: read /var/lib/neutron/dhcp/cb4a956b-bc5f-4fc2-a846-6bda9e647532/addn_hosts - 0 addresses Feb 1 04:56:14 localhost dnsmasq-dhcp[313889]: read /var/lib/neutron/dhcp/cb4a956b-bc5f-4fc2-a846-6bda9e647532/host Feb 1 04:56:14 localhost dnsmasq-dhcp[313889]: read /var/lib/neutron/dhcp/cb4a956b-bc5f-4fc2-a846-6bda9e647532/opts Feb 1 04:56:14 localhost podman[313956]: 2026-02-01 09:56:14.054346331 +0000 UTC m=+0.048839673 container kill 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for cb4a956b-bc5f-4fc2-a846-6bda9e647532.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap261614d7-19 not found in namespace qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532. 
Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR 
neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent return fut.result() Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent raise self._exception Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap261614d7-19 not found in namespace qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532. 
Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.076 259320 ERROR neutron.agent.dhcp.agent #033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.136 259320 INFO neutron.agent.dhcp.agent [None req-9fc753e0-d196-48fe-8834-73acbf586808 - - - - - -] Synchronizing state#033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.329 259320 INFO neutron.agent.dhcp.agent [None req-2d31bef9-341d-47bf-89b7-34b3f9c84401 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.330 259320 INFO neutron.agent.dhcp.agent [-] Starting network cb4a956b-bc5f-4fc2-a846-6bda9e647532 dhcp configuration#033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.333 259320 INFO neutron.agent.dhcp.agent [-] Starting network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.334 259320 INFO neutron.agent.dhcp.agent [-] Finished network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:56:14 localhost systemd[1]: tmp-crun.1hHmAj.mount: Deactivated successfully. Feb 1 04:56:14 localhost systemd[1]: var-lib-containers-storage-overlay-bb8d8184e255788a2327a2b1645fdeedd54a65c4d48b16c2ee492acd7f238425-merged.mount: Deactivated successfully. Feb 1 04:56:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87bae939bb78f46c284cacd8189fcd9c91baf4617b2e9da1e94b3ceb00db3788-userdata-shm.mount: Deactivated successfully. Feb 1 04:56:14 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. Feb 1 04:56:14 localhost dnsmasq[313889]: exiting on receipt of SIGTERM Feb 1 04:56:14 localhost podman[313986]: 2026-02-01 09:56:14.511960655 +0000 UTC m=+0.063165793 container kill 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:56:14 localhost systemd[1]: libpod-299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55.scope: Deactivated successfully. 
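[annotation] The reload_allocations failure above is a race: the DHCP port was unbound and its tap device removed moments earlier (see "device tap261614d7-19 left promiscuous mode" and the dnsmasq SIGTERM), so by the time the agent re-reads routes the interface is gone from the qdhcp- namespace. A sketch of the existence check that avoids acting on a vanished device; ip_lib.device_exists and IPDevice are real neutron helpers, the wrapper function is illustrative:

    # Python sketch: guard route lookups against a deleted tap device.
    # Assumption: the wrapper name is illustrative; ip_lib calls are real.
    from neutron.agent.linux import ip_lib

    def get_gateway_if_present(device_name, namespace, ip_version=4):
        if not ip_lib.device_exists(device_name, namespace=namespace):
            # e.g. tap261614d7-19 already removed from the qdhcp namespace
            return None
        device = ip_lib.IPDevice(device_name, namespace=namespace)
        return device.route.get_gateway(ip_version=ip_version)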
Feb 1 04:56:14 localhost podman[314001]: 2026-02-01 09:56:14.588103178 +0000 UTC m=+0.057489890 container died 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:56:14 localhost podman[314001]: 2026-02-01 09:56:14.614540221 +0000 UTC m=+0.083926903 container cleanup 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:56:14 localhost systemd[1]: libpod-conmon-299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55.scope: Deactivated successfully. Feb 1 04:56:14 localhost podman[314002]: 2026-02-01 09:56:14.658818433 +0000 UTC m=+0.124730668 container remove 299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.724 259320 INFO neutron.agent.dhcp.agent [None req-2a323d66-9fbc-4642-a910-ad99bb33a313 - - - - - -] Finished network cb4a956b-bc5f-4fc2-a846-6bda9e647532 dhcp configuration#033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.725 259320 INFO neutron.agent.dhcp.agent [None req-2d31bef9-341d-47bf-89b7-34b3f9c84401 - - - - - -] Synchronizing state complete#033[00m Feb 1 04:56:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:14.726 259320 INFO neutron.agent.dhcp.agent [None req-987a585f-7ec9-411a-b695-b25b592b6870 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:56:14 localhost nova_compute[274651]: 2026-02-01 09:56:14.868 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:15 localhost systemd[1]: var-lib-containers-storage-overlay-13dd5e6e486b37c689fe18318da0d5f981b5c2f310e91b7f177055b67b9d0aa3-merged.mount: Deactivated successfully. Feb 1 04:56:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-299d94e787ae673374122b294522ec2a233f53712b0cff568da099cc87533f55-userdata-shm.mount: Deactivated successfully. 
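[annotation] The tenacity frames in the traceback above (wrapped_f -> iter -> fut.result()) show the privileged list_ip_routes call being retried and the final NetworkInterfaceNotFound re-raised to the caller once retries are exhausted. A minimal reproduction of that retry pattern; the policy values are assumptions, the re-raise behaviour is the point:

    # Python sketch: tenacity re-raising the last exception, matching the
    # wrapped_f/iter/fut.result() frames in the traceback above.
    import tenacity

    class NetworkInterfaceNotFound(RuntimeError):
        pass

    @tenacity.retry(
        retry=tenacity.retry_if_exception_type(NetworkInterfaceNotFound),
        wait=tenacity.wait_fixed(0.1),
        stop=tenacity.stop_after_attempt(3),
        reraise=True,  # surface the original exception, not RetryError
    )
    def list_ip_routes(namespace):
        raise NetworkInterfaceNotFound('interface gone from %s' % namespace)

    # list_ip_routes('qdhcp-cb4a956b-bc5f-4fc2-a846-6bda9e647532') raises
    # NetworkInterfaceNotFound after three attempts, as the agent logs it.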
Feb 1 04:56:15 localhost systemd[1]: run-netns-qdhcp\x2dcb4a956b\x2dbc5f\x2d4fc2\x2da846\x2d6bda9e647532.mount: Deactivated successfully. Feb 1 04:56:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e136 do_prune osdmap full prune enabled Feb 1 04:56:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e137 e137: 6 total, 6 up, 6 in Feb 1 04:56:15 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in Feb 1 04:56:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:56:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:16.216 259320 INFO neutron.agent.linux.ip_lib [None req-cc37b1d0-7b1e-4fb4-b0fb-580def5c8e0d - - - - - -] Device tapfcb21783-99 cannot be used as it has no MAC address#033[00m Feb 1 04:56:16 localhost nova_compute[274651]: 2026-02-01 09:56:16.283 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:16 localhost kernel: device tapfcb21783-99 entered promiscuous mode Feb 1 04:56:16 localhost NetworkManager[5964]: [1769939776.2898] manager: (tapfcb21783-99): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Feb 1 04:56:16 localhost systemd-udevd[314054]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:56:16 localhost nova_compute[274651]: 2026-02-01 09:56:16.293 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:16 localhost ovn_controller[152492]: 2026-02-01T09:56:16Z|00254|binding|INFO|Claiming lport fcb21783-999a-439a-8247-7e417db55a21 for this chassis. 
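[annotation] The "Claiming lport ... for this chassis" messages are ovn-controller reacting to an OVS interface whose external_ids:iface-id equals the logical-port name; that mapping is the entire binding trigger. A hedged sketch of the plumbing via a direct ovs-vsctl call (in this deployment the DHCP agent's OVS interface driver performs the equivalent step); bridge name br-int is an assumption, the device and lport names are taken from the log:

    # Python sketch: plug an interface so ovn-controller claims the lport.
    # Assumption: br-int as the integration bridge.
    import subprocess

    def plug_port(bridge, dev, lport):
        subprocess.run(
            ['ovs-vsctl', '--may-exist', 'add-port', bridge, dev, '--',
             'set', 'Interface', dev, 'external_ids:iface-id=%s' % lport],
            check=True)

    plug_port('br-int', 'tapfcb21783-99',
              'fcb21783-999a-439a-8247-7e417db55a21')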
Feb 1 04:56:16 localhost ovn_controller[152492]: 2026-02-01T09:56:16Z|00255|binding|INFO|fcb21783-999a-439a-8247-7e417db55a21: Claiming unknown Feb 1 04:56:16 localhost podman[314030]: 2026-02-01 09:56:16.300410994 +0000 UTC m=+0.142896327 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 04:56:16 localhost podman[314030]: 2026-02-01 09:56:16.313908289 +0000 UTC m=+0.156393582 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, version=9.7, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, release=1769056855, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z) Feb 1 04:56:16 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
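[annotation] The health_status=healthy / exec_died pair above is the transient systemd unit ("Started /usr/bin/podman healthcheck run ceac...") executing the container's configured check command and the exec session exiting. The same check can be run by hand; podman exits 0 for healthy and 1 for unhealthy:

    # Python sketch: run a podman healthcheck manually for the container
    # id taken from the log records above.
    import subprocess

    cid = 'ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb'
    result = subprocess.run(['podman', 'healthcheck', 'run', cid])
    print('healthy' if result.returncode == 0 else 'unhealthy')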
Feb 1 04:56:16 localhost ovn_controller[152492]: 2026-02-01T09:56:16Z|00256|binding|INFO|Setting lport fcb21783-999a-439a-8247-7e417db55a21 ovn-installed in OVS Feb 1 04:56:16 localhost nova_compute[274651]: 2026-02-01 09:56:16.334 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:16 localhost nova_compute[274651]: 2026-02-01 09:56:16.365 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:16 localhost nova_compute[274651]: 2026-02-01 09:56:16.390 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:16 localhost ovn_controller[152492]: 2026-02-01T09:56:16Z|00257|binding|INFO|Setting lport fcb21783-999a-439a-8247-7e417db55a21 up in Southbound Feb 1 04:56:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:16.394 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef1:c1db/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fcb21783-999a-439a-8247-7e417db55a21) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:56:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:16.394 158365 INFO neutron.agent.ovn.metadata.agent [-] Port fcb21783-999a-439a-8247-7e417db55a21 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:56:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:16.396 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2c8acb4c-81ce-46a5-8f7d-cb894b49c6bc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:56:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:16.396 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:56:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:16.397 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ee142718-3665-4463-8da5-15e2deb755d8]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:56:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:56:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e137 do_prune osdmap full prune enabled Feb 1 04:56:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e138 e138: 6 total, 6 up, 6 in Feb 1 04:56:16 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in Feb 1 04:56:17 localhost podman[314114]: Feb 1 04:56:17 localhost podman[314114]: 2026-02-01 09:56:17.235852165 +0000 UTC m=+0.084165540 container create fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:56:17 localhost systemd[1]: Started libpod-conmon-fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1.scope. Feb 1 04:56:17 localhost systemd[1]: tmp-crun.jAH35G.mount: Deactivated successfully. Feb 1 04:56:17 localhost podman[314114]: 2026-02-01 09:56:17.187223569 +0000 UTC m=+0.035536984 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:56:17 localhost systemd[1]: Started libcrun container. Feb 1 04:56:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4852e4478bbf6748ee05f392592d218984cf08c336066188c714a651744b872/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:56:17 localhost podman[314114]: 2026-02-01 09:56:17.3000539 +0000 UTC m=+0.148367235 container init fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:56:17 localhost podman[314114]: 2026-02-01 09:56:17.313830354 +0000 UTC m=+0.162143699 container start fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:56:17 localhost dnsmasq[314132]: started, version 2.85 cachesize 150 Feb 1 04:56:17 localhost dnsmasq[314132]: DNS service limited to local subnets Feb 1 04:56:17 localhost dnsmasq[314132]: compile time options: IPv6 GNU-getopt DBus 
no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:56:17 localhost dnsmasq[314132]: warning: no upstream servers configured Feb 1 04:56:17 localhost dnsmasq[314132]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:56:17 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:17.599 259320 INFO neutron.agent.dhcp.agent [None req-b1d1c1a4-cc55-40eb-b149-8f7c07daa37d - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:56:17 localhost dnsmasq[314132]: exiting on receipt of SIGTERM Feb 1 04:56:17 localhost podman[314149]: 2026-02-01 09:56:17.785386187 +0000 UTC m=+0.064673580 container kill fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Feb 1 04:56:17 localhost systemd[1]: libpod-fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1.scope: Deactivated successfully. Feb 1 04:56:17 localhost podman[314161]: 2026-02-01 09:56:17.85409143 +0000 UTC m=+0.057557951 container died fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:56:17 localhost podman[314161]: 2026-02-01 09:56:17.885688582 +0000 UTC m=+0.089155073 container cleanup fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:56:17 localhost systemd[1]: libpod-conmon-fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1.scope: Deactivated successfully. 
Feb 1 04:56:17 localhost podman[314163]: 2026-02-01 09:56:17.933606686 +0000 UTC m=+0.128327528 container remove fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:56:18 localhost systemd[1]: tmp-crun.zt9Im1.mount: Deactivated successfully. Feb 1 04:56:18 localhost systemd[1]: var-lib-containers-storage-overlay-c4852e4478bbf6748ee05f392592d218984cf08c336066188c714a651744b872-merged.mount: Deactivated successfully. Feb 1 04:56:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fea0c53d615318e62039d054a09f209967d35d974a522426e6554d4a1a2383d1-userdata-shm.mount: Deactivated successfully. Feb 1 04:56:18 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:18.673 2 INFO neutron.agent.securitygroups_rpc [None req-f9943192-ce60-4425-aa06-00cabb160f7d 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']#033[00m Feb 1 04:56:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:18.969 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:56:18 
localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:18.972 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated#033[00m Feb 1 04:56:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:18.975 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2c8acb4c-81ce-46a5-8f7d-cb894b49c6bc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:56:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:18.975 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:56:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:18.977 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d1717f-656b-4fa1-810d-be6fb7a61e8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:56:19 localhost nova_compute[274651]: 2026-02-01 09:56:19.919 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:56:20 localhost podman[314240]: Feb 1 04:56:20 localhost podman[314240]: 2026-02-01 09:56:20.231237405 +0000 UTC m=+0.071160220 container create 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:56:20 localhost systemd[1]: Started libpod-conmon-9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf.scope. Feb 1 04:56:20 localhost systemd[1]: Started libcrun container. Feb 1 04:56:20 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8480c3a183e2ef69717d9b6064cd62a1db38bb2abf54b605c5ba5449c09cd51d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:56:20 localhost podman[314240]: 2026-02-01 09:56:20.189901113 +0000 UTC m=+0.029823918 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:56:20 localhost podman[314240]: 2026-02-01 09:56:20.296834442 +0000 UTC m=+0.136757257 container init 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:56:20 localhost systemd[1]: tmp-crun.Z2OUzI.mount: Deactivated successfully. 
Feb 1 04:56:20 localhost podman[314240]: 2026-02-01 09:56:20.312854865 +0000 UTC m=+0.152777680 container start 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS) Feb 1 04:56:20 localhost dnsmasq[314258]: started, version 2.85 cachesize 150 Feb 1 04:56:20 localhost dnsmasq[314258]: DNS service limited to local subnets Feb 1 04:56:20 localhost dnsmasq[314258]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:56:20 localhost dnsmasq[314258]: warning: no upstream servers configured Feb 1 04:56:20 localhost dnsmasq-dhcp[314258]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:56:20 localhost dnsmasq[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:56:20 localhost dnsmasq-dhcp[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:56:20 localhost dnsmasq-dhcp[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:56:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e138 do_prune osdmap full prune enabled Feb 1 04:56:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e139 e139: 6 total, 6 up, 6 in Feb 1 04:56:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in Feb 1 04:56:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:56:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e139 do_prune osdmap full prune enabled Feb 1 04:56:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e140 e140: 6 total, 6 up, 6 in Feb 1 04:56:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in Feb 1 04:56:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:21.687 259320 INFO neutron.agent.dhcp.agent [None req-2bca62a5-3276-4a4c-8419-33e30c417937 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed#033[00m Feb 1 04:56:22 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:22.290 2 INFO neutron.agent.securitygroups_rpc [None req-a183bb9b-36ef-42c1-85bf-6ec8456cdf42 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:56:22 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:22.457 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:21Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=06b30752-f8da-47f0-b7d3-641e219e9a85, 
ip_allocation=immediate, mac_address=fa:16:3e:03:5a:99, name=tempest-NetworksTestDHCPv6-1211547149, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['30d98693-6315-42c7-b302-5ee4fc228b69', 'c55898c8-e0a5-4d7c-96e7-01ff3b25981d'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:16Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1951, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:22Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:56:22 localhost dnsmasq[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses Feb 1 04:56:22 localhost podman[314277]: 2026-02-01 09:56:22.804960675 +0000 UTC m=+0.058663605 container kill 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:56:22 localhost dnsmasq-dhcp[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:56:22 localhost dnsmasq-dhcp[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:56:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:23.136 259320 INFO neutron.agent.dhcp.agent [None req-a8ec2002-fcaf-4dd9-b241-5b607e9bdbe7 - - - - - -] DHCP configuration for ports {'06b30752-f8da-47f0-b7d3-641e219e9a85'} is completed#033[00m Feb 1 04:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
Feb 1 04:56:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e140 do_prune osdmap full prune enabled
Feb 1 04:56:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e141 e141: 6 total, 6 up, 6 in
Feb 1 04:56:23 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in
Feb 1 04:56:23 localhost podman[314301]: 2026-02-01 09:56:23.743226744 +0000 UTC m=+0.098597324 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 1 04:56:23 localhost podman[314301]: 2026-02-01 09:56:23.760381692 +0000 UTC m=+0.115752322 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 1 04:56:23 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:56:23 localhost podman[236886]: time="2026-02-01T09:56:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:56:23 localhost podman[236886]: @ - - [01/Feb/2026:09:56:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 163814 "" "Go-http-client/1.1"
Feb 1 04:56:24 localhost podman[236886]: @ - - [01/Feb/2026:09:56:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20724 "" "Go-http-client/1.1"
Feb 1 04:56:24 localhost nova_compute[274651]: 2026-02-01 09:56:24.921 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:56:24 localhost nova_compute[274651]: 2026-02-01 09:56:24.923 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:56:24 localhost nova_compute[274651]: 2026-02-01 09:56:24.923 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:56:24 localhost nova_compute[274651]: 2026-02-01 09:56:24.923 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:56:24 localhost nova_compute[274651]: 2026-02-01 09:56:24.967 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:24 localhost nova_compute[274651]: 2026-02-01 09:56:24.968 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:56:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:25.072 259320 INFO neutron.agent.linux.ip_lib [None req-c004b185-935c-4e4e-b903-f7294df7aadb - - - - - -] Device tapb5bf00c9-5a cannot be used as it has no MAC address
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.095 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:25.097 2 INFO neutron.agent.securitygroups_rpc [None req-f4397a72-704c-448d-a51f-22a40616a177 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:56:25 localhost kernel: device tapb5bf00c9-5a entered promiscuous mode
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.102 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost NetworkManager[5964]: [1769939785.1032] manager: (tapb5bf00c9-5a): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00258|binding|INFO|Claiming lport b5bf00c9-5ad8-4fb8-b2d6-77d03521750d for this chassis.
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00259|binding|INFO|b5bf00c9-5ad8-4fb8-b2d6-77d03521750d: Claiming unknown
Feb 1 04:56:25 localhost systemd-udevd[314331]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00260|binding|INFO|Setting lport b5bf00c9-5ad8-4fb8-b2d6-77d03521750d ovn-installed in OVS
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.140 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.141 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost journal[217584]: ethtool ioctl error on tapb5bf00c9-5a: No such device
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.167 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00261|binding|INFO|Setting lport b5bf00c9-5ad8-4fb8-b2d6-77d03521750d up in Southbound
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.184 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-4daadc64-798a-4698-bf2d-186bcf44172c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4daadc64-798a-4698-bf2d-186bcf44172c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44c9be65-a890-4bea-9d08-7728bb33562b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b5bf00c9-5ad8-4fb8-b2d6-77d03521750d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.186 158365 INFO neutron.agent.ovn.metadata.agent [-] Port b5bf00c9-5ad8-4fb8-b2d6-77d03521750d in datapath 4daadc64-798a-4698-bf2d-186bcf44172c bound to our chassis
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.188 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4daadc64-798a-4698-bf2d-186bcf44172c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.189 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[0404909c-80df-4360-9a83-3195e258edfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.196 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00262|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.300 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.338 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 96f4298b-537a-4c6d-929c-fe8bafc3b345 with type ""
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.339 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-4daadc64-798a-4698-bf2d-186bcf44172c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4daadc64-798a-4698-bf2d-186bcf44172c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=44c9be65-a890-4bea-9d08-7728bb33562b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b5bf00c9-5ad8-4fb8-b2d6-77d03521750d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00263|binding|INFO|Removing iface tapb5bf00c9-5a ovn-installed in OVS
Feb 1 04:56:25 localhost ovn_controller[152492]: 2026-02-01T09:56:25Z|00264|binding|INFO|Removing lport b5bf00c9-5ad8-4fb8-b2d6-77d03521750d ovn-installed in OVS
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.340 158365 INFO neutron.agent.ovn.metadata.agent [-] Port b5bf00c9-5ad8-4fb8-b2d6-77d03521750d in datapath 4daadc64-798a-4698-bf2d-186bcf44172c unbound from our chassis
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.342 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4daadc64-798a-4698-bf2d-186bcf44172c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:56:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:25.343 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[4c23740c-87ca-4982-a6e8-5ff3299c470c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:25 localhost nova_compute[274651]: 2026-02-01 09:56:25.345 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:25 localhost podman[314393]: 2026-02-01 09:56:25.583047691 +0000 UTC m=+0.055141276 container kill 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:56:25 localhost dnsmasq[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:25 localhost dnsmasq-dhcp[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:25 localhost dnsmasq-dhcp[314258]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e141 do_prune osdmap full prune enabled
Feb 1 04:56:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e142 e142: 6 total, 6 up, 6 in
Feb 1 04:56:25 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in
Feb 1 04:56:26 localhost podman[314443]:
Feb 1 04:56:26 localhost podman[314443]: 2026-02-01 09:56:26.060906919 +0000 UTC m=+0.105684442 container create bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:26 localhost systemd[1]: Started libpod-conmon-bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9.scope.
Feb 1 04:56:26 localhost podman[314443]: 2026-02-01 09:56:26.0128039 +0000 UTC m=+0.057581463 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:26 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36ecba6e39fde79844018ee93b57f21d5cc364a0c87b79ecf7845567622f39cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:26 localhost podman[314443]: 2026-02-01 09:56:26.163821375 +0000 UTC m=+0.208598878 container init bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 1 04:56:26 localhost podman[314443]: 2026-02-01 09:56:26.17437793 +0000 UTC m=+0.219155433 container start bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:26 localhost dnsmasq[314462]: started, version 2.85 cachesize 150
Feb 1 04:56:26 localhost dnsmasq[314462]: DNS service limited to local subnets
Feb 1 04:56:26 localhost dnsmasq[314462]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:26 localhost dnsmasq[314462]: warning: no upstream servers configured
Feb 1 04:56:26 localhost dnsmasq-dhcp[314462]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:56:26 localhost dnsmasq[314462]: read /var/lib/neutron/dhcp/4daadc64-798a-4698-bf2d-186bcf44172c/addn_hosts - 0 addresses
Feb 1 04:56:26 localhost dnsmasq-dhcp[314462]: read /var/lib/neutron/dhcp/4daadc64-798a-4698-bf2d-186bcf44172c/host
Feb 1 04:56:26 localhost dnsmasq-dhcp[314462]: read /var/lib/neutron/dhcp/4daadc64-798a-4698-bf2d-186bcf44172c/opts
Feb 1 04:56:26 localhost kernel: device tapb5bf00c9-5a left promiscuous mode
Feb 1 04:56:26 localhost nova_compute[274651]: 2026-02-01 09:56:26.331 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:26 localhost nova_compute[274651]: 2026-02-01 09:56:26.350 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.360 259320 INFO neutron.agent.dhcp.agent [None req-800cc2cb-c0c7-405f-9fe9-d23cd07c6a5f - - - - - -] DHCP configuration for ports {'20da53aa-1fe5-468d-bd1c-ff1430c625df'} is completed
Feb 1 04:56:26 localhost podman[314482]: 2026-02-01 09:56:26.513063846 +0000 UTC m=+0.054202728 container kill bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:26 localhost dnsmasq[314462]: read /var/lib/neutron/dhcp/4daadc64-798a-4698-bf2d-186bcf44172c/addn_hosts - 0 addresses
Feb 1 04:56:26 localhost dnsmasq-dhcp[314462]: read /var/lib/neutron/dhcp/4daadc64-798a-4698-bf2d-186bcf44172c/host
Feb 1 04:56:26 localhost dnsmasq-dhcp[314462]: read /var/lib/neutron/dhcp/4daadc64-798a-4698-bf2d-186bcf44172c/opts
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent [None req-c004b185-935c-4e4e-b903-f7294df7aadb - - - - - -] Unable to reload_allocations dhcp for 4daadc64-798a-4698-bf2d-186bcf44172c.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapb5bf00c9-5a not found in namespace qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c.
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name,
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent return fut.result()
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent return self.__get_result()
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent raise self._exception
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs)
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs,
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2])
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapb5bf00c9-5a not found in namespace qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c.
Feb 1 04:56:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:26.531 259320 ERROR neutron.agent.dhcp.agent
Feb 1 04:56:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:26 localhost dnsmasq[314258]: exiting on receipt of SIGTERM
Feb 1 04:56:26 localhost podman[314513]: 2026-02-01 09:56:26.617007044 +0000 UTC m=+0.042948123 container kill 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 1 04:56:26 localhost systemd[1]: libpod-9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf.scope: Deactivated successfully.
Feb 1 04:56:26 localhost podman[314527]: 2026-02-01 09:56:26.656710045 +0000 UTC m=+0.030339035 container died 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:26 localhost podman[314527]: 2026-02-01 09:56:26.674981426 +0000 UTC m=+0.048610386 container cleanup 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:26 localhost systemd[1]: libpod-conmon-9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf.scope: Deactivated successfully.
Feb 1 04:56:26 localhost podman[314533]: 2026-02-01 09:56:26.734862169 +0000 UTC m=+0.099422360 container remove 9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 1 04:56:27 localhost systemd[1]: var-lib-containers-storage-overlay-8480c3a183e2ef69717d9b6064cd62a1db38bb2abf54b605c5ba5449c09cd51d-merged.mount: Deactivated successfully.
Feb 1 04:56:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9fdcf13fd6d9328652de8508be5ea49b539462d42a98670345611912fedfdccf-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:27 localhost ovn_controller[152492]: 2026-02-01T09:56:27Z|00265|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:56:27 localhost nova_compute[274651]: 2026-02-01 09:56:27.461 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:27 localhost podman[314603]:
Feb 1 04:56:27 localhost podman[314603]: 2026-02-01 09:56:27.492588724 +0000 UTC m=+0.063305309 container create 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:27 localhost systemd[1]: Started libpod-conmon-56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f.scope.
Feb 1 04:56:27 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2b9a02631346bd6ee3d2c77ef4a9ea85462df934fac82079ba4f0ee98f3cbef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:27 localhost podman[314603]: 2026-02-01 09:56:27.558804601 +0000 UTC m=+0.129521206 container init 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:56:27 localhost podman[314603]: 2026-02-01 09:56:27.459148665 +0000 UTC m=+0.029865250 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:27 localhost podman[314603]: 2026-02-01 09:56:27.568385315 +0000 UTC m=+0.139101920 container start 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:56:27 localhost dnsmasq[314622]: started, version 2.85 cachesize 150
Feb 1 04:56:27 localhost dnsmasq[314622]: DNS service limited to local subnets
Feb 1 04:56:27 localhost dnsmasq[314622]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:27 localhost dnsmasq[314622]: warning: no upstream servers configured
Feb 1 04:56:27 localhost dnsmasq[314622]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:27 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:27.615 259320 INFO neutron.agent.dhcp.agent [None req-2d31bef9-341d-47bf-89b7-34b3f9c84401 - - - - - -] Synchronizing state
Feb 1 04:56:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:28.111 259320 INFO neutron.agent.dhcp.agent [None req-0362b137-2c0f-4b84-be9e-498c0b6536c3 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:28.131 259320 INFO neutron.agent.dhcp.agent [None req-58c34b6f-7e87-4c09-93f4-f82a70fcf5c6 - - - - - -] All active networks have been fetched through RPC.
Feb 1 04:56:28 localhost podman[314641]: 2026-02-01 09:56:28.287631417 +0000 UTC m=+0.055286572 container kill bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:56:28 localhost dnsmasq[314462]: exiting on receipt of SIGTERM
Feb 1 04:56:28 localhost systemd[1]: libpod-bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9.scope: Deactivated successfully.
Feb 1 04:56:28 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:28.349 2 INFO neutron.agent.securitygroups_rpc [None req-abdb9ca6-56bb-47f8-92cb-3bfb04a52114 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:56:28 localhost podman[314653]: 2026-02-01 09:56:28.367662308 +0000 UTC m=+0.063996579 container died bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:28 localhost podman[314653]: 2026-02-01 09:56:28.397113054 +0000 UTC m=+0.093447265 container cleanup bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:28 localhost systemd[1]: libpod-conmon-bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9.scope: Deactivated successfully.
Feb 1 04:56:28 localhost podman[314655]: 2026-02-01 09:56:28.440830079 +0000 UTC m=+0.130980790 container remove bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4daadc64-798a-4698-bf2d-186bcf44172c, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:28.639 259320 INFO neutron.agent.dhcp.agent [None req-95135ad7-31fc-43b3-80f7-a4e33f52d8f2 - - - - - -] Synchronizing state complete
Feb 1 04:56:29 localhost systemd[1]: var-lib-containers-storage-overlay-36ecba6e39fde79844018ee93b57f21d5cc364a0c87b79ecf7845567622f39cf-merged.mount: Deactivated successfully.
Feb 1 04:56:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb1b86e972a8cf852ae219e701ba150e5a83c0582447d5c51b94eac8956446a9-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:29 localhost systemd[1]: run-netns-qdhcp\x2d4daadc64\x2d798a\x2d4698\x2dbf2d\x2d186bcf44172c.mount: Deactivated successfully.
Feb 1 04:56:29 localhost podman[314697]: 2026-02-01 09:56:29.166665154 +0000 UTC m=+0.052367962 container kill 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:29 localhost dnsmasq[314622]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:29 localhost systemd[1]: tmp-crun.GYNoiC.mount: Deactivated successfully.
Feb 1 04:56:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e142 do_prune osdmap full prune enabled
Feb 1 04:56:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e143 e143: 6 total, 6 up, 6 in
Feb 1 04:56:29 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in
Feb 1 04:56:29 localhost nova_compute[274651]: 2026-02-01 09:56:29.991 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:30.147 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:30.149 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 1 04:56:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:30.151 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2c8acb4c-81ce-46a5-8f7d-cb894b49c6bc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:56:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:30.152 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:30.152 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[4443c279-1b96-4a84-afe4-302348083304]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:30.289 259320 INFO neutron.agent.dhcp.agent [None req-bbd6425a-e1ee-4296-8dd5-e92765fe8365 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:31.032 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:31 localhost nova_compute[274651]: 2026-02-01 09:56:31.032 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:31.034 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 1 04:56:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e143 do_prune osdmap full prune enabled
Feb 1 04:56:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e144 e144: 6 total, 6 up, 6 in
Feb 1 04:56:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in
Feb 1 04:56:31 localhost openstack_network_exporter[239441]: ERROR 09:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:56:31 localhost openstack_network_exporter[239441]:
Feb 1 04:56:31 localhost openstack_network_exporter[239441]: ERROR 09:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:56:31 localhost openstack_network_exporter[239441]:
Feb 1 04:56:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e144 do_prune osdmap full prune enabled
Feb 1 04:56:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e145 e145: 6 total, 6 up, 6 in
Feb 1 04:56:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in
Feb 1 04:56:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:56:31 localhost systemd[1]: tmp-crun.lbmqAj.mount: Deactivated successfully.
Feb 1 04:56:31 localhost dnsmasq[314622]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:31 localhost podman[314740]: 2026-02-01 09:56:31.728272382 +0000 UTC m=+0.071731888 container kill 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:31 localhost podman[314733]: 2026-02-01 09:56:31.719256035 +0000 UTC m=+0.079814207 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:56:31 localhost podman[314733]: 2026-02-01 09:56:31.798958046 +0000 UTC m=+0.159516218 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 04:56:31 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:56:35 localhost nova_compute[274651]: 2026-02-01 09:56:35.035 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:56:35 localhost podman[314777]: 2026-02-01 09:56:35.137284584 +0000 UTC m=+0.072983556 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:35 localhost podman[314777]: 2026-02-01 09:56:35.169227416 +0000 UTC m=+0.104926408 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 1 04:56:35 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:56:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e145 do_prune osdmap full prune enabled
Feb 1 04:56:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e146 e146: 6 total, 6 up, 6 in
Feb 1 04:56:35 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in
Feb 1 04:56:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:36.006 259320 INFO neutron.agent.dhcp.agent [None req-f6287d3a-11a8-4917-8e0f-46bcbab4d97f - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:36 localhost dnsmasq[314622]: exiting on receipt of SIGTERM
Feb 1 04:56:36 localhost podman[314815]: 2026-02-01 09:56:36.458324406 +0000 UTC m=+0.068891650 container kill 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:56:36 localhost systemd[1]: libpod-56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f.scope: Deactivated successfully.
Feb 1 04:56:36 localhost podman[314849]: 2026-02-01 09:56:36.538310576 +0000 UTC m=+0.064756153 container died 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:56:36 localhost podman[314849]: 2026-02-01 09:56:36.577970805 +0000 UTC m=+0.104416342 container cleanup 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 1 04:56:36 localhost systemd[1]: libpod-conmon-56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f.scope: Deactivated successfully.
Feb 1 04:56:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:36 localhost podman[314858]: 2026-02-01 09:56:36.61874753 +0000 UTC m=+0.136895092 container remove 56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:37.015 2 INFO neutron.agent.securitygroups_rpc [None req-ea765bd5-9cd7-4b20-a560-8c3da0273449 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:56:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:56:37 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:56:37 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:56:37 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:56:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e146 do_prune osdmap full prune enabled
Feb 1 04:56:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e147 e147: 6 total, 6 up, 6 in
Feb 1 04:56:37 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in
Feb 1 04:56:37 localhost systemd[1]: var-lib-containers-storage-overlay-e2b9a02631346bd6ee3d2c77ef4a9ea85462df934fac82079ba4f0ee98f3cbef-merged.mount: Deactivated successfully.
Feb 1 04:56:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56eb73dbd4d02e85481074f8481565d614b7f97e80be9735b250209b9163016f-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e147 do_prune osdmap full prune enabled
Feb 1 04:56:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e148 e148: 6 total, 6 up, 6 in
Feb 1 04:56:38 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in
Feb 1 04:56:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:38.626 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:38.629 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 1 04:56:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:38.632 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2c8acb4c-81ce-46a5-8f7d-cb894b49c6bc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:56:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:38.633 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:38.634 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[05646e4e-f2b6-4a4b-ab60-97deff726244]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:56:38 localhost podman[314940]: 2026-02-01 09:56:38.715040756 +0000 UTC m=+0.070114288 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:56:38 localhost podman[314940]: 2026-02-01 09:56:38.7304454 +0000 UTC m=+0.085518982 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 04:56:38 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:56:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:56:39 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3715535998' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:56:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:56:39 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3715535998' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:56:39 localhost dnsmasq[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/addn_hosts - 0 addresses
Feb 1 04:56:39 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/host
Feb 1 04:56:39 localhost podman[315006]: 2026-02-01 09:56:39.863633403 +0000 UTC m=+0.060071608 container kill c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:56:39 localhost dnsmasq-dhcp[312854]: read /var/lib/neutron/dhcp/73d40e5d-eb5c-4a0d-bdff-f74d931fe379/opts
Feb 1 04:56:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:56:39 localhost systemd[1]: tmp-crun.9gE4SW.mount: Deactivated successfully.
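The node_exporter entries above show the healthcheck cycle end to end: systemd starts a transient unit for /usr/bin/podman healthcheck run <container-id>, podman executes the configured test inside the container, records health_status=healthy, and the unit deactivates. A minimal sketch of driving the same check by hand from Python; the container name is taken from the log, and the exit-code convention (0 = passing) is podman's documented behavior for `podman healthcheck run`:

import subprocess

def run_podman_healthcheck(container: str) -> bool:
    """Run the container's configured healthcheck once, the way the
    transient systemd unit in the log does. `podman healthcheck run`
    exits 0 when the check passes and non-zero otherwise."""
    result = subprocess.run(
        ["podman", "healthcheck", "run", container],
        capture_output=True, text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    # Container name taken from the log entries above.
    print("healthy" if run_podman_healthcheck("node_exporter") else "unhealthy")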
Feb 1 04:56:39 localhost podman[315034]: 2026-02-01 09:56:39.998647825 +0000 UTC m=+0.107590019 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.037 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.039 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.040 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.040 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:56:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:40.066 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.066 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.067 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 04:56:40 localhost podman[315034]: 2026-02-01 09:56:40.086299812 +0000 UTC m=+0.195241976 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:56:40 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:56:40 localhost podman[315052]:
Feb 1 04:56:40 localhost podman[315052]: 2026-02-01 09:56:40.116614944 +0000 UTC m=+0.149112557 container create b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:56:40 localhost systemd[1]: Started libpod-conmon-b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb.scope.
Feb 1 04:56:40 localhost systemd[1]: Started libcrun container.
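The DbSetCommand transaction logged at 09:56:40.066 is the metadata agent acknowledging the southbound config sequence number by writing neutron:ovn-metadata-sb-cfg into its Chassis_Private row. A minimal ovsdbapp sketch of an equivalent write, assuming a southbound DB reachable at tcp:127.0.0.1:6642; the record UUID and key are taken from the log entry, and this is an illustration rather than the agent's actual code path:

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.ovn_southbound import impl_idl

# Assumed endpoint; the agent itself uses its configured SB connection string.
SB_ENDPOINT = "tcp:127.0.0.1:6642"

idl = connection.OvsdbIdl.from_server(SB_ENDPOINT, "OVN_Southbound")
api = impl_idl.OvnSbApiIdlImpl(connection.Connection(idl, timeout=10))

# Mirror of the logged DbSetCommand: merge one key into external_ids
# (the log's command also carried if_exists=True to tolerate a missing row).
api.db_set(
    "Chassis_Private",
    "e1d14e36-ae9d-43b6-8933-f137b54529ff",  # record UUID from the log entry
    ("external_ids", {"neutron:ovn-metadata-sb-cfg": "13"}),
).execute(check_error=True)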
Feb 1 04:56:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16ca5f87961617b89df7bd993bd8fdb66905b2667bc3acc8de9ae147cf59809c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:40 localhost podman[315052]: 2026-02-01 09:56:40.157172382 +0000 UTC m=+0.189670015 container init b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:40 localhost podman[315052]: 2026-02-01 09:56:40.165698384 +0000 UTC m=+0.198196017 container start b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:40 localhost podman[315052]: 2026-02-01 09:56:40.081505454 +0000 UTC m=+0.114003087 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:40 localhost dnsmasq[315088]: started, version 2.85 cachesize 150
Feb 1 04:56:40 localhost dnsmasq[315088]: DNS service limited to local subnets
Feb 1 04:56:40 localhost dnsmasq[315088]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:40 localhost dnsmasq[315088]: warning: no upstream servers configured
Feb 1 04:56:40 localhost dnsmasq-dhcp[315088]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:56:40 localhost dnsmasq-dhcp[315088]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:56:40 localhost dnsmasq[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:40 localhost dnsmasq-dhcp[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:40 localhost dnsmasq-dhcp[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:40 localhost ovn_controller[152492]: 2026-02-01T09:56:40Z|00266|binding|INFO|Releasing lport 28590e08-17c9-4156-a0ba-f3d9e4f54942 from this chassis (sb_readonly=0)
Feb 1 04:56:40 localhost kernel: device tap28590e08-17 left promiscuous mode
Feb 1 04:56:40 localhost ovn_controller[152492]: 2026-02-01T09:56:40Z|00267|binding|INFO|Setting lport 28590e08-17c9-4156-a0ba-f3d9e4f54942 down in Southbound
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.313 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:40 localhost nova_compute[274651]: 2026-02-01 09:56:40.332 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e148 do_prune osdmap full prune enabled
Feb 1 04:56:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e149 e149: 6 total, 6 up, 6 in
Feb 1 04:56:40 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in
Feb 1 04:56:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:40.590 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-73d40e5d-eb5c-4a0d-bdff-f74d931fe379', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-73d40e5d-eb5c-4a0d-bdff-f74d931fe379', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80942481-52b4-4969-ab38-623a6bc77eb5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=28590e08-17c9-4156-a0ba-f3d9e4f54942) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:40.592 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 28590e08-17c9-4156-a0ba-f3d9e4f54942 in datapath 73d40e5d-eb5c-4a0d-bdff-f74d931fe379 unbound from our chassis
Feb 1 04:56:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:40.593 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 73d40e5d-eb5c-4a0d-bdff-f74d931fe379 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:56:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:40.594 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[86e48740-2ec2-4fd2-a4c4-62ed66340437]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:40.819 259320 INFO neutron.agent.dhcp.agent [None req-1dcf2840-35e3-4599-ae74-d5fa8f66a774 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:41 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:41.529 2 INFO neutron.agent.securitygroups_rpc [None req-fb00d927-fa6d-4c8a-857b-3eb5803ded56 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:56:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:56:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:56:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:41.718 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:56:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:41.719 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:56:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:41.719 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:56:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:41.739 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:40Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=4b192acf-9b70-4ce5-928b-1bba489214aa, ip_allocation=immediate, mac_address=fa:16:3e:df:f6:cc, name=tempest-NetworksTestDHCPv6-389374155, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['9df99b5f-2b14-43c9-956d-26728f9d6576', 'a0742fa7-f390-4a5e-bb3f-0735ad42191c'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:34Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=1985, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:40Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:56:41 localhost podman[315107]: 2026-02-01 09:56:41.971710182 +0000 UTC m=+0.048900885 container kill b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:41 localhost dnsmasq[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses
Feb 1 04:56:41 localhost dnsmasq-dhcp[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:41 localhost dnsmasq-dhcp[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:42.302 259320 INFO neutron.agent.dhcp.agent [None req-7bf7ea04-c132-48ba-8ee0-759554af2af8 - - - - - -] DHCP configuration for ports {'4b192acf-9b70-4ce5-928b-1bba489214aa'} is completed
Feb 1 04:56:42 localhost dnsmasq[312854]: exiting on receipt of SIGTERM
Feb 1 04:56:42 localhost podman[315147]: 2026-02-01 09:56:42.478786278 +0000 UTC m=+0.059701457 container kill c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:42 localhost systemd[1]: tmp-crun.Zvr22L.mount: Deactivated successfully.
Feb 1 04:56:42 localhost systemd[1]: libpod-c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf.scope: Deactivated successfully.
Feb 1 04:56:42 localhost podman[315159]: 2026-02-01 09:56:42.53965441 +0000 UTC m=+0.046471160 container died c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:42 localhost podman[315159]: 2026-02-01 09:56:42.576277876 +0000 UTC m=+0.083094596 container cleanup c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:56:42 localhost systemd[1]: libpod-conmon-c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf.scope: Deactivated successfully.
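Each "read .../addn_hosts - N addresses" line above is dnsmasq re-reading the per-network files the DHCP agent regenerates under /var/lib/neutron/dhcp/<network_id>/; the adjacent podman "container kill" entries correspond to the agent signalling the containerized dnsmasq (a reload signal for a config refresh, SIGTERM when the network is torn down, as the "exiting on receipt of SIGTERM" lines show). A small sketch for inspecting those files on the host, assuming only the directory layout visible in the log:

from pathlib import Path

# Network UUID taken from the log entries above.
NET = "cba39058-6a05-4f77-add1-57334b728a66"
base = Path("/var/lib/neutron/dhcp") / NET

# The three files dnsmasq reports re-reading: extra hostnames,
# static lease entries, and per-tag DHCP options.
for name in ("addn_hosts", "host", "opts"):
    path = base / name
    text = path.read_text() if path.exists() else ""
    entries = [line for line in text.splitlines() if line.strip()]
    print(f"{path}: {len(entries)} entries")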
Feb 1 04:56:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:42.584 2 INFO neutron.agent.securitygroups_rpc [None req-868efe44-cb2f-4cd4-8b32-db45306b68ea 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:56:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e149 do_prune osdmap full prune enabled
Feb 1 04:56:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:56:42 localhost podman[315166]: 2026-02-01 09:56:42.64986816 +0000 UTC m=+0.145130345 container remove c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-73d40e5d-eb5c-4a0d-bdff-f74d931fe379, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:56:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e150 e150: 6 total, 6 up, 6 in
Feb 1 04:56:42 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in
Feb 1 04:56:42 localhost systemd[1]: var-lib-containers-storage-overlay-e91a7f6b4f1a742de3bcc67c82eeef8a0a1edeaa5481c47c75b4b71995a59e32-merged.mount: Deactivated successfully.
Feb 1 04:56:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5e6ffa513050a19ca6b9de87bcace033fd80be4bd3dd3899cea63a1c2bcbfdf-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:43 localhost systemd[1]: run-netns-qdhcp\x2d73d40e5d\x2deb5c\x2d4a0d\x2dbdff\x2df74d931fe379.mount: Deactivated successfully.
Feb 1 04:56:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:43.442 259320 INFO neutron.agent.dhcp.agent [None req-d7436940-fc82-4de0-9fd9-f12fc330b931 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:56:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:43.443 259320 INFO neutron.agent.dhcp.agent [None req-d7436940-fc82-4de0-9fd9-f12fc330b931 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:56:43 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:43.572 2 INFO neutron.agent.securitygroups_rpc [None req-2270150a-743e-4e11-8e45-671bacf25871 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:56:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e150 do_prune osdmap full prune enabled
Feb 1 04:56:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e151 e151: 6 total, 6 up, 6 in
Feb 1 04:56:43 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in
Feb 1 04:56:43 localhost dnsmasq[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:43 localhost dnsmasq-dhcp[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:43 localhost podman[315201]: 2026-02-01 09:56:43.967655612 +0000 UTC m=+0.041257561 container kill b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:56:43 localhost dnsmasq-dhcp[315088]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:44.416 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:56:44 localhost ovn_controller[152492]: 2026-02-01T09:56:44Z|00268|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:56:44 localhost nova_compute[274651]: 2026-02-01 09:56:44.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:45 localhost nova_compute[274651]: 2026-02-01 09:56:45.066 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:45 localhost nova_compute[274651]: 2026-02-01 09:56:45.068 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:45 localhost systemd[1]: tmp-crun.gLISuk.mount: Deactivated successfully.
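The recurring ceph-mon lines ("osdmap eNNN: 6 total, 6 up, 6 in") are the monitor publishing new osdmap epochs as the mgr works through its command queue. A quick way to read the same summary programmatically; a sketch assuming the ceph CLI and a usable client keyring on the host, with the JSON keys as commonly returned by this command rather than taken from this deployment:

import json
import subprocess

# `ceph osd stat --format json` returns the current osdmap summary,
# e.g. {"epoch": 151, "num_osds": 6, "num_up_osds": 6, "num_in_osds": 6, ...}
out = subprocess.check_output(["ceph", "osd", "stat", "--format", "json"])
stat = json.loads(out)
print(f"osdmap e{stat['epoch']}: {stat['num_osds']} total, "
      f"{stat['num_up_osds']} up, {stat['num_in_osds']} in")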
Feb 1 04:56:45 localhost dnsmasq[315088]: exiting on receipt of SIGTERM
Feb 1 04:56:45 localhost podman[315240]: 2026-02-01 09:56:45.559573394 +0000 UTC m=+0.057725736 container kill b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:56:45 localhost systemd[1]: libpod-b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb.scope: Deactivated successfully.
Feb 1 04:56:45 localhost podman[315255]: 2026-02-01 09:56:45.613671559 +0000 UTC m=+0.039970751 container died b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:56:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:45 localhost podman[315255]: 2026-02-01 09:56:45.642567917 +0000 UTC m=+0.068867029 container remove b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:45 localhost systemd[1]: libpod-conmon-b2c0815c694247a52c2034ae54fab72b703963d089a381d4ddc61623694c21fb.scope: Deactivated successfully.
Feb 1 04:56:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e151 do_prune osdmap full prune enabled
Feb 1 04:56:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e152 e152: 6 total, 6 up, 6 in
Feb 1 04:56:45 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in
Feb 1 04:56:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:56:46 localhost podman[315332]:
Feb 1 04:56:46 localhost podman[315344]: 2026-02-01 09:56:46.436487476 +0000 UTC m=+0.052031441 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, build-date=2026-01-22T05:09:47Z, release=1769056855, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 1 04:56:46 localhost podman[315344]: 2026-02-01 09:56:46.447316819 +0000 UTC m=+0.062860774 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, architecture=x86_64, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Feb 1 04:56:46 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:56:46 localhost podman[315332]: 2026-02-01 09:56:46.468722428 +0000 UTC m=+0.118235298 container create 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:56:46 localhost podman[315332]: 2026-02-01 09:56:46.386902601 +0000 UTC m=+0.036415521 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:46 localhost systemd[1]: Started libpod-conmon-5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541.scope.
Feb 1 04:56:46 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8d3ceb174d65cf5b5852563d5c505c36a4857cf1ea4cd0aad92f611657c4f7d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:46 localhost podman[315332]: 2026-02-01 09:56:46.510878744 +0000 UTC m=+0.160391614 container init 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:56:46 localhost podman[315332]: 2026-02-01 09:56:46.516487767 +0000 UTC m=+0.166000627 container start 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:56:46 localhost dnsmasq[315370]: started, version 2.85 cachesize 150
Feb 1 04:56:46 localhost dnsmasq[315370]: DNS service limited to local subnets
Feb 1 04:56:46 localhost dnsmasq[315370]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:46 localhost dnsmasq[315370]: warning: no upstream servers configured
Feb 1 04:56:46 localhost dnsmasq-dhcp[315370]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:56:46 localhost dnsmasq[315370]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:46 localhost dnsmasq-dhcp[315370]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:46 localhost dnsmasq-dhcp[315370]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:46 localhost systemd[1]: var-lib-containers-storage-overlay-16ca5f87961617b89df7bd993bd8fdb66905b2667bc3acc8de9ae147cf59809c-merged.mount: Deactivated successfully.
Feb 1 04:56:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e152 do_prune osdmap full prune enabled
Feb 1 04:56:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e153 e153: 6 total, 6 up, 6 in
Feb 1 04:56:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in
Feb 1 04:56:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:46.786 259320 INFO neutron.agent.dhcp.agent [None req-9e8a8b1a-fa65-4505-a489-3fc55eaa9eca - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:46 localhost dnsmasq[315370]: exiting on receipt of SIGTERM
Feb 1 04:56:46 localhost systemd[1]: libpod-5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541.scope: Deactivated successfully.
Feb 1 04:56:46 localhost podman[315389]: 2026-02-01 09:56:46.803939578 +0000 UTC m=+0.047840433 container kill 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:56:46 localhost podman[315404]: 2026-02-01 09:56:46.866469101 +0000 UTC m=+0.048254175 container died 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:56:46 localhost podman[315404]: 2026-02-01 09:56:46.891918754 +0000 UTC m=+0.073703748 container cleanup 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 1 04:56:46 localhost systemd[1]: libpod-conmon-5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541.scope: Deactivated successfully.
Feb 1 04:56:46 localhost podman[315406]: 2026-02-01 09:56:46.94576179 +0000 UTC m=+0.119522317 container remove 5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:56:47 localhost dnsmasq[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/addn_hosts - 0 addresses
Feb 1 04:56:47 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/host
Feb 1 04:56:47 localhost dnsmasq-dhcp[312479]: read /var/lib/neutron/dhcp/1e8b1bba-e8b9-4795-804e-ff4f5e0f095e/opts
Feb 1 04:56:47 localhost podman[315451]: 2026-02-01 09:56:47.051907225 +0000 UTC m=+0.052203947 container kill e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:56:47 localhost ovn_controller[152492]: 2026-02-01T09:56:47Z|00269|binding|INFO|Releasing lport f9a6dcb6-d460-4dab-9910-ecc3d47fa694 from this chassis (sb_readonly=0)
Feb 1 04:56:47 localhost kernel: device tapf9a6dcb6-d4 left promiscuous mode
Feb 1 04:56:47 localhost ovn_controller[152492]: 2026-02-01T09:56:47Z|00270|binding|INFO|Setting lport f9a6dcb6-d460-4dab-9910-ecc3d47fa694 down in Southbound
Feb 1 04:56:47 localhost nova_compute[274651]: 2026-02-01 09:56:47.192 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:47.208 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3d7ee746-bdf6-4729-b69a-ed9e0d539761, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f9a6dcb6-d460-4dab-9910-ecc3d47fa694) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:47.210 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f9a6dcb6-d460-4dab-9910-ecc3d47fa694 in datapath 1e8b1bba-e8b9-4795-804e-ff4f5e0f095e unbound from our chassis
Feb 1 04:56:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:47.211 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1e8b1bba-e8b9-4795-804e-ff4f5e0f095e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:56:47 localhost nova_compute[274651]: 2026-02-01 09:56:47.212 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:47 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:47.212 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f2842d28-4b27-42ba-8a92-305f9794ea08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:47 localhost systemd[1]: var-lib-containers-storage-overlay-b8d3ceb174d65cf5b5852563d5c505c36a4857cf1ea4cd0aad92f611657c4f7d-merged.mount: Deactivated successfully.
Feb 1 04:56:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f5f26b3e95e8e44e5c8c94348f6ccfb9ee4e0986f4723a674086c578383d541-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e153 do_prune osdmap full prune enabled
Feb 1 04:56:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e154 e154: 6 total, 6 up, 6 in
Feb 1 04:56:47 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in
Feb 1 04:56:48 localhost sshd[315475]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 04:56:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:48.533 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:48.536 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 1 04:56:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:48.539 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2c8acb4c-81ce-46a5-8f7d-cb894b49c6bc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:56:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:48.540 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:48.541 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2539a5-7f3a-4a90-be8e-01e739af566c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e154 do_prune osdmap full prune enabled
Feb 1 04:56:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e155 e155: 6 total, 6 up, 6 in
Feb 1 04:56:48 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in
Feb 1 04:56:48 localhost dnsmasq[312479]: exiting on receipt of SIGTERM
Feb 1 04:56:48 localhost podman[315493]: 2026-02-01 09:56:48.894092035 +0000 UTC m=+0.065212137 container kill e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:56:48 localhost systemd[1]: libpod-e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8.scope: Deactivated successfully.
Feb 1 04:56:48 localhost podman[315509]: 2026-02-01 09:56:48.964703647 +0000 UTC m=+0.051644380 container died e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:48 localhost systemd[1]: tmp-crun.jJ9uCP.mount: Deactivated successfully.
Feb 1 04:56:49 localhost podman[315509]: 2026-02-01 09:56:49.002102967 +0000 UTC m=+0.089043670 container remove e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1e8b1bba-e8b9-4795-804e-ff4f5e0f095e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:49 localhost systemd[1]: libpod-conmon-e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8.scope: Deactivated successfully.
Feb 1 04:56:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:49.568 259320 INFO neutron.agent.dhcp.agent [None req-6edd4f9e-5b95-4310-b1af-4003c8ba3831 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:56:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:49.569 259320 INFO neutron.agent.dhcp.agent [None req-6edd4f9e-5b95-4310-b1af-4003c8ba3831 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:56:49 localhost podman[315579]:
Feb 1 04:56:49 localhost podman[315579]: 2026-02-01 09:56:49.78477552 +0000 UTC m=+0.080431965 container create f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:56:49 localhost systemd[1]: Started libpod-conmon-f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81.scope.
Feb 1 04:56:49 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af8eb752bb87e14b9de406849b0397770f242fda603f048da7a2d5100e9630d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:49 localhost podman[315579]: 2026-02-01 09:56:49.842502065 +0000 UTC m=+0.138158540 container init f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:49 localhost podman[315579]: 2026-02-01 09:56:49.751586539 +0000 UTC m=+0.047243014 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:49 localhost podman[315579]: 2026-02-01 09:56:49.850835892 +0000 UTC m=+0.146492337 container start f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:49 localhost dnsmasq[315597]: started, version 2.85 cachesize 150
Feb 1 04:56:49 localhost dnsmasq[315597]: DNS service limited to local subnets
Feb 1 04:56:49 localhost dnsmasq[315597]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:49 localhost dnsmasq[315597]: warning: no upstream servers configured
Feb 1 04:56:49 localhost dnsmasq-dhcp[315597]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:56:49 localhost dnsmasq[315597]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:49 localhost dnsmasq-dhcp[315597]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:49 localhost dnsmasq-dhcp[315597]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:49 localhost systemd[1]: var-lib-containers-storage-overlay-14eeb768cd5f5ee5a15fbfa973cb5a7f11aa2d43bfb880403de94f716b9f01f8-merged.mount: Deactivated successfully.
Feb 1 04:56:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5f1602deb062408897d91df2814a626c253b14a037b89fe444381af057c62a8-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:49 localhost systemd[1]: run-netns-qdhcp\x2d1e8b1bba\x2de8b9\x2d4795\x2d804e\x2dff4f5e0f095e.mount: Deactivated successfully.
Feb 1 04:56:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:50.053 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:56:50 localhost nova_compute[274651]: 2026-02-01 09:56:50.102 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:50.211 259320 INFO neutron.agent.dhcp.agent [None req-422a2d05-d372-4ef0-8da1-789f277c6af1 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:50 localhost podman[315615]: 2026-02-01 09:56:50.233064149 +0000 UTC m=+0.058001906 container kill f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:50 localhost dnsmasq[315597]: exiting on receipt of SIGTERM
Feb 1 04:56:50 localhost systemd[1]: libpod-f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81.scope: Deactivated successfully.
Feb 1 04:56:50 localhost podman[315630]: 2026-02-01 09:56:50.297710357 +0000 UTC m=+0.050298138 container died f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:50 localhost podman[315630]: 2026-02-01 09:56:50.339856333 +0000 UTC m=+0.092444044 container remove f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:56:50 localhost systemd[1]: libpod-conmon-f5863a263b36437270013c80cce07a4714358137789212f53ab7bb2245b75f81.scope: Deactivated successfully.
Feb 1 04:56:50 localhost ovn_controller[152492]: 2026-02-01T09:56:50Z|00271|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:56:50 localhost nova_compute[274651]: 2026-02-01 09:56:50.548 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:50 localhost systemd[1]: var-lib-containers-storage-overlay-af8eb752bb87e14b9de406849b0397770f242fda603f048da7a2d5100e9630d6-merged.mount: Deactivated successfully.
Feb 1 04:56:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:51.099 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:51.102 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 1 04:56:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:51.105 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2c8acb4c-81ce-46a5-8f7d-cb894b49c6bc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:56:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:51.105 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:51.106 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d93a7179-162f-4c55-8f56-d206711d6b27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:51 localhost nova_compute[274651]: 2026-02-01 09:56:51.297 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e155 do_prune osdmap full prune enabled
Feb 1 04:56:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e156 e156: 6 total, 6 up, 6 in
Feb 1 04:56:51 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in
Feb 1 04:56:51 localhost podman[315703]:
Feb 1 04:56:51 localhost podman[315703]: 2026-02-01 09:56:51.809317219 +0000 UTC m=+0.075223905 container create a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:56:51 localhost systemd[1]: Started libpod-conmon-a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d.scope.
Feb 1 04:56:51 localhost systemd[1]: tmp-crun.HI7HQr.mount: Deactivated successfully.
Feb 1 04:56:51 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a8133601d1cc91d96d4027c60af6916855363cd577dda59f3a222e655f22fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:51 localhost podman[315703]: 2026-02-01 09:56:51.874588067 +0000 UTC m=+0.140494753 container init a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:51 localhost podman[315703]: 2026-02-01 09:56:51.779778821 +0000 UTC m=+0.045685487 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:51 localhost podman[315703]: 2026-02-01 09:56:51.883159461 +0000 UTC m=+0.149066157 container start a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:51 localhost dnsmasq[315721]: started, version 2.85 cachesize 150
Feb 1 04:56:51 localhost dnsmasq[315721]: DNS service limited to local subnets
Feb 1 04:56:51 localhost dnsmasq[315721]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:51 localhost dnsmasq[315721]: warning: no upstream servers configured
Feb 1 04:56:51 localhost dnsmasq-dhcp[315721]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:56:51 localhost dnsmasq[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:51 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:51 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.273 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:56:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:52.350 259320 INFO neutron.agent.dhcp.agent [None req-cf97f498-3d3a-4ecb-8f6c-c8093c82f88a - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:52 localhost dnsmasq[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:52 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:52 localhost podman[315739]: 2026-02-01 09:56:52.428883505 +0000 UTC m=+0.059929074 container kill a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:52 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.644 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.644 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.644 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:56:52 localhost nova_compute[274651]: 2026-02-01 09:56:52.645 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:56:52 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:52.731 2 INFO neutron.agent.securitygroups_rpc [None req-28352136-f461-4efb-990d-d0ac566ee992 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:56:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:52.836 259320 INFO neutron.agent.dhcp.agent [None req-6a898be1-7e65-4698-86ae-511d129d03cd - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:52.996 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:52Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=2ce76f41-0ea6-4ad4-a265-73b871c9a606, ip_allocation=immediate, mac_address=fa:16:3e:9f:ba:1e, name=tempest-NetworksTestDHCPv6-2118048400, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['5e930a34-f7bd-4b36-8b88-b06914583593', '97405e9f-2c3a-4006-bf03-a402f912e9f9'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:49Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2069, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:56:52Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:56:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:53.049 259320 INFO neutron.agent.linux.ip_lib [None req-dae518c4-ea19-4051-813f-56ad5634082e - - - - - -] Device tap77af7c8b-62 cannot be used as it has no MAC address
Feb 1 04:56:53 localhost nova_compute[274651]: 2026-02-01 09:56:53.086 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:53 localhost kernel: device tap77af7c8b-62 entered promiscuous mode
Feb 1 04:56:53 localhost NetworkManager[5964]: [1769939813.0963] manager: (tap77af7c8b-62): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Feb 1 04:56:53 localhost nova_compute[274651]: 2026-02-01 09:56:53.097 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:53 localhost ovn_controller[152492]: 2026-02-01T09:56:53Z|00272|binding|INFO|Claiming lport 77af7c8b-62eb-4cdb-8b63-33a866118616 for this chassis.
Feb 1 04:56:53 localhost ovn_controller[152492]: 2026-02-01T09:56:53Z|00273|binding|INFO|77af7c8b-62eb-4cdb-8b63-33a866118616: Claiming unknown
Feb 1 04:56:53 localhost systemd-udevd[315778]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:56:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:53.109 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-db1840b6-7d8b-4013-902e-08542d02ed01', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db1840b6-7d8b-4013-902e-08542d02ed01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55f65e86-8540-4a00-a594-b0afd7111cc6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=77af7c8b-62eb-4cdb-8b63-33a866118616) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:53.112 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 77af7c8b-62eb-4cdb-8b63-33a866118616 in datapath db1840b6-7d8b-4013-902e-08542d02ed01 bound to our chassis
Feb 1 04:56:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:53.113 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network db1840b6-7d8b-4013-902e-08542d02ed01 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:56:53 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:53.114 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[a869ce37-6f9b-42e1-9977-3e491bd4f066]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost ovn_controller[152492]: 2026-02-01T09:56:53Z|00274|binding|INFO|Setting lport 77af7c8b-62eb-4cdb-8b63-33a866118616 ovn-installed in OVS
Feb 1 04:56:53 localhost ovn_controller[152492]: 2026-02-01T09:56:53Z|00275|binding|INFO|Setting lport 77af7c8b-62eb-4cdb-8b63-33a866118616 up in Southbound
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost nova_compute[274651]: 2026-02-01 09:56:53.135 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost journal[217584]: ethtool ioctl error on tap77af7c8b-62: No such device
Feb 1 04:56:53 localhost nova_compute[274651]: 2026-02-01 09:56:53.185 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:53 localhost nova_compute[274651]: 2026-02-01 09:56:53.213 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:53 localhost systemd[1]: tmp-crun.SlEuqW.mount: Deactivated successfully.
Feb 1 04:56:53 localhost dnsmasq[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses
Feb 1 04:56:53 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:53 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:53 localhost podman[315811]: 2026-02-01 09:56:53.26589973 +0000 UTC m=+0.064126474 container kill a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 1 04:56:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:53.591 259320 INFO neutron.agent.dhcp.agent [None req-dd448236-fe91-462a-b696-cd37b2da1da3 - - - - - -] DHCP configuration for ports {'2ce76f41-0ea6-4ad4-a265-73b871c9a606'} is completed
Feb 1 04:56:53 localhost podman[236886]: time="2026-02-01T09:56:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:56:53 localhost podman[236886]: @ - - [01/Feb/2026:09:56:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160178 "" "Go-http-client/1.1"
Feb 1 04:56:54 localhost podman[236886]: @ - - [01/Feb/2026:09:56:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19766 "" "Go-http-client/1.1"
Feb 1 04:56:54 localhost podman[315880]:
Feb 1 04:56:54 localhost podman[315880]: 2026-02-01 09:56:54.153529011 +0000 UTC m=+0.086402019 container create 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:56:54 localhost systemd[1]: Started libpod-conmon-3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f.scope.
Feb 1 04:56:54 localhost systemd[1]: tmp-crun.FJOSqm.mount: Deactivated successfully.
Feb 1 04:56:54 localhost podman[315880]: 2026-02-01 09:56:54.111929861 +0000 UTC m=+0.044802909 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:54 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51549f37e92e24acbb890150e6a7fb318192987c43c49ef54d88f7e0d48dfcd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:54 localhost podman[315880]: 2026-02-01 09:56:54.236671118 +0000 UTC m=+0.169544166 container init 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:56:54 localhost dnsmasq[315910]: started, version 2.85 cachesize 150
Feb 1 04:56:54 localhost dnsmasq[315910]: DNS service limited to local subnets
Feb 1 04:56:54 localhost dnsmasq[315910]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:54 localhost dnsmasq[315910]: warning: no upstream servers configured
Feb 1 04:56:54 localhost dnsmasq-dhcp[315910]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:56:54 localhost dnsmasq[315910]: read /var/lib/neutron/dhcp/db1840b6-7d8b-4013-902e-08542d02ed01/addn_hosts - 0 addresses
Feb 1 04:56:54 localhost dnsmasq-dhcp[315910]: read /var/lib/neutron/dhcp/db1840b6-7d8b-4013-902e-08542d02ed01/host
Feb 1 04:56:54 localhost dnsmasq-dhcp[315910]: read /var/lib/neutron/dhcp/db1840b6-7d8b-4013-902e-08542d02ed01/opts
Feb 1 04:56:54 localhost podman[315894]: 2026-02-01 09:56:54.275672608 +0000 UTC m=+0.083469219 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:56:54 localhost nova_compute[274651]: 2026-02-01 09:56:54.295 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:56:54 localhost podman[315880]: 2026-02-01 09:56:54.29722206 +0000 UTC m=+0.230095068 container start 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:56:54 localhost podman[315894]: 2026-02-01 09:56:54.311719307 +0000 UTC m=+0.119515958 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 1 04:56:54 localhost nova_compute[274651]: 2026-02-01 09:56:54.316 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:56:54 localhost nova_compute[274651]: 2026-02-01 09:56:54.317 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:56:54 localhost nova_compute[274651]: 2026-02-01 09:56:54.317 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:54 localhost nova_compute[274651]: 2026-02-01 09:56:54.318 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:54 localhost nova_compute[274651]: 2026-02-01 09:56:54.318 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 04:56:54 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:56:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:54.384 2 INFO neutron.agent.securitygroups_rpc [None req-beca9d48-bb68-446a-9132-2fee37d11230 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:56:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:54.513 259320 INFO neutron.agent.dhcp.agent [None req-7e6387d1-8e86-4112-a870-4ba90e65044d - - - - - -] DHCP configuration for ports {'574c9c28-e864-41dc-b05a-e133dc730c4f'} is completed
Feb 1 04:56:54 localhost dnsmasq[315910]: read /var/lib/neutron/dhcp/db1840b6-7d8b-4013-902e-08542d02ed01/addn_hosts - 0 addresses
Feb 1 04:56:54 localhost dnsmasq-dhcp[315910]: read /var/lib/neutron/dhcp/db1840b6-7d8b-4013-902e-08542d02ed01/host
Feb 1 04:56:54 localhost dnsmasq-dhcp[315910]: read /var/lib/neutron/dhcp/db1840b6-7d8b-4013-902e-08542d02ed01/opts
Feb 1 04:56:54 localhost podman[315950]: 2026-02-01 09:56:54.70490341 +0000 UTC m=+0.068302212 container kill 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 1 04:56:54 localhost dnsmasq[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:54 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:56:54 localhost podman[315963]: 2026-02-01 09:56:54.763182562 +0000 UTC m=+0.065803235 container kill a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:56:54 localhost dnsmasq-dhcp[315721]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:56:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:54.997 259320 INFO neutron.agent.dhcp.agent [None req-6e42f5cc-1769-4af6-88f7-d88b9e1f92d1 - - - - - -] DHCP configuration for ports {'574c9c28-e864-41dc-b05a-e133dc730c4f', '77af7c8b-62eb-4cdb-8b63-33a866118616'} is completed
Feb 1 04:56:55 localhost nova_compute[274651]: 2026-02-01 09:56:55.128 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:55 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:55.249 2 INFO neutron.agent.securitygroups_rpc [None req-4f603dff-697a-4023-bd41-ff0e5bb72114 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 1 04:56:55 localhost nova_compute[274651]: 2026-02-01 09:56:55.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:55 localhost nova_compute[274651]: 2026-02-01 09:56:55.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:55 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:55.401 2 INFO neutron.agent.securitygroups_rpc [None req-4f603dff-697a-4023-bd41-ff0e5bb72114 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 1 04:56:55 localhost dnsmasq[315721]: exiting on receipt of SIGTERM
Feb 1 04:56:55 localhost systemd[1]: libpod-a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d.scope: Deactivated successfully.
Feb 1 04:56:55 localhost podman[316014]: 2026-02-01 09:56:55.614226388 +0000 UTC m=+0.061242694 container kill a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:56:55 localhost podman[316028]: 2026-02-01 09:56:55.677284387 +0000 UTC m=+0.052839906 container died a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:55 localhost podman[316028]: 2026-02-01 09:56:55.710889411 +0000 UTC m=+0.086444920 container cleanup a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:56:55 localhost systemd[1]: libpod-conmon-a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d.scope: Deactivated successfully.
Feb 1 04:56:55 localhost podman[316031]: 2026-02-01 09:56:55.755355259 +0000 UTC m=+0.119520818 container remove a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:56:56 localhost systemd[1]: var-lib-containers-storage-overlay-d3a8133601d1cc91d96d4027c60af6916855363cd577dda59f3a222e655f22fa-merged.mount: Deactivated successfully.
Feb 1 04:56:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a66dd15b47b5e86a1961927cb1ccd40fb935fb1b45b22da3c6d9b8d777ed362d-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:56 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:56.276 2 INFO neutron.agent.securitygroups_rpc [None req-8fcd2e7b-6ccb-4a1d-8200-e9004b8005a0 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 1 04:56:56 localhost ovn_controller[152492]: 2026-02-01T09:56:56Z|00276|binding|INFO|Removing iface tap77af7c8b-62 ovn-installed in OVS
Feb 1 04:56:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:56.327 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 144e36f5-3ead-47c5-a34d-620e8dacb725 with type ""
Feb 1 04:56:56 localhost ovn_controller[152492]: 2026-02-01T09:56:56Z|00277|binding|INFO|Removing lport 77af7c8b-62eb-4cdb-8b63-33a866118616 ovn-installed in OVS
Feb 1 04:56:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:56.329 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-db1840b6-7d8b-4013-902e-08542d02ed01', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db1840b6-7d8b-4013-902e-08542d02ed01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=55f65e86-8540-4a00-a594-b0afd7111cc6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=77af7c8b-62eb-4cdb-8b63-33a866118616) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:56 localhost nova_compute[274651]: 2026-02-01 09:56:56.365 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:56.367 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 77af7c8b-62eb-4cdb-8b63-33a866118616 in datapath db1840b6-7d8b-4013-902e-08542d02ed01 unbound from our chassis
Feb 1 04:56:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:56.371 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db1840b6-7d8b-4013-902e-08542d02ed01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:56 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:56.372 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[fe69a2fd-12f8-4d1b-93bd-4ac51fad8dd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:56 localhost dnsmasq[315910]: exiting on receipt of SIGTERM
Feb 1 04:56:56 localhost podman[316114]: 2026-02-01 09:56:56.499334121 +0000 UTC m=+0.053803266 container kill 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:56:56 localhost systemd[1]: libpod-3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f.scope: Deactivated successfully.
Feb 1 04:56:56 localhost podman[316134]:
Feb 1 04:56:56 localhost podman[316141]: 2026-02-01 09:56:56.564221158 +0000 UTC m=+0.056436648 container died 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:56 localhost podman[316134]: 2026-02-01 09:56:56.618741324 +0000 UTC m=+0.123674215 container create 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 1 04:56:56 localhost podman[316134]: 2026-02-01 09:56:56.529364305 +0000 UTC m=+0.034297176 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:56:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e156 do_prune osdmap full prune enabled
Feb 1 04:56:56 localhost systemd[1]: Started libpod-conmon-670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96.scope.
Feb 1 04:56:56 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e6c91b917171d705564f87d607b926417e2e12df9466093bf70071fc7760501/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:56 localhost podman[316134]: 2026-02-01 09:56:56.684202017 +0000 UTC m=+0.189134948 container init 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true)
Feb 1 04:56:56 localhost podman[316134]: 2026-02-01 09:56:56.695712072 +0000 UTC m=+0.200644963 container start 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true)
Feb 1 04:56:56 localhost dnsmasq[316179]: started, version 2.85 cachesize 150
Feb 1 04:56:56 localhost dnsmasq[316179]: DNS service limited to local subnets
Feb 1 04:56:56 localhost dnsmasq[316179]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:56:56 localhost dnsmasq[316179]: warning: no upstream servers configured
Feb 1 04:56:56 localhost dnsmasq[316179]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:56:56 localhost podman[316141]: 2026-02-01 09:56:56.70183952 +0000 UTC m=+0.194055000 container cleanup 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:56:56 localhost systemd[1]: libpod-conmon-3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f.scope: Deactivated successfully.
Feb 1 04:56:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 e157: 6 total, 6 up, 6 in
Feb 1 04:56:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in
Feb 1 04:56:56 localhost podman[316152]: 2026-02-01 09:56:56.730420419 +0000 UTC m=+0.209727151 container remove 3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db1840b6-7d8b-4013-902e-08542d02ed01, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:56:56 localhost nova_compute[274651]: 2026-02-01 09:56:56.744 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:56 localhost kernel: device tap77af7c8b-62 left promiscuous mode
Feb 1 04:56:56 localhost nova_compute[274651]: 2026-02-01 09:56:56.757 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:56.954 259320 INFO neutron.agent.dhcp.agent [None req-c774440c-d7cc-43d7-be46-53b11cf3e620 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', 'fcb21783-999a-439a-8247-7e417db55a21'} is completed
Feb 1 04:56:57 localhost dnsmasq[316179]: exiting on receipt of SIGTERM
Feb 1 04:56:57 localhost podman[316197]: 2026-02-01 09:56:57.021186822 +0000 UTC m=+0.061750050 container kill 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:56:57 localhost systemd[1]: libpod-670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96.scope: Deactivated successfully.
Feb 1 04:56:57 localhost podman[316212]: 2026-02-01 09:56:57.102980578 +0000 UTC m=+0.064168735 container died 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:57 localhost podman[316212]: 2026-02-01 09:56:57.128203664 +0000 UTC m=+0.089391801 container cleanup 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 1 04:56:57 localhost systemd[1]: libpod-conmon-670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96.scope: Deactivated successfully.
Feb 1 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay-6e6c91b917171d705564f87d607b926417e2e12df9466093bf70071fc7760501-merged.mount: Deactivated successfully.
Feb 1 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay-51549f37e92e24acbb890150e6a7fb318192987c43c49ef54d88f7e0d48dfcd3-merged.mount: Deactivated successfully.
Feb 1 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3336d8b57e33d7b03504d498cf2f0bea11ac2b20f5a9263b31243f5688e10f2f-userdata-shm.mount: Deactivated successfully.
Feb 1 04:56:57 localhost systemd[1]: run-netns-qdhcp\x2ddb1840b6\x2d7d8b\x2d4013\x2d902e\x2d08542d02ed01.mount: Deactivated successfully.
Feb 1 04:56:57 localhost podman[316213]: 2026-02-01 09:56:57.175399455 +0000 UTC m=+0.130819204 container remove 670921b41c3a9ea96b0a3709a6f00761635402ab5c0631f8a7f536b4df39ed96 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:56:57 localhost ovn_controller[152492]: 2026-02-01T09:56:57Z|00278|binding|INFO|Releasing lport fcb21783-999a-439a-8247-7e417db55a21 from this chassis (sb_readonly=0)
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.187 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:57 localhost ovn_controller[152492]: 2026-02-01T09:56:57Z|00279|binding|INFO|Setting lport fcb21783-999a-439a-8247-7e417db55a21 down in Southbound
Feb 1 04:56:57 localhost kernel: device tapfcb21783-99 left promiscuous mode
Feb 1 04:56:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:57.202 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fef1:c1db/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '12', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fcb21783-999a-439a-8247-7e417db55a21) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:57.204 158365 INFO neutron.agent.ovn.metadata.agent [-] Port fcb21783-999a-439a-8247-7e417db55a21 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis
Feb 1 04:56:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:57.207 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:57.208 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d828b4bb-be4f-442d-b06d-678fca1ce289]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.214 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:57 localhost ovn_controller[152492]: 2026-02-01T09:56:57Z|00280|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.272 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.274 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.301 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.301 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.302 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.302 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.303 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:56:57 localhost neutron_sriov_agent[252126]: 2026-02-01 09:56:57.422 2 INFO neutron.agent.securitygroups_rpc [None req-1f1c195c-f9d6-4ec8-8caa-a0e049b01499 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 1 04:56:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:57.508 259320 INFO neutron.agent.dhcp.agent [None req-95135ad7-31fc-43b3-80f7-a4e33f52d8f2 - - - - - -] Synchronizing state
Feb 1 04:56:57 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully.
Feb 1 04:56:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:57.753 259320 INFO neutron.agent.dhcp.agent [None req-4f285f19-d81f-4177-8326-d5fe199bd2a6 - - - - - -] All active networks have been fetched through RPC.
Feb 1 04:56:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:57.768 259320 INFO neutron.agent.dhcp.agent [-] Starting network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration
Feb 1 04:56:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:56:57 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2735579090' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.802 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.869 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:56:57 localhost nova_compute[274651]: 2026-02-01 09:56:57.869 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.030 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.031 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11294MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.031 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.031 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.148 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.148 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.148 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.214 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.449 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:56:58 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2392180758' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.668 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.675 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.704 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.736 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.737 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:56:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:56:58.875 259320 INFO neutron.agent.linux.ip_lib [None req-9b893808-9b7b-4fc6-969c-eccf3feaba6a - - - - - -] Device tap2f67d5f3-95 cannot be used as it has no MAC address
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.898 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:58 localhost kernel: device tap2f67d5f3-95 entered promiscuous mode
Feb 1 04:56:58 localhost NetworkManager[5964]: [1769939818.9041] manager: (tap2f67d5f3-95): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.905 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:58 localhost systemd-udevd[316293]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.910 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.939 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost journal[217584]: ethtool ioctl error on tap2f67d5f3-95: No such device
Feb 1 04:56:58 localhost nova_compute[274651]: 2026-02-01 09:56:58.978 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:59 localhost nova_compute[274651]: 2026-02-01 09:56:59.001 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:59 localhost nova_compute[274651]: 2026-02-01 09:56:59.191 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:56:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:59.334 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:56:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:59.335 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 1 04:56:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:59.338 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:56:59 localhost ovn_metadata_agent[158360]: 2026-02-01 09:56:59.339 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d53fc46b-6528-41df-90b6-285ea8010659]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:56:59 localhost podman[316365]:
Feb 1 04:56:59 localhost podman[316365]: 2026-02-01 09:56:59.916739341 +0000 UTC m=+0.075239736 container create a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 1 04:56:59 localhost systemd[1]: Started libpod-conmon-a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c.scope.
Feb 1 04:56:59 localhost systemd[1]: tmp-crun.oYwXIB.mount: Deactivated successfully.
Feb 1 04:56:59 localhost systemd[1]: Started libcrun container.
Feb 1 04:56:59 localhost podman[316365]: 2026-02-01 09:56:59.886529332 +0000 UTC m=+0.045029727 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:56:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be713bba55afd08ccb83b436f5d0c7bde0410a05fa7322ce635ba5cd4c4cf3e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:56:59 localhost podman[316365]: 2026-02-01 09:56:59.995598537 +0000 UTC m=+0.154098902 container init a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:00 localhost podman[316365]: 2026-02-01 09:57:00.002441887 +0000 UTC m=+0.160942272 container start a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:00 localhost dnsmasq[316383]: started, version 2.85 cachesize 150
Feb 1 04:57:00 localhost dnsmasq[316383]: DNS service limited to local subnets
Feb 1 04:57:00 localhost dnsmasq[316383]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:00 localhost dnsmasq[316383]: warning: no upstream servers configured
Feb 1 04:57:00 localhost dnsmasq-dhcp[316383]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:57:00 localhost dnsmasq[316383]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:00 localhost dnsmasq-dhcp[316383]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:00 localhost dnsmasq-dhcp[316383]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.081 259320 INFO neutron.agent.dhcp.agent [None req-6b3217a6-6d4d-46f0-9dcb-9d98f76dbfc6 - - - - - -] Finished network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.082 259320 INFO neutron.agent.dhcp.agent [None req-6969c447-4f7c-4cd5-80d0-3aa9e9bbb992 - - - - - -] Synchronizing state complete
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.084 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:00 localhost nova_compute[274651]: 2026-02-01 09:57:00.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:00 localhost kernel: device tap2f67d5f3-95 left promiscuous mode
Feb 1 04:57:00 localhost nova_compute[274651]: 2026-02-01 09:57:00.162 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.237 259320 INFO neutron.agent.dhcp.agent [None req-74e27dc0-027a-412b-b91a-3dd146d7091c - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:00 localhost dnsmasq[316383]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:00 localhost dnsmasq-dhcp[316383]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:00 localhost dnsmasq-dhcp[316383]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:00 localhost podman[316403]: 2026-02-01 09:57:00.55034842 +0000 UTC m=+0.059575114 container kill a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent [None req-63be9693-2eb5-40c6-989f-c4bb14c960d5 - - - - - -] Unable to reload_allocations dhcp for cba39058-6a05-4f77-add1-57334b728a66.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap2f67d5f3-95 not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66.
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap2f67d5f3-95 not found in namespace qdhcp-cba39058-6a05-4f77-add1-57334b728a66.
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.572 259320 ERROR neutron.agent.dhcp.agent
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.576 259320 INFO neutron.agent.dhcp.agent [None req-6969c447-4f7c-4cd5-80d0-3aa9e9bbb992 - - - - - -] Synchronizing state
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.713 259320 INFO neutron.agent.dhcp.agent [None req-60de6401-3019-439d-885c-f97cd0e0b8d5 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', '2f67d5f3-95dd-4865-838d-bf7ebbc38c8a'} is completed
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.779 259320 INFO neutron.agent.dhcp.agent [None req-53878570-0935-4a4b-96ac-356b649e2207 - - - - - -] All active networks have been fetched through RPC.
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.780 259320 INFO neutron.agent.dhcp.agent [-] Starting network 8016a484-f830-421e-ae22-a065d97b13c5 dhcp configuration
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.781 259320 INFO neutron.agent.dhcp.agent [-] Finished network 8016a484-f830-421e-ae22-a065d97b13c5 dhcp configuration
Feb 1 04:57:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:00.781 259320 INFO neutron.agent.dhcp.agent [-] Starting network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration
Feb 1 04:57:00 localhost dnsmasq[316383]: exiting on receipt of SIGTERM
Feb 1 04:57:00 localhost systemd[1]: libpod-a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c.scope: Deactivated successfully.
Feb 1 04:57:00 localhost podman[316434]: 2026-02-01 09:57:00.959414091 +0000 UTC m=+0.065929319 container kill a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:01 localhost podman[316447]: 2026-02-01 09:57:01.035330986 +0000 UTC m=+0.065676471 container died a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 1 04:57:01 localhost systemd[1]: tmp-crun.4ijXs8.mount: Deactivated successfully.
Feb 1 04:57:01 localhost podman[316447]: 2026-02-01 09:57:01.065933788 +0000 UTC m=+0.096279213 container cleanup a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:01 localhost systemd[1]: libpod-conmon-a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c.scope: Deactivated successfully.
Feb 1 04:57:01 localhost podman[316454]: 2026-02-01 09:57:01.10926944 +0000 UTC m=+0.124969175 container remove a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:57:01 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:01.159 259320 INFO neutron.agent.linux.ip_lib [-] Device tap2f67d5f3-95 cannot be used as it has no MAC address
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.227 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:01 localhost kernel: device tap2f67d5f3-95 entered promiscuous mode
Feb 1 04:57:01 localhost NetworkManager[5964]: [1769939821.2345] manager: (tap2f67d5f3-95): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Feb 1 04:57:01 localhost systemd-udevd[316295]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.236 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.245 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.267 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.303 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.331 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:01 localhost openstack_network_exporter[239441]: ERROR 09:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:57:01 localhost openstack_network_exporter[239441]:
Feb 1 04:57:01 localhost openstack_network_exporter[239441]: ERROR 09:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:57:01 localhost openstack_network_exporter[239441]:
Feb 1 04:57:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:57:01 localhost nova_compute[274651]: 2026-02-01 09:57:01.732 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:57:01 localhost systemd[1]: var-lib-containers-storage-overlay-2be713bba55afd08ccb83b436f5d0c7bde0410a05fa7322ce635ba5cd4c4cf3e-merged.mount: Deactivated successfully.
Feb 1 04:57:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2a7812c150070534a00c6bc3cee5a4ef39834ffaa5cd589a58cf327d700202c-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:01 localhost podman[316511]: 2026-02-01 09:57:01.976522445 +0000 UTC m=+0.076832535 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:57:01 localhost podman[316511]: 2026-02-01 09:57:01.987337376 +0000 UTC m=+0.087647496 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 04:57:01 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 04:57:02 localhost podman[316555]:
Feb 1 04:57:02 localhost podman[316555]: 2026-02-01 09:57:02.188295688 +0000 UTC m=+0.089171744 container create 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:57:02 localhost systemd[1]: Started libpod-conmon-3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21.scope.
Feb 1 04:57:02 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1811bfc5729af1a7bb6c757b70c889f29bcd905d6c38663ae238820a3ad313be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:02 localhost podman[316555]: 2026-02-01 09:57:02.144348257 +0000 UTC m=+0.045224373 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:02 localhost podman[316555]: 2026-02-01 09:57:02.25729264 +0000 UTC m=+0.158168696 container init 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:57:02 localhost podman[316555]: 2026-02-01 09:57:02.266226335 +0000 UTC m=+0.167102401 container start 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:02 localhost dnsmasq[316573]: started, version 2.85 cachesize 150 Feb 1 04:57:02 localhost dnsmasq[316573]: DNS service limited to local subnets Feb 1 04:57:02 localhost dnsmasq[316573]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:02 localhost dnsmasq[316573]: warning: no upstream servers configured Feb 1 04:57:02 localhost dnsmasq-dhcp[316573]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:57:02 localhost dnsmasq[316573]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:02 localhost dnsmasq-dhcp[316573]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:02 localhost dnsmasq-dhcp[316573]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:02.328 259320 INFO neutron.agent.dhcp.agent [None req-f0d6e29d-e659-4071-ac29-f6b0bc7c3546 - - - - - -] Finished network cba39058-6a05-4f77-add1-57334b728a66 dhcp configuration#033[00m Feb 1 04:57:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:02.332 259320 INFO neutron.agent.dhcp.agent [None req-53878570-0935-4a4b-96ac-356b649e2207 - - - - - -] Synchronizing state complete#033[00m Feb 1 04:57:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:02.552 259320 INFO neutron.agent.dhcp.agent [None req-33dc8532-d401-41df-9d18-306e4902b261 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', '2f67d5f3-95dd-4865-838d-bf7ebbc38c8a'} is completed#033[00m Feb 1 04:57:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:02.702 259320 INFO neutron.agent.dhcp.agent [None 
Feb 1 04:57:03 localhost ovn_controller[152492]: 2026-02-01T09:57:03Z|00281|binding|INFO|Claiming lport 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a for this chassis.
Feb 1 04:57:03 localhost ovn_controller[152492]: 2026-02-01T09:57:03Z|00282|binding|INFO|2f67d5f3-95dd-4865-838d-bf7ebbc38c8a: Claiming unknown
Feb 1 04:57:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:03.140 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fedc:fd9f/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2f67d5f3-95dd-4865-838d-bf7ebbc38c8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:57:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:03.142 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m
Feb 1 04:57:03 localhost ovn_controller[152492]: 2026-02-01T09:57:03Z|00283|binding|INFO|Setting lport 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a ovn-installed in OVS
Feb 1 04:57:03 localhost ovn_controller[152492]: 2026-02-01T09:57:03Z|00284|binding|INFO|Setting lport 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a up in Southbound
Feb 1 04:57:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:03.146 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1d165f4e-0c12-4039-b954-b99cd6d753f9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 1 04:57:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:03.147 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:57:03 localhost nova_compute[274651]: 2026-02-01 09:57:03.147 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:03.148 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[07323805-087d-4d08-9b44-8cb394e27f25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
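The "_get_port_ips" debug line above reflects OVN semantics rather than an error: the Port_Binding mac column holds strings of the form "<mac> <ip> [<ip>...]", and the literal value "unknown" (an any-address port) carries no IPs to extract. A sketch of that extraction under those assumptions (not the agent's actual implementation):

    def port_ips(mac_column):
        # Each entry looks like "fa:16:3e:.. 10.0.0.5 fd00::5"; "unknown" or a
        # bare MAC contributes no IP addresses.
        ips = []
        for entry in mac_column:
            parts = entry.split()
            ips.extend(parts[1:])
        return ips

    print(port_ips(["unknown"]))                                   # []
    print(port_ips(["fa:16:3e:dc:fd:9f 10.100.0.2 2001:db8::1"]))  # ['10.100.0.2', '2001:db8::1']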
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.529 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.530 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.534 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a51a121-1fad-45c3-82a7-88655327e660', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.530597', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c0fbd54-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '86708876ae15cd455b762b5b51a821c848d1ca3b7ba779183e76600a7c10ebb3'}]}, 'timestamp': '2026-02-01 09:57:03.535979', '_unique_id': 'b294132b1c224150bb44c8de0057d873'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.537 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.539 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.539 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e635662d-ae05-49f5-a57a-67da96ee04bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.539586', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c1063ee-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '1eadef5dd345db60fc9decbbbf985bc0ec8e4ee5190b55ca8335833299601b2f'}]}, 'timestamp': '2026-02-01 09:57:03.540313', '_unique_id': '994bd4b5292644c3b56b9509f53036e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.541 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.542 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
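The samples being dropped alternate between two counter types: "cumulative" carries the raw, monotonically growing counter (e.g. the 114 outgoing packets), while "delta" carries the change since the previous poll, which is why network.outgoing.bytes.delta reports 0 on a quiet interface. A toy illustration of that bookkeeping (not ceilometer's code):

    class DeltaTracker:
        """Turns cumulative readings into per-interval deltas, per resource."""
        def __init__(self):
            self._last = {}

        def delta(self, resource_id: str, cumulative: int) -> int:
            prev = self._last.get(resource_id, cumulative)
            self._last[resource_id] = cumulative
            return cumulative - prev

    t = DeltaTracker()
    print(t.delta("tap09cac1be-46", 114))  # first poll seeds the baseline -> 0
    print(t.delta("tap09cac1be-46", 114))  # counter unchanged -> delta stays 0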
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f3db651-1b2e-4d44-acc3-63bd501e5357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.542487', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c10cece-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': 'd7604ede4eced8697ec7513cf0b30a5eb66924ff326fbd6409ea617dbb20dc36'}]}, 'timestamp': '2026-02-01 09:57:03.542818', '_unique_id': '5955f74560284aa09682734a25375840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.543 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.544 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.576 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.577 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
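Every sample in these payloads also carries a message_signature so the collector can reject tampered metering data. The general shape is an HMAC over the canonicalized sample using a shared metering secret (a sketch under that assumption; ceilometer's exact canonicalization and secret handling are not shown here):

    import hashlib
    import hmac
    import json

    def sign_sample(sample: dict, secret: bytes) -> str:
        # Sign everything except the signature field itself, in a stable order.
        body = {k: v for k, v in sample.items() if k != "message_signature"}
        canonical = json.dumps(body, sort_keys=True, default=str)
        return hmac.new(secret, canonical.encode(), hashlib.sha256).hexdigest()

    sample = {"counter_name": "disk.device.write.requests", "counter_volume": 47}
    print(sign_sample(sample, b"metering-secret"))  # 64-char hex digest, as above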
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bacd0d8-46a7-48e6-b85b-e22acb6e2a21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.544355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c16135c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '66951d4fa0262b1aed44abe672ee251cf8650b5597b8cf6e995680ba5ee28e31'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.544355', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c1626ee-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '34e9217a1d1b76cad3b18716316bb8ea246e0482b83b9ce32413df059ae6e121'}]}, 'timestamp': '2026-02-01 09:57:03.577905', '_unique_id': '135daf64694b4435918694c4ed123360'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.579 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.580 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
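The per-device figures above (35560448 bytes read on vda, 2154496 on vdb) come from libvirt's block statistics for the running domain, polled once per cycle and emitted as one sample per disk. Roughly, under the assumption of a local libvirt at qemu:///system with the instance from this log running (a sketch; requires libvirt-python):

    import libvirt

    conn = libvirt.openReadOnly("qemu:///system")
    dom = conn.lookupByName("instance-00000002")
    for disk in ("vda", "vdb"):
        # blockStats returns (rd_req, rd_bytes, wr_req, wr_bytes, errs).
        rd_req, rd_bytes, wr_req, wr_bytes, errs = dom.blockStats(disk)
        print(f"disk.device.read.bytes {dom.UUIDString()}-{disk}: {rd_bytes}")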
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.580 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e524e3d7-1319-4fa9-9316-e49fa09182c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.580735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c16a966-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': 'fd93a2ed80b20f3ab3cb8ce0e18cc83613b9a1db63f5f5e18cd746ac441f7e16'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.580735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c16baaa-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': 'c45d18f972007d0b118cdca02fbbb4d70ddf60e8540d881b47490e73c13efce1'}]}, 'timestamp': '2026-02-01 09:57:03.581670', '_unique_id': '3c1bd2cc0b22485e917d4abaf8c5e792'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.582 12 ERROR oslo_messaging.notify.messaging
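[Errno 111] is ECONNREFUSED: the TCP SYN reached the broker host, but nothing was accepting on the port, which points at a stopped or still-starting rabbitmq rather than a network filter (a filtered port would typically time out instead). A quick stdlib-only check, with host and port as placeholder values:

```python
import errno
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP listener answers on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        if exc.errno == errno.ECONNREFUSED:  # errno 111, as in these records
            print(f"{host}:{port}: connection refused (no listener)")
        else:
            print(f"{host}:{port}: {exc}")
        return False

port_open("controller-0", 5672)  # placeholder broker host
```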
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.596 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.597 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dd5224d-0f95-4890-965b-150c3afef71c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.584261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c192182-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.803728884, 'message_signature': '90809ac1b5be3526992a01e7c670ed0badb2a8b8ed01a08fe74181fb608b45f7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.584261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c193a1e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.803728884, 'message_signature': 'f939afb0fb72ae4363dbe8357303526d461ab5be3f6f78765965fb66022f452e'}]}, 'timestamp': '2026-02-01 09:57:03.598167', '_unique_id': 'f9f5d64d2bc7433b84693ce4f500124f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.600 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.601 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.601 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b56f4044-390f-4f22-9b6f-4b0ab3e59955', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.600954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c19c4ac-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '8020a8ef1e7c54b0a9db05e5b908515f850864c8513d915cef10759a2289bf7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.600954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c19e248-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '930510f9524518340f026c0be79d3f4425339cde44807bdce844af7855c2fb1f'}]}, 'timestamp': '2026-02-01 09:57:03.602482', '_unique_id': 'aeff167a1c534bd1b4fd0cd19fd45fef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
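Each failed send logs the full notification as a Python dict literal after Payload=, so the dropped samples can still be recovered from the log for offline analysis with ast.literal_eval. A sketch with a hypothetical helper, where record is one complete "Could not send notification" syslog record:

```python
import ast

def dropped_samples(record: str):
    """Yield (counter_name, resource_id, volume, unit) from one failed-send record."""
    start = record.index("Payload=") + len("Payload=")
    end = record.rindex("}: kombu.exceptions.OperationalError") + 1
    payload = ast.literal_eval(record[start:end])
    for sample in payload["payload"]["samples"]:
        yield (sample["counter_name"], sample["resource_id"],
               sample["counter_volume"], sample["counter_unit"])
```

For the disk.device.read.requests record above this would yield 1272 requests for the vda resource and 124 for vdb.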
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.606 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5232e80f-729c-45e4-b803-a4065c449064', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.606550', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c1a9e36-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': 'b35324375bb4ed22877e43a42c532b18e9069a3a46496fad521bb577e29aab73'}]}, 'timestamp': '2026-02-01 09:57:03.607385', '_unique_id': 'c876357155884afba50e835200fa7e65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:57:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.610 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.611 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '83578324-1dc8-478a-bb87-164389107d99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.611044', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c1b4dfe-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '2b23b69392d33c393703d9453f407a389ead4ac4965a4b82098b4fb5d50753f2'}]}, 'timestamp': '2026-02-01 09:57:03.611838', '_unique_id': '648fe00589f941abb8976d6b15ccf288'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.613 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.615 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.615 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.616 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.618 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f4be4685-253e-46c2-892d-d17e189da133', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.615705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c1c0578-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '893010c072adbe544e55b1cad8c7b1a8892a4e7d0db777f529cb4ca8d8239216'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.615705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c1c213e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '5adaa91c6f4f20735ddd618c5f96e10fa33bcf7742a3162bfad85b63a520c942'}]}, 'timestamp': '2026-02-01 09:57:03.617267', '_unique_id': 'cd33f1693f1b46bf97a7de777e799999'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
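Every failure in this run ends in the same root exception: amqp/transport.py reaches self.sock.connect(sa) and gets ECONNREFUSED, meaning nothing is listening on the broker endpoint the agent was given. A minimal sketch of that socket-level check (host and port are illustrative; 5672 is RabbitMQ's default AMQP port, and the real endpoint comes from the agent's configured transport_url):

    import errno
    import socket

    def check_amqp_endpoint(host="localhost", port=5672, timeout=5.0):
        """Return True if something accepts TCP connections on host:port."""
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(timeout)
        try:
            # The same call that raises in amqp/transport.py _connect: sock.connect(sa)
            s.connect((host, port))
            return True
        except OSError as exc:
            if exc.errno == errno.ECONNREFUSED:  # [Errno 111] Connection refused
                print(f"{host}:{port} refused the connection (no listener)")
            return False
        finally:
            s.close()

    check_amqp_endpoint()

A refused connection (as opposed to a timeout) points at the broker process being down or not bound on that port, rather than at a network path problem.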
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.624 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '2f10c332-f10b-4b29-b55d-a6e1ff0f31bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.621119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c1cd782-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.803728884, 'message_signature': 'b73b7cc44447b168f19cd0fea9fb7642d1f5527bff6b82346769adc4c0ccea2c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.621119', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c1cf348-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.803728884, 'message_signature': '7ba4e613d2527784616471138aca55a38da56042fe1fcd7baa4f56ae40eb980b'}]}, 'timestamp': '2026-02-01 09:57:03.622552', '_unique_id': 'd98af7b89a154e8ab00b7cc57233c0f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
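One layer up from the socket, the traceback's kombu frames show ensure_connection() driving retry_over_time(), and _reraise_as_library_errors converting the underlying ConnectionRefusedError into kombu.exceptions.OperationalError, which is exactly the exception logged on each record here. A minimal sketch of that path against an illustrative broker URL (the real URL is the service's configured transport_url):

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Placeholder URL; substitute the actual rabbit:// / amqp:// transport_url.
    conn = Connection("amqp://guest:guest@localhost:5672//")

    def on_retry(exc, interval):
        print(f"broker unreachable ({exc}); retrying in {interval}s")

    try:
        # ensure_connection() retries the connect via retry_over_time(); once
        # max_retries is exhausted, the low-level OS error is re-raised as
        # kombu.exceptions.OperationalError.
        conn.ensure_connection(errback=on_retry, max_retries=3)
        print("connected")
    except OperationalError as exc:
        print(f"gave up: {exc}")  # e.g. [Errno 111] Connection refused
    finally:
        conn.release()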
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.626 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.628 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'c1f6e5b5-945f-4b58-9615-8c61083f1c94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.626701', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c1daf86-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '61413677ef24185989ca6fd5dd79fe73700dded60a2a6f7011acbe95782df725'}]}, 'timestamp': '2026-02-01 09:57:03.627408', '_unique_id': 'ad0b6eab1842481ab6c5ecc6c0f05d02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
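The outermost frames in these tracebacks are oslo.messaging's notification path: the agent publishes telemetry.polling events with priority SAMPLE to the "notifications" topic, and the "Could not send notification" line is logged when the send ultimately fails. A sketch of that emitting side, under the assumption that the stock oslo.messaging notifier API is in use (the transport URL below is a placeholder):

    from oslo_config import cfg
    import oslo_messaging

    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@localhost:5672/")  # placeholder URL
    notifier = oslo_messaging.Notifier(
        transport,
        publisher_id="ceilometer.polling",  # matches publisher_id in the payloads
        driver="messaging",
        topics=["notifications"],           # the topic named in the error records
        retry=2,                            # bounded retries fail fast; -1 retries forever
    )
    # priority 'SAMPLE' in the payloads corresponds to the .sample() helper
    notifier.sample({}, "telemetry.polling", {"samples": []})

With a bounded retry setting the notification is dropped after the retries are exhausted, which is why each polled sample here produces one error record and the agent moves on to the next pollster.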
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'ae20ef2f-7d8b-4be7-b407-a19e425522c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.630120', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c1e3208-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '54dafd996260a51f1d3e7efb2b4c0c5dc02f279acdbe875a9b3dc2fdf8fff280'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.630120', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c1e45e0-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': 'a3ad97047c546d2305db7d770c147492c7f693d8a2fd6b91b486de6e96170bb8'}]}, 'timestamp': '2026-02-01 09:57:03.631166', '_unique_id': 'b97a9e45c4d54cafa035eb1b088b1bc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
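The Payload= blob in each of these records is a Python dict repr (note the single quotes and None), not JSON, so json.loads will reject it; ast.literal_eval recovers it safely. A small helper for pulling the dropped samples out of the log, assuming each record has been read as one line (the helper name is illustrative):

    import ast

    def parse_payload(record: str) -> dict:
        """Extract the Payload= dict from one 'Could not send notification' record."""
        start = record.index("Payload=") + len("Payload=")
        # The dict repr ends at the '}' just before the appended exception text.
        end = record.rindex("}: kombu.exceptions") + 1
        return ast.literal_eval(record[start:end])

    # Example usage against the disk.device.write.bytes record above:
    # payload = parse_payload(line)
    # for sample in payload["payload"]["samples"]:
    #     print(sample["counter_name"], sample["resource_id"], sample["counter_volume"])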
line 826, in __init__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.633 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
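Every failure in this excerpt follows the chain shown once above: the ceilometer notifier hands a sample batch to oslo.messaging, the rabbit driver asks kombu for a broker connection, and the TCP connect to the AMQP endpoint is refused, so kombu re-raises the socket error as OperationalError. A minimal sketch of that bottom layer, assuming kombu is installed; the broker URL, port, and retry/timeout values below are hypothetical, since the log does not show the transport_url this agent was configured with:

    # Reproduce how kombu surfaces a socket-level ECONNREFUSED as the
    # kombu.exceptions.OperationalError recorded above. Hypothetical URL:
    # nothing is assumed to be listening on 127.0.0.1:5672.
    from kombu import Connection
    from kombu.exceptions import OperationalError

    conn = Connection("amqp://guest:guest@127.0.0.1:5672//", connect_timeout=1)
    try:
        # ensure_connection() drives retry_over_time() and then, in
        # _reraise_as_library_errors (kombu/connection.py line 450 in the
        # traceback above), re-raises ConnectionRefusedError as OperationalError.
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print("broker unreachable:", exc)  # [Errno 111] Connection refused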
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.633 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afeaa11d-a7db-40d0-9107-3550d03c9112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:57:03.633770', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5c2094bc-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.865059579, 'message_signature': '108bb55f6d6e15c1012f8c3dc15fb1475b80f6ed1cca053b1ba7fdfefd810e36'}]}, 'timestamp': '2026-02-01 09:57:03.646205', '_unique_id': 'de3961d2e92e43e093862c736ab24f02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[same chained kombu/oslo.messaging traceback as above, ending: kombu.exceptions.OperationalError: [Errno 111] Connection refused]
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.647 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.647 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.648 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '387fb21f-4c1c-4325-ad3b-1d39133f5d0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.647843', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c20e35e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '14ce255b612d30504817c1f7310c707e3ec85176c4b31f4c5dcedb7dcf1a51ac'}]}, 'timestamp': '2026-02-01 09:57:03.648200', '_unique_id': '345383649fec4a58b8641a124e607ec2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[same chained kombu/oslo.messaging traceback as above, ending: kombu.exceptions.OperationalError: [Errno 111] Connection refused]
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.649 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.649 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b04a8cf-12ae-49f4-a541-d447f6773834', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.649741', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c212cce-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '9ae812b009eeb9f3d14ff29425727948fd8d3fadc9aa5e81ca216829ae2d40f0'}]}, 'timestamp': '2026-02-01 09:57:03.650107', '_unique_id': '232688efef7c426c9c37e99ee27658fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[same chained kombu/oslo.messaging traceback as above, ending: kombu.exceptions.OperationalError: [Errno 111] Connection refused]
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.651 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 16240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4844080a-9fcf-482d-8940-54d50e204b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16240000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:57:03.651559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5c217468-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.865059579, 'message_signature': '6bebd43398c0066be7062b678c7a784f4cb0edd251f23d77608686da01ac5f14'}]}, 'timestamp': '2026-02-01 09:57:03.651905', '_unique_id': 'dd80929203b14c68bc5a20fd4f2e0578'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[same chained kombu/oslo.messaging traceback as above, ending: kombu.exceptions.OperationalError: [Errno 111] Connection refused]
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.653 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3812906-3419-4580-8edf-18ea7f69dc2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.653367', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c21ba04-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': '88db6685a6e858c843259435dc4c4a2314f2cd0b96ed639473c5a88bb66dbd55'}]}, 'timestamp': '2026-02-01 09:57:03.653692', '_unique_id': 'ccdfd07d43fb415ab1c5b6f26629c184'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.654 12 ERROR oslo_messaging.notify.messaging Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.655 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.655 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.655 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error 
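The chained traceback above shows the whole failure path: the ceilometer notifier asks oslo.messaging for a pooled RabbitMQ connection, kombu tries to open the AMQP socket, the TCP connect is refused (errno 111, nothing listening on the broker port), and kombu's _reraise_as_library_errors context manager re-raises the socket-level ConnectionRefusedError as kombu.exceptions.OperationalError, which is why the log prints both exceptions joined by "direct cause". A minimal reproduction sketch, assuming only that kombu is installed and that no broker is listening at the hypothetical URL below:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Same call path as impl_rabbit.py -> kombu/connection.py in the traceback.
    conn = Connection('amqp://guest:guest@localhost:5672//')  # hypothetical broker URL
    try:
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        # kombu wraps the underlying ConnectionRefusedError in its own
        # OperationalError; str(exc) is "[Errno 111] Connection refused".
        print('broker unreachable:', exc)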
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.655 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.655 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.655 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4284305b-a779-4703-a3be-e6317b6c8a52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:57:03.655279', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '5c2204b4-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.750058182, 'message_signature': 'c3956fc67204f96059570220c9502101c104ef3d4e67dcea05ea51ed09e39a0c'}]}, 'timestamp': '2026-02-01 09:57:03.655619', '_unique_id': '71fe757196a14e519c436c16484ec9e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.656 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.657 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.657 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
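The disk.device.read.latency samples above are cumulative nanosecond counters ('counter_type': 'cumulative', 'counter_unit': 'ns'), one per disk (vda, vdb), so a downstream consumer derives a per-interval figure from the difference between two successive polls. A small illustration with hypothetical values (the earlier reading and the 300 s interval are made up; only the later value comes from the log):

    # Cumulative counter -> per-interval rate, as a telemetry consumer would compute it.
    prev_ns, prev_t = 1_484_000_000, 0.0    # hypothetical earlier poll
    curr_ns, curr_t = 1_484_399_740, 300.0  # value from the vda sample, hypothetical time
    delta_ns = curr_ns - prev_ns
    rate = delta_ns / (curr_t - prev_t)
    print(f'{delta_ns} ns of read latency accumulated ({rate:.0f} ns/s)')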
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52ae03e0-c540-40bc-96ea-76771990145c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.657177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c224f64-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '9d95d7465b44c00bb887630541d5c9867d24f3abe0de1f643d2a6d214c4ae971'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.657177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c225c7a-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.763780695, 'message_signature': '67109c69af8ab60d407f31893fe917e524b2240f90c09ec365fbcf43d3c4113d'}]}, 'timestamp': '2026-02-01 09:57:03.657867', '_unique_id': '1bb2806ef9314d0a887d5c77bf56cfd8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.658 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.659 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.659 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f5ee8bd-ae3e-4fb0-aae8-83fe05e9c19d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:57:03.659404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5c22a568-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.803728884, 'message_signature': '18ec631f4f101dcbf225cafa0a9f661f52191037f409aeba8b86eee7eb066d01'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:57:03.659404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5c22b12a-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 11917.803728884, 'message_signature': '204cfffcb0165979f481cf0a09bf82548f816a69e05ebc34ec41ab1932c0a62f'}]}, 'timestamp': '2026-02-01 09:57:03.660014', '_unique_id': '1fdc70034b284be9913a24cec803fe54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.660 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:57:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:57:03.661 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:57:03 localhost nova_compute[274651]: 2026-02-01 09:57:03.938 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:04.580 259320 INFO neutron.agent.linux.ip_lib [None req-5d2c52bc-d983-40e1-b96d-846445d405c1 - - - - - -] Device tapf2bbc65f-d6 cannot be used as it has no MAC address#033[00m
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.602 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost kernel: device tapf2bbc65f-d6 entered promiscuous mode
Feb 1 04:57:04 localhost NetworkManager[5964]: [1769939824.6103] manager: (tapf2bbc65f-d6): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
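The DHCP agent above skips tapf2bbc65f-d6 because the interface reported no MAC address at the moment it was inspected (the kernel and NetworkManager lines show the tap device was only just being created). One way to check what link-layer address such a device currently reports is a short pyroute2 query; this is a sketch assuming pyroute2 is available on the host, not part of the agent itself:

    from pyroute2 import IPRoute

    # Look up the tap device the DHCP agent complained about and print its
    # link-layer address; an empty/None value would explain the skip.
    with IPRoute() as ipr:
        idx = ipr.link_lookup(ifname='tapf2bbc65f-d6')
        if idx:
            link = ipr.get_links(idx[0])[0]
            print(link.get_attr('IFLA_ADDRESS'))
        else:
            print('device not found')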
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.611 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost systemd-udevd[316584]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:57:04 localhost ovn_controller[152492]: 2026-02-01T09:57:04Z|00285|binding|INFO|Claiming lport f2bbc65f-d614-4e37-a2e4-1e6029beea9b for this chassis.
Feb 1 04:57:04 localhost ovn_controller[152492]: 2026-02-01T09:57:04Z|00286|binding|INFO|f2bbc65f-d614-4e37-a2e4-1e6029beea9b: Claiming unknown
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.637 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.640 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.643 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost journal[217584]: ethtool ioctl error on tapf2bbc65f-d6: No such device
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost nova_compute[274651]: 2026-02-01 09:57:04.695 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:04.922 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:57:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:04.924 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated#033[00m
Feb 1 04:57:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:04.925 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1d165f4e-0c12-4039-b954-b99cd6d753f9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 1 04:57:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:04.925 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:57:04 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:04.926 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[1591290d-a97b-4652-994f-e6d948c008a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:57:05 localhost nova_compute[274651]: 2026-02-01 09:57:05.146 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:05 localhost ovn_controller[152492]: 2026-02-01T09:57:05Z|00287|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:57:05 localhost nova_compute[274651]: 2026-02-01 09:57:05.205 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:57:05 localhost podman[316655]:
Feb 1 04:57:05 localhost podman[316655]: 2026-02-01 09:57:05.40550189 +0000 UTC m=+0.071761008 container create c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:57:05 localhost systemd[1]: Started libpod-conmon-c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f.scope. Feb 1 04:57:05 localhost systemd[1]: tmp-crun.ezOJ1V.mount: Deactivated successfully. Feb 1 04:57:05 localhost systemd[1]: Started libcrun container. Feb 1 04:57:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6af3b67b80a4122259873a946955bf6cf35432ddf8fb3656b2545d4967de17da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:05 localhost podman[316686]: 2026-02-01 09:57:05.465590478 +0000 UTC m=+0.048570655 container kill 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:57:05 localhost dnsmasq[316573]: exiting on receipt of SIGTERM Feb 1 04:57:05 localhost systemd[1]: libpod-3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21.scope: Deactivated successfully. Feb 1 04:57:05 localhost podman[316655]: 2026-02-01 09:57:05.367914444 +0000 UTC m=+0.034173592 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:05 localhost podman[316655]: 2026-02-01 09:57:05.473377188 +0000 UTC m=+0.139636306 container init c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 1 04:57:05 localhost dnsmasq[316724]: started, version 2.85 cachesize 150 Feb 1 04:57:05 localhost dnsmasq[316724]: DNS service limited to local subnets Feb 1 04:57:05 localhost dnsmasq[316724]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:05 localhost dnsmasq[316724]: warning: no upstream servers configured Feb 1 04:57:05 localhost dnsmasq-dhcp[316724]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:57:05 localhost dnsmasq[316724]: read /var/lib/neutron/dhcp/ad1cdfd5-1209-48be-9652-9d20559ffba8/addn_hosts - 0 addresses Feb 1 04:57:05 localhost dnsmasq-dhcp[316724]: read /var/lib/neutron/dhcp/ad1cdfd5-1209-48be-9652-9d20559ffba8/host Feb 1 04:57:05 localhost dnsmasq-dhcp[316724]: read /var/lib/neutron/dhcp/ad1cdfd5-1209-48be-9652-9d20559ffba8/opts Feb 1 04:57:05 localhost podman[316695]: 2026-02-01 09:57:05.508353233 +0000 UTC m=+0.071746127 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 1 04:57:05 localhost podman[316715]: 2026-02-01 09:57:05.525042306 +0000 UTC m=+0.046199111 container died 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:05 localhost podman[316655]: 2026-02-01 09:57:05.531376272 +0000 UTC m=+0.197635380 container start c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:57:05 localhost systemd[1]: tmp-crun.gAos7Q.mount: Deactivated successfully. 
Feb 1 04:57:05 localhost podman[316695]: 2026-02-01 09:57:05.53850326 +0000 UTC m=+0.101896144 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:05 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:57:05 localhost podman[316715]: 2026-02-01 09:57:05.556659869 +0000 UTC m=+0.077816644 container cleanup 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:05 localhost systemd[1]: libpod-conmon-3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21.scope: Deactivated successfully.
Feb 1 04:57:05 localhost podman[316717]: 2026-02-01 09:57:05.586028902 +0000 UTC m=+0.106297540 container remove 3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 1 04:57:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:05.661 259320 INFO neutron.agent.dhcp.agent [None req-5806868e-a46a-4f5c-841f-c91ddc61f362 - - - - - -] DHCP configuration for ports {'7fe59047-e48f-46ae-b29d-a0bb797693bd'} is completed
Feb 1 04:57:05 localhost dnsmasq[316724]: exiting on receipt of SIGTERM
Feb 1 04:57:05 localhost podman[316779]: 2026-02-01 09:57:05.727781573 +0000 UTC m=+0.051250248 container kill c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:57:05 localhost systemd[1]: libpod-c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f.scope: Deactivated successfully.
Feb 1 04:57:05 localhost podman[316796]: 2026-02-01 09:57:05.776690276 +0000 UTC m=+0.037895206 container died c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:05 localhost podman[316796]: 2026-02-01 09:57:05.803313646 +0000 UTC m=+0.064518566 container cleanup c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:05 localhost systemd[1]: libpod-conmon-c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f.scope: Deactivated successfully.
Feb 1 04:57:05 localhost podman[316799]: 2026-02-01 09:57:05.85450004 +0000 UTC m=+0.110612564 container remove c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad1cdfd5-1209-48be-9652-9d20559ffba8, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:57:05 localhost nova_compute[274651]: 2026-02-01 09:57:05.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:05 localhost kernel: device tapf2bbc65f-d6 left promiscuous mode
Feb 1 04:57:05 localhost nova_compute[274651]: 2026-02-01 09:57:05.878 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:05.897 259320 INFO neutron.agent.dhcp.agent [None req-dea5ebc8-b966-4311-8b69-6b6fd53579de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:05.897 259320 INFO neutron.agent.dhcp.agent [None req-dea5ebc8-b966-4311-8b69-6b6fd53579de - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:06 localhost podman[316865]:
Feb 1 04:57:06 localhost podman[316865]: 2026-02-01 09:57:06.427544285 +0000 UTC m=+0.085646465 container create e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 1 04:57:06 localhost systemd[1]: Started libpod-conmon-e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73.scope.
Feb 1 04:57:06 localhost podman[316865]: 2026-02-01 09:57:06.378111475 +0000 UTC m=+0.036213675 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:06 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04548a25015d65c338516fae48d284986c6b9f44d5b63e418a09999490c0afd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:06 localhost systemd[1]: var-lib-containers-storage-overlay-6af3b67b80a4122259873a946955bf6cf35432ddf8fb3656b2545d4967de17da-merged.mount: Deactivated successfully.
Feb 1 04:57:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0b393b5e3c0e6b8195c78cd1543a00021aed42e2236bed14e2ae38989216c6f-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:06 localhost systemd[1]: run-netns-qdhcp\x2dad1cdfd5\x2d1209\x2d48be\x2d9652\x2d9d20559ffba8.mount: Deactivated successfully.
Feb 1 04:57:06 localhost systemd[1]: var-lib-containers-storage-overlay-1811bfc5729af1a7bb6c757b70c889f29bcd905d6c38663ae238820a3ad313be-merged.mount: Deactivated successfully.
Feb 1 04:57:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d0b166d0edf74631d02e9c5bc341c93baf617ef2f507f520f2f1ad6d66f8b21-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:06 localhost podman[316865]: 2026-02-01 09:57:06.511469696 +0000 UTC m=+0.169572916 container init e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:57:06 localhost podman[316865]: 2026-02-01 09:57:06.520896696 +0000 UTC m=+0.178998886 container start e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 1 04:57:06 localhost dnsmasq[316883]: started, version 2.85 cachesize 150
Feb 1 04:57:06 localhost dnsmasq[316883]: DNS service limited to local subnets
Feb 1 04:57:06 localhost dnsmasq[316883]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:06 localhost dnsmasq[316883]: warning: no upstream servers configured
Feb 1 04:57:06 localhost dnsmasq-dhcp[316883]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:57:06 localhost dnsmasq-dhcp[316883]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:06 localhost dnsmasq[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:06 localhost dnsmasq-dhcp[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:06 localhost dnsmasq-dhcp[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:57:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:06.892 259320 INFO neutron.agent.dhcp.agent [None req-35ab0a4f-1e94-4e06-bc13-fef7275e7718 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873', '2f67d5f3-95dd-4865-838d-bf7ebbc38c8a'} is completed
Feb 1 04:57:08 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:08.339 2 INFO neutron.agent.securitygroups_rpc [None req-6b01ca21-428c-44eb-a29a-b5d48a46bb1b e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:08.416 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:06Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=8f511f5c-0aed-488d-871d-06d2dc7c2ba5, ip_allocation=immediate, mac_address=fa:16:3e:39:99:3b, name=tempest-NetworksTestDHCPv6-1008125554, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=35, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['7bd4468b-57bc-4647-9070-17924847ddc9', 'c367123e-d8d3-442a-816d-ef65387c8fc7'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:00Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2129, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:07Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:57:08 localhost dnsmasq[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses
Feb 1 04:57:08 localhost dnsmasq-dhcp[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:08 localhost podman[316903]: 2026-02-01 09:57:08.653960243 +0000 UTC m=+0.066822516 container kill e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:08 localhost dnsmasq-dhcp[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:57:08 localhost systemd[1]: tmp-crun.2xcUEf.mount: Deactivated successfully.
Feb 1 04:57:08 localhost podman[316924]: 2026-02-01 09:57:08.847544617 +0000 UTC m=+0.067409064 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 04:57:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:08.870 259320 INFO neutron.agent.linux.ip_lib [None req-0b1cdde8-190b-4a46-b142-62ab027f2745 - - - - - -] Device tapcbfdf896-d6 cannot be used as it has no MAC address
Feb 1 04:57:08 localhost podman[316924]: 2026-02-01 09:57:08.87720272 +0000 UTC m=+0.097067157 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 04:57:08 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:57:08 localhost nova_compute[274651]: 2026-02-01 09:57:08.892 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:08 localhost kernel: device tapcbfdf896-d6 entered promiscuous mode
Feb 1 04:57:08 localhost nova_compute[274651]: 2026-02-01 09:57:08.898 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:08 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:08.901 2 INFO neutron.agent.securitygroups_rpc [None req-6ad3677b-8ed2-4afa-a1c1-27a49bcc11f3 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:57:08 localhost systemd-udevd[316959]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:57:08 localhost ovn_controller[152492]: 2026-02-01T09:57:08Z|00288|binding|INFO|Claiming lport cbfdf896-d66b-473d-a0d8-f2c256abe816 for this chassis.
Feb 1 04:57:08 localhost ovn_controller[152492]: 2026-02-01T09:57:08Z|00289|binding|INFO|cbfdf896-d66b-473d-a0d8-f2c256abe816: Claiming unknown
Feb 1 04:57:08 localhost NetworkManager[5964]: [1769939828.9032] manager: (tapcbfdf896-d6): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Feb 1 04:57:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:08.909 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ad3eb133-e8d2-4ba1-82e0-dae831f99e26', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad3eb133-e8d2-4ba1-82e0-dae831f99e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19692741-aa13-4e1b-9d0e-c2e6051c7c2c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cbfdf896-d66b-473d-a0d8-f2c256abe816) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:08.910 158365 INFO neutron.agent.ovn.metadata.agent [-] Port cbfdf896-d66b-473d-a0d8-f2c256abe816 in datapath ad3eb133-e8d2-4ba1-82e0-dae831f99e26 bound to our chassis
Feb 1 04:57:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:08.912 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ad3eb133-e8d2-4ba1-82e0-dae831f99e26 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:57:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:08.912 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[19ea621e-d0a7-4275-b558-75470e0f556f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost ovn_controller[152492]: 2026-02-01T09:57:08Z|00290|binding|INFO|Setting lport cbfdf896-d66b-473d-a0d8-f2c256abe816 ovn-installed in OVS
Feb 1 04:57:08 localhost ovn_controller[152492]: 2026-02-01T09:57:08Z|00291|binding|INFO|Setting lport cbfdf896-d66b-473d-a0d8-f2c256abe816 up in Southbound
Feb 1 04:57:08 localhost nova_compute[274651]: 2026-02-01 09:57:08.929 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:08 localhost nova_compute[274651]: 2026-02-01 09:57:08.931 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost journal[217584]: ethtool ioctl error on tapcbfdf896-d6: No such device
Feb 1 04:57:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:08.961 259320 INFO neutron.agent.dhcp.agent [None req-b527cd84-96c2-43c3-8381-21c9934af325 - - - - - -] DHCP configuration for ports {'8f511f5c-0aed-488d-871d-06d2dc7c2ba5'} is completed
Feb 1 04:57:08 localhost nova_compute[274651]: 2026-02-01 09:57:08.964 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:08 localhost nova_compute[274651]: 2026-02-01 09:57:08.989 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:09 localhost podman[317030]:
Feb 1 04:57:09 localhost podman[317030]: 2026-02-01 09:57:09.860796032 +0000 UTC m=+0.093744003 container create 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 1 04:57:09 localhost systemd[1]: Started libpod-conmon-7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7.scope.
Feb 1 04:57:09 localhost podman[317030]: 2026-02-01 09:57:09.814941222 +0000 UTC m=+0.047889243 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:09 localhost systemd[1]: tmp-crun.pe5ocn.mount: Deactivated successfully.
Feb 1 04:57:09 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fdae5921ed84a5c84dee2c639610714cc5f194edc53be17d3716e9e0b82880d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:09 localhost podman[317030]: 2026-02-01 09:57:09.950178612 +0000 UTC m=+0.183126563 container init 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:57:09 localhost podman[317030]: 2026-02-01 09:57:09.958073235 +0000 UTC m=+0.191021206 container start 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:09 localhost dnsmasq[317048]: started, version 2.85 cachesize 150
Feb 1 04:57:09 localhost dnsmasq[317048]: DNS service limited to local subnets
Feb 1 04:57:09 localhost dnsmasq[317048]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:09 localhost dnsmasq[317048]: warning: no upstream servers configured
Feb 1 04:57:09 localhost dnsmasq-dhcp[317048]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:09 localhost dnsmasq[317048]: read /var/lib/neutron/dhcp/ad3eb133-e8d2-4ba1-82e0-dae831f99e26/addn_hosts - 0 addresses
Feb 1 04:57:09 localhost dnsmasq-dhcp[317048]: read /var/lib/neutron/dhcp/ad3eb133-e8d2-4ba1-82e0-dae831f99e26/host
Feb 1 04:57:09 localhost dnsmasq-dhcp[317048]: read /var/lib/neutron/dhcp/ad3eb133-e8d2-4ba1-82e0-dae831f99e26/opts
Feb 1 04:57:10 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:10.120 259320 INFO neutron.agent.dhcp.agent [None req-514aa096-710d-4c27-83e4-471fe19e33ed - - - - - -] DHCP configuration for ports {'af285375-23e2-4b8f-a160-5eefedf18a7b'} is completed
Feb 1 04:57:10 localhost nova_compute[274651]: 2026-02-01 09:57:10.182 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:10 localhost nova_compute[274651]: 2026-02-01 09:57:10.187 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:10 localhost dnsmasq[317048]: read /var/lib/neutron/dhcp/ad3eb133-e8d2-4ba1-82e0-dae831f99e26/addn_hosts - 0 addresses
Feb 1 04:57:10 localhost dnsmasq-dhcp[317048]: read /var/lib/neutron/dhcp/ad3eb133-e8d2-4ba1-82e0-dae831f99e26/host
Feb 1 04:57:10 localhost podman[317066]: 2026-02-01 09:57:10.28011589 +0000 UTC m=+0.057376576 container kill 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:57:10 localhost dnsmasq-dhcp[317048]: read /var/lib/neutron/dhcp/ad3eb133-e8d2-4ba1-82e0-dae831f99e26/opts
Feb 1 04:57:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:57:10 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:10.668 259320 INFO neutron.agent.dhcp.agent [None req-47797ab5-a163-4995-bbe7-84c908b3cb73 - - - - - -] DHCP configuration for ports {'cbfdf896-d66b-473d-a0d8-f2c256abe816', 'af285375-23e2-4b8f-a160-5eefedf18a7b'} is completed
Feb 1 04:57:10 localhost podman[317087]: 2026-02-01 09:57:10.722198697 +0000 UTC m=+0.079507947 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:10 localhost podman[317087]: 2026-02-01 09:57:10.777433015 +0000 UTC m=+0.134742265 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:10 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:57:10 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:10.895 2 INFO neutron.agent.securitygroups_rpc [None req-dadc567d-76ba-47fb-b0ef-4dff55f1d7c5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:10 localhost nova_compute[274651]: 2026-02-01 09:57:10.958 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost ovn_controller[152492]: 2026-02-01T09:57:11Z|00292|binding|INFO|Removing iface tapcbfdf896-d6 ovn-installed in OVS
Feb 1 04:57:11 localhost ovn_controller[152492]: 2026-02-01T09:57:11Z|00293|binding|INFO|Removing lport cbfdf896-d66b-473d-a0d8-f2c256abe816 ovn-installed in OVS
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.102 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f33da026-8a7e-48f7-8a91-bcd5dda5cd5d with type ""
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.103 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ad3eb133-e8d2-4ba1-82e0-dae831f99e26', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ad3eb133-e8d2-4ba1-82e0-dae831f99e26', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19692741-aa13-4e1b-9d0e-c2e6051c7c2c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cbfdf896-d66b-473d-a0d8-f2c256abe816) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.103 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.104 158365 INFO neutron.agent.ovn.metadata.agent [-] Port cbfdf896-d66b-473d-a0d8-f2c256abe816 in datapath ad3eb133-e8d2-4ba1-82e0-dae831f99e26 unbound from our chassis
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.106 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ad3eb133-e8d2-4ba1-82e0-dae831f99e26, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.106 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.106 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[9826571a-127c-43ba-9140-6d39bce47dc0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:11 localhost dnsmasq[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:11 localhost dnsmasq-dhcp[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:11 localhost podman[317129]: 2026-02-01 09:57:11.120450866 +0000 UTC m=+0.058605113 container kill e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 1 04:57:11 localhost dnsmasq-dhcp[316883]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:11 localhost dnsmasq[317048]: exiting on receipt of SIGTERM
Feb 1 04:57:11 localhost systemd[1]: tmp-crun.WUaJrg.mount: Deactivated successfully.
Feb 1 04:57:11 localhost systemd[1]: libpod-7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7.scope: Deactivated successfully.
Feb 1 04:57:11 localhost podman[317164]: 2026-02-01 09:57:11.219617406 +0000 UTC m=+0.052000220 container kill 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:57:11 localhost podman[317193]: 2026-02-01 09:57:11.274861375 +0000 UTC m=+0.032505391 container died 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:11 localhost podman[317193]: 2026-02-01 09:57:11.308156269 +0000 UTC m=+0.065800326 container remove 7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ad3eb133-e8d2-4ba1-82e0-dae831f99e26, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:57:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:11.362 259320 INFO neutron.agent.linux.ip_lib [None req-c899f7a7-eb0e-49c1-80e3-0bf5c73a2932 - - - - - -] Device tap288c17ac-b3 cannot be used as it has no MAC address
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.361 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost kernel: device tapcbfdf896-d6 left promiscuous mode
Feb 1 04:57:11 localhost systemd[1]: libpod-conmon-7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7.scope: Deactivated successfully.
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.376 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.381 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost kernel: device tap288c17ac-b3 entered promiscuous mode
Feb 1 04:57:11 localhost ovn_controller[152492]: 2026-02-01T09:57:11Z|00294|binding|INFO|Claiming lport 288c17ac-b338-4af9-ac41-876c3d023b44 for this chassis.
Feb 1 04:57:11 localhost ovn_controller[152492]: 2026-02-01T09:57:11Z|00295|binding|INFO|288c17ac-b338-4af9-ac41-876c3d023b44: Claiming unknown
Feb 1 04:57:11 localhost NetworkManager[5964]: [1769939831.3873] manager: (tap288c17ac-b3): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.386 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.404 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-08bfb2e4-5a51-4360-ac22-07885efe5081', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08bfb2e4-5a51-4360-ac22-07885efe5081', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec2f419434374ceeb2aabac212e109be', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=523cc458-81fe-4a78-863d-237e1d9df4ca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=288c17ac-b338-4af9-ac41-876c3d023b44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.405 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 288c17ac-b338-4af9-ac41-876c3d023b44 in datapath 08bfb2e4-5a51-4360-ac22-07885efe5081 bound to our chassis
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.406 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 08bfb2e4-5a51-4360-ac22-07885efe5081 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:57:11 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:11.407 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e5ca64-cca3-416c-8301-830517e48104]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.417 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost ovn_controller[152492]: 2026-02-01T09:57:11Z|00296|binding|INFO|Setting lport 288c17ac-b338-4af9-ac41-876c3d023b44 ovn-installed in OVS
Feb 1 04:57:11 localhost ovn_controller[152492]: 2026-02-01T09:57:11Z|00297|binding|INFO|Setting lport 288c17ac-b338-4af9-ac41-876c3d023b44 up in Southbound
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.419 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost journal[217584]: ethtool ioctl error on tap288c17ac-b3: No such device
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.439 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost nova_compute[274651]: 2026-02-01 09:57:11.466 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:57:11 localhost systemd[1]: var-lib-containers-storage-overlay-6fdae5921ed84a5c84dee2c639610714cc5f194edc53be17d3716e9e0b82880d-merged.mount: Deactivated successfully.
Feb 1 04:57:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7630bc22ba63c33b35774e3ddd1512f67c1f0f92ebc0f5e5494fff9852d93ea7-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:11 localhost systemd[1]: run-netns-qdhcp\x2dad3eb133\x2de8d2\x2d4ba1\x2d82e0\x2ddae831f99e26.mount: Deactivated successfully.
Feb 1 04:57:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:12.278 2 INFO neutron.agent.securitygroups_rpc [None req-3e23b3f8-adbb-495f-8475-dfff7cdaa65d 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:57:12 localhost podman[317288]:
Feb 1 04:57:12 localhost podman[317288]: 2026-02-01 09:57:12.393624585 +0000 UTC m=+0.092582468 container create 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:12 localhost systemd[1]: Started libpod-conmon-9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98.scope.
Feb 1 04:57:12 localhost podman[317288]: 2026-02-01 09:57:12.350384146 +0000 UTC m=+0.049342069 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:12 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a29a66e499b9a9f1ba44c8450968ce79a0e8ea5fa2d064e4641ee5fd676c87ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:12 localhost podman[317288]: 2026-02-01 09:57:12.469128957 +0000 UTC m=+0.168086861 container init 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:12 localhost podman[317288]: 2026-02-01 09:57:12.476463143 +0000 UTC m=+0.175421056 container start 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:12 localhost dnsmasq[317307]: started, version 2.85 cachesize 150 Feb 1 04:57:12 localhost dnsmasq[317307]: DNS service limited to local subnets Feb 1 04:57:12 localhost dnsmasq[317307]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:12 localhost dnsmasq[317307]: warning: no upstream servers configured Feb 1 04:57:12 localhost dnsmasq-dhcp[317307]: DHCP, static leases only on 10.100.0.16, lease time 1d Feb 1 04:57:12 localhost dnsmasq[317307]: read /var/lib/neutron/dhcp/08bfb2e4-5a51-4360-ac22-07885efe5081/addn_hosts - 0 addresses Feb 1 04:57:12 localhost dnsmasq-dhcp[317307]: read /var/lib/neutron/dhcp/08bfb2e4-5a51-4360-ac22-07885efe5081/host Feb 1 04:57:12 localhost dnsmasq-dhcp[317307]: read /var/lib/neutron/dhcp/08bfb2e4-5a51-4360-ac22-07885efe5081/opts Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.544 259320 INFO neutron.agent.dhcp.agent [None req-53878570-0935-4a4b-96ac-356b649e2207 - - - - - -] Synchronizing state#033[00m Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.690 259320 INFO neutron.agent.dhcp.agent [None req-ccc2beb0-3263-4e50-bbb3-fd9ea9326bbc - - - - - -] DHCP configuration for ports {'316957c0-31d2-4619-b751-8ea3bd2d17d8'} is completed#033[00m Feb 1 04:57:12 localhost ovn_controller[152492]: 2026-02-01T09:57:12Z|00298|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.825 259320 INFO neutron.agent.dhcp.agent [None req-47580ec0-0079-4dad-8479-09f34f96e55e - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.827 259320 INFO neutron.agent.dhcp.agent [-] Starting network 8016a484-f830-421e-ae22-a065d97b13c5 dhcp configuration#033[00m Feb 1 04:57:12 
localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.827 259320 INFO neutron.agent.dhcp.agent [-] Finished network 8016a484-f830-421e-ae22-a065d97b13c5 dhcp configuration#033[00m Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.828 259320 INFO neutron.agent.dhcp.agent [-] Starting network ad3eb133-e8d2-4ba1-82e0-dae831f99e26 dhcp configuration#033[00m Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.828 259320 INFO neutron.agent.dhcp.agent [-] Finished network ad3eb133-e8d2-4ba1-82e0-dae831f99e26 dhcp configuration#033[00m Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.829 259320 INFO neutron.agent.dhcp.agent [None req-47580ec0-0079-4dad-8479-09f34f96e55e - - - - - -] Synchronizing state complete#033[00m Feb 1 04:57:12 localhost nova_compute[274651]: 2026-02-01 09:57:12.846 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:12.956 259320 INFO neutron.agent.dhcp.agent [None req-92b29d68-15d2-48f4-b580-9021cf4890a7 - - - - - -] DHCP configuration for ports {'af285375-23e2-4b8f-a160-5eefedf18a7b'} is completed#033[00m Feb 1 04:57:13 localhost dnsmasq[316883]: exiting on receipt of SIGTERM Feb 1 04:57:13 localhost podman[317325]: 2026-02-01 09:57:13.139057972 +0000 UTC m=+0.041302580 container kill e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:13 localhost systemd[1]: libpod-e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73.scope: Deactivated successfully. Feb 1 04:57:13 localhost podman[317339]: 2026-02-01 09:57:13.18611107 +0000 UTC m=+0.038159945 container died e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:57:13 localhost systemd[1]: tmp-crun.yTHwGx.mount: Deactivated successfully. 
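The dnsmasq[317307] block above is Neutron's per-network DHCP server: the agent writes the host/addn_hosts/opts files that dnsmasq reports reading, then launches dnsmasq inside the qdhcp container. A hedged reconstruction of the essential flags that would produce those startup lines (standard dnsmasq options; the agent's exact command line differs):

    import subprocess

    net = "08bfb2e4-5a51-4360-ac22-07885efe5081"
    base = "/var/lib/neutron/dhcp/" + net
    cmd = [
        "dnsmasq",
        "--no-hosts",
        "--no-resolv",                        # -> "no upstream servers configured"
        "--dhcp-hostsfile=%s/host" % base,    # per-port leases -> "static leases only"
        "--addn-hosts=%s/addn_hosts" % base,  # extra DNS names ("0 addresses" above)
        "--dhcp-optsfile=%s/opts" % base,
        "--dhcp-range=10.100.0.16,static,86400s",  # lease time 1d, as logged
    ]
    subprocess.run(cmd, check=True)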
Feb 1 04:57:13 localhost podman[317339]: 2026-02-01 09:57:13.227961567 +0000 UTC m=+0.080010442 container cleanup e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:13 localhost systemd[1]: libpod-conmon-e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73.scope: Deactivated successfully. Feb 1 04:57:13 localhost podman[317341]: 2026-02-01 09:57:13.274583201 +0000 UTC m=+0.119118955 container remove e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:57:13 localhost nova_compute[274651]: 2026-02-01 09:57:13.287 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:13 localhost ovn_controller[152492]: 2026-02-01T09:57:13Z|00299|binding|INFO|Releasing lport 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a from this chassis (sb_readonly=0) Feb 1 04:57:13 localhost ovn_controller[152492]: 2026-02-01T09:57:13Z|00300|binding|INFO|Setting lport 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a down in Southbound Feb 1 04:57:13 localhost kernel: device tap2f67d5f3-95 left promiscuous mode Feb 1 04:57:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:13.304 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fedc:fd9f/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': ''}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2f67d5f3-95dd-4865-838d-bf7ebbc38c8a) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:13.309 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 2f67d5f3-95dd-4865-838d-bf7ebbc38c8a in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:57:13 localhost nova_compute[274651]: 2026-02-01 09:57:13.311 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:13 localhost nova_compute[274651]: 2026-02-01 09:57:13.313 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:13.314 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:13 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:13.315 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[6c339ffd-4a3d-4adf-827e-e05a71956f56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:13.771 259320 INFO neutron.agent.dhcp.agent [None req-1f11cc9c-ff55-4a21-a610-2766039afa8b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:13.826 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:14 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:14.012 2 INFO neutron.agent.securitygroups_rpc [None req-f8a158f2-98c4-421b-88cc-f32123223f2c d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']#033[00m Feb 1 04:57:14 localhost systemd[1]: var-lib-containers-storage-overlay-04548a25015d65c338516fae48d284986c6b9f44d5b63e418a09999490c0afd3-merged.mount: Deactivated successfully. Feb 1 04:57:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9ca88217e505d1f4c544ce2407b629530e37716d369116acf68aeb9d8ea5d73-userdata-shm.mount: Deactivated successfully. Feb 1 04:57:14 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. 
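The mount units named run-netns-qdhcp\x2d....mount are systemd's escaped form of the namespace bind-mount paths under /run/netns: "/" becomes "-" and a literal "-" becomes "\x2d" (see systemd-escape(1)). A simplified sketch of the path escaping; the real rule also escapes other non-alphanumerics:

    def systemd_escape_path(path):
        trimmed = path.strip("/")
        # escape literal "-" before mapping "/" to "-"
        escaped = trimmed.replace("\\", "\\x5c").replace("-", "\\x2d")
        return escaped.replace("/", "-")

    ns = "/run/netns/qdhcp-cba39058-6a05-4f77-add1-57334b728a66"
    print(systemd_escape_path(ns) + ".mount")
    # run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount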
Feb 1 04:57:14 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:14.284 2 INFO neutron.agent.securitygroups_rpc [None req-f8a158f2-98c4-421b-88cc-f32123223f2c d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']#033[00m Feb 1 04:57:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:15.139 2 INFO neutron.agent.securitygroups_rpc [None req-49f91d64-6065-4341-b2b5-79ecb7af7da0 d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']#033[00m Feb 1 04:57:15 localhost nova_compute[274651]: 2026-02-01 09:57:15.209 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:15 localhost nova_compute[274651]: 2026-02-01 09:57:15.213 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:15 localhost dnsmasq[317307]: exiting on receipt of SIGTERM Feb 1 04:57:15 localhost podman[317386]: 2026-02-01 09:57:15.543102604 +0000 UTC m=+0.052474914 container kill 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:15 localhost systemd[1]: libpod-9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98.scope: Deactivated successfully. Feb 1 04:57:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:15.577 2 INFO neutron.agent.securitygroups_rpc [None req-55fa4d7c-6084-454f-b0ae-c51e2ec0c52d d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']#033[00m Feb 1 04:57:15 localhost podman[317399]: 2026-02-01 09:57:15.641551943 +0000 UTC m=+0.080966122 container died 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98-userdata-shm.mount: Deactivated successfully. 
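The recurring nova_compute "[POLLIN] on fd 20 __log_wakeup" lines are not errors: python-ovs logs every wakeup of the OVSDB IDL poll loop at DEBUG from ovs/poller.py (surfaced here under the ovsdbapp vlog logger). A self-contained sketch that exercises the same code path, assuming the python3-ovs package is installed:

    import select, socket
    from ovs import poller

    a, b = socket.socketpair()
    b.send(b"x")                          # make `a` readable

    p = poller.Poller()
    p.fd_wait(a.fileno(), select.POLLIN)  # register interest, as the IDL does
    p.block()                             # poll(); with DEBUG vlog enabled this
                                          # logs "[POLLIN] on fd N __log_wakeup"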
Feb 1 04:57:15 localhost podman[317399]: 2026-02-01 09:57:15.677112556 +0000 UTC m=+0.116526705 container cleanup 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:15 localhost systemd[1]: libpod-conmon-9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98.scope: Deactivated successfully. Feb 1 04:57:15 localhost podman[317400]: 2026-02-01 09:57:15.721971546 +0000 UTC m=+0.147947022 container remove 9d3638b8e17117a52c35e49367b7fd94a76be42dacd1ab639705f8673cddda98 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-08bfb2e4-5a51-4360-ac22-07885efe5081, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:57:15 localhost nova_compute[274651]: 2026-02-01 09:57:15.735 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:15 localhost ovn_controller[152492]: 2026-02-01T09:57:15Z|00301|binding|INFO|Releasing lport 288c17ac-b338-4af9-ac41-876c3d023b44 from this chassis (sb_readonly=0) Feb 1 04:57:15 localhost ovn_controller[152492]: 2026-02-01T09:57:15Z|00302|binding|INFO|Setting lport 288c17ac-b338-4af9-ac41-876c3d023b44 down in Southbound Feb 1 04:57:15 localhost kernel: device tap288c17ac-b3 left promiscuous mode Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.747 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-08bfb2e4-5a51-4360-ac22-07885efe5081', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08bfb2e4-5a51-4360-ac22-07885efe5081', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec2f419434374ceeb2aabac212e109be', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=523cc458-81fe-4a78-863d-237e1d9df4ca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=288c17ac-b338-4af9-ac41-876c3d023b44) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:15 
localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.750 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 288c17ac-b338-4af9-ac41-876c3d023b44 in datapath 08bfb2e4-5a51-4360-ac22-07885efe5081 unbound from our chassis#033[00m Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.753 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08bfb2e4-5a51-4360-ac22-07885efe5081, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:15 localhost nova_compute[274651]: 2026-02-01 09:57:15.755 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.754 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ff6bb4-9b54-4ee8-840c-3b6f429a6e78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.789 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.791 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated#033[00m Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.794 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:15 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:15.796 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[034c3a45-d0b2-4bba-9cda-87673550c999]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:16.000 259320 INFO neutron.agent.dhcp.agent [None req-13204e56-d339-41ed-a88d-9a3b6f9f96f9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:16.141 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:16 localhost systemd[1]: var-lib-containers-storage-overlay-a29a66e499b9a9f1ba44c8450968ce79a0e8ea5fa2d064e4641ee5fd676c87ac-merged.mount: Deactivated successfully. Feb 1 04:57:16 localhost systemd[1]: run-netns-qdhcp\x2d08bfb2e4\x2d5a51\x2d4360\x2dac22\x2d07885efe5081.mount: Deactivated successfully. Feb 1 04:57:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:57:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:57:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:16.672 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:16 localhost systemd[1]: tmp-crun.bLq0mH.mount: Deactivated successfully. Feb 1 04:57:16 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:16.684 2 INFO neutron.agent.securitygroups_rpc [None req-4b8cba9a-9cba-4ac1-97f8-0546b3fd4da5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:16 localhost podman[317430]: 2026-02-01 09:57:16.685102809 +0000 UTC m=+0.107856248 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, release=1769056855, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9/ubi-minimal, version=9.7, config_id=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 1 04:57:16 localhost podman[317430]: 2026-02-01 09:57:16.721508519 +0000 UTC m=+0.144261948 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9) Feb 1 04:57:16 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:57:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:16.957 259320 INFO neutron.agent.linux.ip_lib [None req-5c47940b-5f86-4ef1-b348-ee848a47cc78 - - - - - -] Device tapf8efd909-68 cannot be used as it has no MAC address#033[00m Feb 1 04:57:16 localhost nova_compute[274651]: 2026-02-01 09:57:16.985 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:16 localhost kernel: device tapf8efd909-68 entered promiscuous mode Feb 1 04:57:16 localhost ovn_controller[152492]: 2026-02-01T09:57:16Z|00303|binding|INFO|Claiming lport f8efd909-6823-4b0a-93a8-226cc0edb4fe for this chassis. Feb 1 04:57:16 localhost ovn_controller[152492]: 2026-02-01T09:57:16Z|00304|binding|INFO|f8efd909-6823-4b0a-93a8-226cc0edb4fe: Claiming unknown Feb 1 04:57:16 localhost NetworkManager[5964]: [1769939836.9982] manager: (tapf8efd909-68): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:16.999 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost ovn_controller[152492]: 2026-02-01T09:57:17Z|00305|binding|INFO|Setting lport f8efd909-6823-4b0a-93a8-226cc0edb4fe ovn-installed in OVS Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:17.002 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost systemd-udevd[317460]: Network interface NamePolicy= disabled on kernel command line. 
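The bracketed number in the NetworkManager lines (e.g. [1769939836.9982]) is a Unix epoch timestamp with fractional seconds; converting it confirms it is the same instant as the 09:57:16 UTC service entries around it (the syslog prefix is local time, UTC-5 here):

    from datetime import datetime, timezone

    ts = 1769939836.9982   # from the NetworkManager line above
    print(datetime.fromtimestamp(ts, tz=timezone.utc))
    # 2026-02-01 09:57:16.998200+00:00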
Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:17.013 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:17.016 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe9b:ccc1/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f8efd909-6823-4b0a-93a8-226cc0edb4fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:17.018 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f8efd909-6823-4b0a-93a8-226cc0edb4fe in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:57:17 localhost ovn_controller[152492]: 2026-02-01T09:57:17Z|00306|binding|INFO|Setting lport f8efd909-6823-4b0a-93a8-226cc0edb4fe up in Southbound Feb 1 04:57:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:17.020 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 970974ec-5a8c-442e-8efa-2ed25cb40689 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:57:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:17.020 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:17.021 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[cf731b6c-a81c-40a9-aaad-5a49debdf32e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:17 localhost ovn_controller[152492]: 2026-02-01T09:57:17Z|00307|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:17.034 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error 
on tapf8efd909-68: No such device Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:17.062 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost journal[217584]: ethtool ioctl error on tapf8efd909-68: No such device Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:17.082 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost nova_compute[274651]: 2026-02-01 09:57:17.114 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:17.352 2 INFO neutron.agent.securitygroups_rpc [None req-0e44897f-42c5-42e0-aa55-3214c5bdaadc e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:17 localhost podman[317531]: Feb 1 04:57:17 localhost podman[317531]: 2026-02-01 09:57:17.945152285 +0000 UTC m=+0.092979601 container create cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 04:57:17 localhost systemd[1]: Started libpod-conmon-cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd.scope. Feb 1 04:57:18 localhost podman[317531]: 2026-02-01 09:57:17.901836412 +0000 UTC m=+0.049663768 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:18 localhost systemd[1]: Started libcrun container. 
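The ovn_metadata_agent "Matched UPDATE: PortBindingUpdatedEvent(...)" lines show ovsdbapp's row-event dispatch (the matches() logging at event.py:43): the agent registers RowEvent subclasses against the southbound Port_Binding table and reacts when a row relevant to its chassis changes. A hedged sketch of the pattern, with illustrative names rather than the agent's exact code:

    from ovsdbapp.backend.ovs_idl import event as row_event

    class PortBindingUpdatedEvent(row_event.RowEvent):
        def __init__(self, chassis_name):
            self.chassis_name = chassis_name
            # fire on UPDATEs to Port_Binding rows
            super().__init__((self.ROW_UPDATE,), 'Port_Binding', None)

        def match_fn(self, event, row, old=None):
            # only ports requested on this chassis are interesting
            return row.options.get('requested-chassis') == self.chassis_name

        def run(self, event, row, old):
            print("port %s changed, up=%s" % (row.logical_port, row.up))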
Feb 1 04:57:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac1fda99389383a4d140b570c763619ace9c0ad942a615c58aca94822b454a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:18 localhost podman[317531]: 2026-02-01 09:57:18.022756292 +0000 UTC m=+0.170583608 container init cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:57:18 localhost dnsmasq[317550]: started, version 2.85 cachesize 150 Feb 1 04:57:18 localhost dnsmasq[317550]: DNS service limited to local subnets Feb 1 04:57:18 localhost dnsmasq[317550]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:18 localhost dnsmasq[317550]: warning: no upstream servers configured Feb 1 04:57:18 localhost dnsmasq[317550]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:18 localhost podman[317531]: 2026-02-01 09:57:18.039173916 +0000 UTC m=+0.187001272 container start cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:57:18 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:18.286 259320 INFO neutron.agent.dhcp.agent [None req-c99115d6-f6ba-48f1-b7d4-a57bfc076c4b - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:57:18 localhost dnsmasq[317550]: exiting on receipt of SIGTERM Feb 1 04:57:18 localhost podman[317568]: 2026-02-01 09:57:18.444023729 +0000 UTC m=+0.063909967 container kill cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:57:18 localhost systemd[1]: libpod-cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd.scope: Deactivated successfully. 
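The kernel's "supports timestamps until 2038 (0x7fffffff)" warning means this overlay's xfs was created without the bigtime feature, so its inode timestamps are 32-bit signed epoch seconds:

    from datetime import datetime, timezone

    limit = 0x7fffffff     # max 32-bit signed time_t, from the kernel message
    print(datetime.fromtimestamp(limit, tz=timezone.utc))
    # 2038-01-19 03:14:07+00:00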
Feb 1 04:57:18 localhost podman[317582]: 2026-02-01 09:57:18.518063566 +0000 UTC m=+0.060193732 container died cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:57:18 localhost podman[317582]: 2026-02-01 09:57:18.54678266 +0000 UTC m=+0.088912766 container cleanup cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:57:18 localhost systemd[1]: libpod-conmon-cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd.scope: Deactivated successfully. Feb 1 04:57:18 localhost podman[317584]: 2026-02-01 09:57:18.591943678 +0000 UTC m=+0.126633735 container remove cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:57:18 localhost nova_compute[274651]: 2026-02-01 09:57:18.604 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:18 localhost kernel: device tapf8efd909-68 left promiscuous mode Feb 1 04:57:18 localhost ovn_controller[152492]: 2026-02-01T09:57:18Z|00308|binding|INFO|Releasing lport f8efd909-6823-4b0a-93a8-226cc0edb4fe from this chassis (sb_readonly=0) Feb 1 04:57:18 localhost ovn_controller[152492]: 2026-02-01T09:57:18Z|00309|binding|INFO|Setting lport f8efd909-6823-4b0a-93a8-226cc0edb4fe down in Southbound Feb 1 04:57:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:18.615 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe9b:ccc1/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f8efd909-6823-4b0a-93a8-226cc0edb4fe) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:18.617 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f8efd909-6823-4b0a-93a8-226cc0edb4fe in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:57:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:18.621 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:18 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:18.622 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[5ebd2acb-e5c8-49f7-8fba-775cc8e6ef52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:18 localhost nova_compute[274651]: 2026-02-01 09:57:18.625 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:18 localhost systemd[1]: var-lib-containers-storage-overlay-dac1fda99389383a4d140b570c763619ace9c0ad942a615c58aca94822b454a1-merged.mount: Deactivated successfully. Feb 1 04:57:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc185b68b0448579100cfb198967feac3b29e123d681f9ac150bcd6ce84c36fd-userdata-shm.mount: Deactivated successfully. Feb 1 04:57:18 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. Feb 1 04:57:18 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:18.981 259320 INFO neutron.agent.dhcp.agent [None req-dacf4858-b136-45db-b9b9-2204529f2ee5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:18 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:18.981 259320 INFO neutron.agent.dhcp.agent [None req-dacf4858-b136-45db-b9b9-2204529f2ee5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:20 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:20.051 2 INFO neutron.agent.securitygroups_rpc [None req-66f0be61-daee-4cdf-a282-a2ed1512143e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.242 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost dnsmasq[313742]: exiting on receipt of SIGTERM Feb 1 04:57:20 localhost systemd[1]: libpod-ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8.scope: Deactivated successfully. 
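Each container above runs in a transient systemd scope named libpod-<full container id>.scope, with a sibling libpod-conmon-<id>.scope for its conmon monitor process; the paired "Deactivated successfully" lines are those scopes ending when dnsmasq exits on SIGTERM. While a container is running, the scope can be inspected directly, e.g.:

    import subprocess

    cid = "ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8"
    subprocess.run(["systemctl", "show", "libpod-%s.scope" % cid,
                    "--property", "ActiveState,SubState"])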
Feb 1 04:57:20 localhost podman[317629]: 2026-02-01 09:57:20.465398251 +0000 UTC m=+0.068501308 container kill ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 1 04:57:20 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:20.494 2 INFO neutron.agent.securitygroups_rpc [None req-a90c741a-39d3-40c8-bb6e-b94dde79eb43 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:20 localhost podman[317644]: 2026-02-01 09:57:20.541836382 +0000 UTC m=+0.064261098 container died ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:20 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:20.549 259320 INFO neutron.agent.linux.ip_lib [None req-a704a608-960c-4f9e-b650-2a01949c31a9 - - - - - -] Device tap59ea6c78-74 cannot be used as it has no MAC address#033[00m Feb 1 04:57:20 localhost systemd[1]: tmp-crun.ofz3fm.mount: Deactivated successfully. Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.582 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost podman[317644]: 2026-02-01 09:57:20.585740471 +0000 UTC m=+0.108165127 container cleanup ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:57:20 localhost kernel: device tap59ea6c78-74 entered promiscuous mode Feb 1 04:57:20 localhost NetworkManager[5964]: [1769939840.5904] manager: (tap59ea6c78-74): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Feb 1 04:57:20 localhost systemd-udevd[317675]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:57:20 localhost systemd[1]: libpod-conmon-ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8.scope: Deactivated successfully. Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00310|binding|INFO|Claiming lport 59ea6c78-74f4-4621-a6ce-1b4cd5328fba for this chassis. 
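Messages like "Device tap59ea6c78-74 cannot be used as it has no MAC address" come from neutron.agent.linux.ip_lib checking a just-created tap device before OVS has finished setting it up; the condition is transient and the agent retries. A rough equivalent of the check (an assumption for illustration, not neutron's exact code):

    def has_usable_mac(dev):
        try:
            with open("/sys/class/net/%s/address" % dev) as f:
                mac = f.read().strip()
        except FileNotFoundError:
            return False
        return mac not in ("", "00:00:00:00:00:00")

    print(has_usable_mac("tap59ea6c78-74"))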
Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.594 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00311|binding|INFO|59ea6c78-74f4-4621-a6ce-1b4cd5328fba: Claiming unknown Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00312|binding|INFO|Setting lport 59ea6c78-74f4-4621-a6ce-1b4cd5328fba ovn-installed in OVS Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00313|binding|INFO|Setting lport 59ea6c78-74f4-4621-a6ce-1b4cd5328fba up in Southbound Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.609 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.607 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe31:f29c/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59ea6c78-74f4-4621-a6ce-1b4cd5328fba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.611 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 59ea6c78-74f4-4621-a6ce-1b4cd5328fba in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.616 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 65668942-5977-4562-a3a9-76812667a999 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.617 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.618 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ade563b9-a4ff-45ec-bd40-bb649187428a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:20 localhost 
nova_compute[274651]: 2026-02-01 09:57:20.624 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost podman[317652]: 2026-02-01 09:57:20.643045365 +0000 UTC m=+0.147811798 container remove ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:20 localhost kernel: device tap4223e357-90 left promiscuous mode Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00314|binding|INFO|Releasing lport 4223e357-90fd-4ec7-96e8-b798510a78c7 from this chassis (sb_readonly=0) Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00315|binding|INFO|Setting lport 4223e357-90fd-4ec7-96e8-b798510a78c7 down in Southbound Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.658 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.671 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-0db0f199-70cd-458e-a4c8-80a105dc0346', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0db0f199-70cd-458e-a4c8-80a105dc0346', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f9a6b351-759b-46ec-8c55-6a201e89959d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4223e357-90fd-4ec7-96e8-b798510a78c7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.673 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 4223e357-90fd-4ec7-96e8-b798510a78c7 in datapath 0db0f199-70cd-458e-a4c8-80a105dc0346 unbound from our chassis#033[00m Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.675 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.675 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is 
no metadata port for network 0db0f199-70cd-458e-a4c8-80a105dc0346 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.677 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.677 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[56164827-9a24-4ea1-9a4b-290bdd5e43c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.708 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:20.763 259320 INFO neutron.agent.linux.ip_lib [None req-71a3bb1c-86ab-4960-8a0b-71f8342829a0 - - - - - -] Device tap3e6a397d-2e cannot be used as it has no MAC address#033[00m Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.793 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:20 localhost kernel: device tap3e6a397d-2e entered promiscuous mode Feb 1 04:57:20 localhost NetworkManager[5964]: [1769939840.8007] manager: (tap3e6a397d-2e): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Feb 1 04:57:20 localhost systemd-udevd[317677]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00316|binding|INFO|Claiming lport 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 for this chassis. 
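The "tearing the namespace down if needed" decisions refer to the per-network metadata namespace, named ovnmeta-<network UUID> (the prefix is visible in the device_id fields above): once no VIF ports remain on a datapath, the agent removes that namespace. Whether it is gone can be checked with ip netns, e.g.:

    import subprocess

    net = "0db0f199-70cd-458e-a4c8-80a105dc0346"
    out = subprocess.run(["ip", "netns", "list"],
                         capture_output=True, text=True)
    print(("ovnmeta-" + net) in out.stdout)   # False once torn down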
Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00317|binding|INFO|3e6a397d-2e8b-43e9-8a44-a0db8b907f56: Claiming unknown
Feb 1 04:57:20 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:20.805 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.807 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.813 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3cb13cb2ee4e4e329cfbfe3e5fc9c8b9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=52fa44b7-34b3-4321-bcfe-6af5d406df00, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3e6a397d-2e8b-43e9-8a44-a0db8b907f56) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.816 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 in datapath cce95ca6-6802-4cd1-9e1a-cfe429ad34ae bound to our chassis
Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.820 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cce95ca6-6802-4cd1-9e1a-cfe429ad34ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:57:20 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:20.822 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[4246afd8-d4f7-49a6-943a-7d0cb6d27fee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00318|binding|INFO|Setting lport 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 up in Southbound
Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.853 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:20 localhost ovn_controller[152492]: 2026-02-01T09:57:20Z|00319|binding|INFO|Setting lport 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 ovn-installed in OVS
Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.859 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.900 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:20 localhost nova_compute[274651]: 2026-02-01 09:57:20.929 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:20 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:20.949 2 INFO neutron.agent.securitygroups_rpc [None req-6d308bdf-c84d-45b6-96f5-9c77d97fcd46 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:57:21 localhost systemd[1]: var-lib-containers-storage-overlay-b60a3af8b60d3caa5566265bf37651aeaedb27ab811906ee57bbf70efa68a03d-merged.mount: Deactivated successfully.
Feb 1 04:57:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce3592b032bdcfe219bf9a8569e29f0aa55b1b557200b08f71214e0421eb47a8-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:21 localhost systemd[1]: run-netns-qdhcp\x2d0db0f199\x2d70cd\x2d458e\x2da4c8\x2d80a105dc0346.mount: Deactivated successfully.
Feb 1 04:57:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:21.443 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:21 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:21.578 2 INFO neutron.agent.securitygroups_rpc [None req-510b41ab-e548-47d9-b4e7-2bdc2eb9aebb 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 1 04:57:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:57:21 localhost podman[317770]:
Feb 1 04:57:21 localhost podman[317770]: 2026-02-01 09:57:21.66023592 +0000 UTC m=+0.048931545 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:21 localhost podman[317770]: 2026-02-01 09:57:21.760481114 +0000 UTC m=+0.149176679 container create bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 04:57:21 localhost systemd[1]: Started libpod-conmon-bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a.scope.
Feb 1 04:57:21 localhost systemd[1]: Started libcrun container.
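The systemd mount units above, such as run-netns-qdhcp\x2d0db0f199\x2d....mount, show systemd's unit-name escaping: '/' in a path is encoded as '-', so a literal '-' (and other special bytes) becomes \xNN. A rough Python approximation of what `systemd-escape` does to one path component; edge cases like a leading '.' are not handled here:

    # Sketch of systemd-style unit-name escaping, approximating the
    # run-netns-qdhcp\x2d... names logged above. Not a full reimplementation.
    def systemd_escape(component: str) -> str:
        out = []
        for ch in component:
            if ch.isalnum() or ch in ":_.":
                out.append(ch)
            elif ch == "/":
                out.append("-")          # '/' is reserved as the path separator
            else:
                out.append("\\x%02x" % ord(ch))  # e.g. '-' -> \x2d
        return "".join(out)

    # "qdhcp-0db0f199-..." -> "qdhcp\x2d0db0f199\x2d..."
    print(systemd_escape("qdhcp-0db0f199-70cd-458e-a4c8-80a105dc0346"))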
Feb 1 04:57:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f25e32af6f521620c81b86697ac99f7c499db470073fa638945fd8029d330691/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:21 localhost podman[317770]: 2026-02-01 09:57:21.82604749 +0000 UTC m=+0.214743125 container init bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 1 04:57:21 localhost podman[317770]: 2026-02-01 09:57:21.835335036 +0000 UTC m=+0.224030641 container start bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:21 localhost dnsmasq[317820]: started, version 2.85 cachesize 150
Feb 1 04:57:21 localhost dnsmasq[317820]: DNS service limited to local subnets
Feb 1 04:57:21 localhost dnsmasq[317820]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:21 localhost dnsmasq[317820]: warning: no upstream servers configured
Feb 1 04:57:21 localhost dnsmasq-dhcp[317820]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:21 localhost dnsmasq[317820]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:21 localhost dnsmasq-dhcp[317820]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:21 localhost dnsmasq-dhcp[317820]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:21 localhost podman[317805]:
Feb 1 04:57:21 localhost podman[317805]: 2026-02-01 09:57:21.857409714 +0000 UTC m=+0.102875484 container create b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:57:21 localhost systemd[1]: Started libpod-conmon-b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd.scope.
Feb 1 04:57:21 localhost systemd[1]: Started libcrun container.
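Each dnsmasq instance above reads its per-network addn_hosts, host and opts files from /var/lib/neutron/dhcp/<network_id>/ and serves static DHCPv6 leases on 2001:db8::. A hedged sketch of an argv that would reproduce this behavior; the flag names are standard dnsmasq options, but whether Neutron's DHCP agent passes exactly this set is an assumption here:

    # Sketch: an invocation roughly equivalent to the dnsmasq seen above.
    net_id = "cba39058-6a05-4f77-add1-57334b728a66"
    conf_dir = "/var/lib/neutron/dhcp/" + net_id
    argv = [
        "dnsmasq",
        "--no-resolv",  # one way to end up with "no upstream servers configured"
        "--addn-hosts=%s/addn_hosts" % conf_dir,
        "--dhcp-hostsfile=%s/host" % conf_dir,
        "--dhcp-optsfile=%s/opts" % conf_dir,
        # "static leases only on 2001:db8::, lease time 1d"
        "--dhcp-range=set:subnet,2001:db8::,static,64,86400s",
    ]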
Feb 1 04:57:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e00762dc3ac73ff22118eaf8c7b8f456105cffee03746e86e1458f9bbd846a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:21 localhost podman[317805]: 2026-02-01 09:57:21.808734498 +0000 UTC m=+0.054200318 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:21 localhost podman[317805]: 2026-02-01 09:57:21.91318641 +0000 UTC m=+0.158652180 container init b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 1 04:57:21 localhost podman[317805]: 2026-02-01 09:57:21.920293729 +0000 UTC m=+0.165759489 container start b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:57:21 localhost dnsmasq[317829]: started, version 2.85 cachesize 150
Feb 1 04:57:21 localhost dnsmasq[317829]: DNS service limited to local subnets
Feb 1 04:57:21 localhost dnsmasq[317829]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:21 localhost dnsmasq[317829]: warning: no upstream servers configured
Feb 1 04:57:21 localhost dnsmasq-dhcp[317829]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:21 localhost dnsmasq[317829]: read /var/lib/neutron/dhcp/cce95ca6-6802-4cd1-9e1a-cfe429ad34ae/addn_hosts - 0 addresses
Feb 1 04:57:21 localhost dnsmasq-dhcp[317829]: read /var/lib/neutron/dhcp/cce95ca6-6802-4cd1-9e1a-cfe429ad34ae/host
Feb 1 04:57:21 localhost dnsmasq-dhcp[317829]: read /var/lib/neutron/dhcp/cce95ca6-6802-4cd1-9e1a-cfe429ad34ae/opts
Feb 1 04:57:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:21.987 259320 INFO neutron.agent.dhcp.agent [None req-1c6e8c9e-6add-4ab5-a34f-66fbbfd3c54f - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:22 localhost ovn_controller[152492]: 2026-02-01T09:57:22Z|00320|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:57:22 localhost dnsmasq[317820]: exiting on receipt of SIGTERM
Feb 1 04:57:22 localhost podman[317847]: 2026-02-01 09:57:22.16317211 +0000 UTC m=+0.060503683 container kill bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 1 04:57:22 localhost systemd[1]: libpod-bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a.scope: Deactivated successfully.
Feb 1 04:57:22 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:22.168 259320 INFO neutron.agent.dhcp.agent [None req-1ff6e665-c6c7-433a-869d-97565c2852b8 - - - - - -] DHCP configuration for ports {'e215e890-4cfd-4a85-97af-6ffb0061420f'} is completed
Feb 1 04:57:22 localhost nova_compute[274651]: 2026-02-01 09:57:22.173 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:22 localhost podman[317858]: 2026-02-01 09:57:22.251090453 +0000 UTC m=+0.072224102 container died bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:57:22 localhost podman[317858]: 2026-02-01 09:57:22.28444947 +0000 UTC m=+0.105583069 container cleanup bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:22 localhost systemd[1]: libpod-conmon-bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a.scope: Deactivated successfully.
Feb 1 04:57:22 localhost podman[317861]: 2026-02-01 09:57:22.327042279 +0000 UTC m=+0.137017465 container remove bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 1 04:57:22 localhost nova_compute[274651]: 2026-02-01 09:57:22.339 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:22 localhost ovn_controller[152492]: 2026-02-01T09:57:22Z|00321|binding|INFO|Releasing lport 59ea6c78-74f4-4621-a6ce-1b4cd5328fba from this chassis (sb_readonly=0)
Feb 1 04:57:22 localhost ovn_controller[152492]: 2026-02-01T09:57:22Z|00322|binding|INFO|Setting lport 59ea6c78-74f4-4621-a6ce-1b4cd5328fba down in Southbound
Feb 1 04:57:22 localhost kernel: device tap59ea6c78-74 left promiscuous mode
Feb 1 04:57:22 localhost nova_compute[274651]: 2026-02-01 09:57:22.362 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.365 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=59ea6c78-74f4-4621-a6ce-1b4cd5328fba) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.366 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 59ea6c78-74f4-4621-a6ce-1b4cd5328fba in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.369 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.370 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[2368cf68-f14f-4f50-a580-185d644c2f23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:22 localhost systemd[1]: var-lib-containers-storage-overlay-f25e32af6f521620c81b86697ac99f7c499db470073fa638945fd8029d330691-merged.mount: Deactivated successfully.
Feb 1 04:57:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bae6a4e782a817572ef5818b6bc14b64b8b8ed51e1a62664590882487ac4cb0a-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e157 do_prune osdmap full prune enabled
Feb 1 04:57:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e158 e158: 6 total, 6 up, 6 in
Feb 1 04:57:22 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in
Feb 1 04:57:22 localhost dnsmasq[317829]: exiting on receipt of SIGTERM
Feb 1 04:57:22 localhost podman[317906]: 2026-02-01 09:57:22.461214866 +0000 UTC m=+0.066391573 container kill b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:57:22 localhost systemd[1]: libpod-b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd.scope: Deactivated successfully.
Feb 1 04:57:22 localhost podman[317918]: 2026-02-01 09:57:22.530427455 +0000 UTC m=+0.054487837 container died b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:57:22 localhost podman[317918]: 2026-02-01 09:57:22.562614745 +0000 UTC m=+0.086675047 container cleanup b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 04:57:22 localhost systemd[1]: libpod-conmon-b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd.scope: Deactivated successfully.
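The recurring "Matched UPDATE: PortBindingUpdatedEvent(...)" lines come from ovsdbapp's row-event machinery, which the metadata agent uses to react to Port_Binding changes. A minimal sketch in ovsdbapp's style; the class layout and the (events, table, conditions) constructor follow the ovsdbapp RowEvent API as I understand it, so treat the details as assumptions:

    from ovsdbapp.backend.ovs_idl import event

    class PortBindingUpdatedEvent(event.RowEvent):
        """Fire on any Port_Binding update (cf. the Matched UPDATE lines)."""

        def __init__(self):
            # (events, table, conditions): match 'update' events on
            # Port_Binding with no extra column conditions.
            super().__init__((self.ROW_UPDATE,), "Port_Binding", None)

        def run(self, event, row, old):
            # The agent's reaction is what produces the subsequent
            # "Port ... bound/unbound from our chassis" INFO lines.
            print("Port_Binding %s changed" % row.logical_port)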
Feb 1 04:57:22 localhost podman[317920]: 2026-02-01 09:57:22.585451267 +0000 UTC m=+0.100156621 container remove b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:22 localhost nova_compute[274651]: 2026-02-01 09:57:22.632 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:22 localhost ovn_controller[152492]: 2026-02-01T09:57:22Z|00323|binding|INFO|Releasing lport 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 from this chassis (sb_readonly=0)
Feb 1 04:57:22 localhost ovn_controller[152492]: 2026-02-01T09:57:22Z|00324|binding|INFO|Setting lport 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 down in Southbound
Feb 1 04:57:22 localhost kernel: device tap3e6a397d-2e left promiscuous mode
Feb 1 04:57:22 localhost nova_compute[274651]: 2026-02-01 09:57:22.653 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.659 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cce95ca6-6802-4cd1-9e1a-cfe429ad34ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3cb13cb2ee4e4e329cfbfe3e5fc9c8b9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=52fa44b7-34b3-4321-bcfe-6af5d406df00, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3e6a397d-2e8b-43e9-8a44-a0db8b907f56) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.661 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 3e6a397d-2e8b-43e9-8a44-a0db8b907f56 in datapath cce95ca6-6802-4cd1-9e1a-cfe429ad34ae unbound from our chassis
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.663 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cce95ca6-6802-4cd1-9e1a-cfe429ad34ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:57:22 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:22.664 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[77efd73f-d155-4c7a-8662-09011596efa7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:23.211 259320 INFO neutron.agent.dhcp.agent [None req-170cdbb5-2482-4d40-8431-273b4d23eef6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:23.214 259320 INFO neutron.agent.dhcp.agent [None req-170cdbb5-2482-4d40-8431-273b4d23eef6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:23 localhost systemd[1]: var-lib-containers-storage-overlay-3e00762dc3ac73ff22118eaf8c7b8f456105cffee03746e86e1458f9bbd846a6-merged.mount: Deactivated successfully.
Feb 1 04:57:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2af9f0db149d888bd9e79c7f1ff61a64aaf5d0bd76b668dfdcf18c2044728cd-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:23 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully.
Feb 1 04:57:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:23.549 259320 INFO neutron.agent.dhcp.agent [None req-5f5a1c60-a314-43cf-9e47-9d62de57a0a2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:23 localhost systemd[1]: run-netns-qdhcp\x2dcce95ca6\x2d6802\x2d4cd1\x2d9e1a\x2dcfe429ad34ae.mount: Deactivated successfully.
Feb 1 04:57:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e158 do_prune osdmap full prune enabled
Feb 1 04:57:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e159 e159: 6 total, 6 up, 6 in
Feb 1 04:57:23 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in
Feb 1 04:57:23 localhost podman[236886]: time="2026-02-01T09:57:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:57:23 localhost podman[236886]: @ - - [01/Feb/2026:09:57:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:57:24 localhost podman[236886]: @ - - [01/Feb/2026:09:57:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18830 "" "Go-http-client/1.1"
Feb 1 04:57:24 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:24.558 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
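The `GET /v4.9.3/libpod/containers/json?...` access-log lines above are requests against podman's REST API on its local socket. A stdlib-only sketch of issuing the same query; the socket path below is the usual rootful default and is an assumption here:

    import http.client
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTP over a UNIX socket, enough to query the libpod API."""
        def __init__(self, sock_path: str):
            super().__init__("localhost")
            self.sock_path = sock_path
        def connect(self):
            self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            self.sock.connect(self.sock_path)

    conn = UnixHTTPConnection("/run/podman/podman.sock")  # assumed path
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
    resp = conn.getresponse()
    print(resp.status)  # the log shows 200 with a large JSON body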
Feb 1 04:57:24 localhost podman[317947]: 2026-02-01 09:57:24.733023581 +0000 UTC m=+0.085856982 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:24 localhost podman[317947]: 2026-02-01 09:57:24.743217785 +0000 UTC m=+0.096051236 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:24 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:57:24 localhost ovn_controller[152492]: 2026-02-01T09:57:24Z|00325|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:57:24 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:24.948 2 INFO neutron.agent.securitygroups_rpc [None req-9d70792f-5f72-48f9-b951-877d0761d664 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.017 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:25.076 259320 INFO neutron.agent.linux.ip_lib [None req-64773145-59d3-483a-9edd-296e8621f8d9 - - - - - -] Device tape39f41fe-af cannot be used as it has no MAC address
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.100 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost kernel: device tape39f41fe-af entered promiscuous mode
Feb 1 04:57:25 localhost NetworkManager[5964]: <info>  [1769939845.1114] manager: (tape39f41fe-af): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.111 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost ovn_controller[152492]: 2026-02-01T09:57:25Z|00326|binding|INFO|Claiming lport e39f41fe-affa-4b19-9367-50f3419a4c3a for this chassis.
Feb 1 04:57:25 localhost ovn_controller[152492]: 2026-02-01T09:57:25Z|00327|binding|INFO|e39f41fe-affa-4b19-9367-50f3419a4c3a: Claiming unknown
Feb 1 04:57:25 localhost systemd-udevd[317976]: Network interface NamePolicy= disabled on kernel command line.
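The health_status=healthy record above is emitted when the transient systemd unit runs `podman healthcheck run <id>`, which executes the container's configured test (here `/openstack/healthcheck compute`) and reports via its exit code. A small wrapper illustrating that contract:

    import subprocess

    def container_healthy(container_id: str) -> bool:
        # `podman healthcheck run` exits 0 when the container's configured
        # healthcheck test passes, non-zero otherwise; this is what the
        # "Started /usr/bin/podman healthcheck run ..." units invoke.
        result = subprocess.run(
            ["podman", "healthcheck", "run", container_id],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0

    print(container_healthy("2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691"))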
Feb 1 04:57:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:25.125 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:55fe/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e39f41fe-affa-4b19-9367-50f3419a4c3a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:25.127 158365 INFO neutron.agent.ovn.metadata.agent [-] Port e39f41fe-affa-4b19-9367-50f3419a4c3a in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis
Feb 1 04:57:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:25.130 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port eb08811d-eac8-4545-8bc7-9fd8737a430c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:57:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:25.130 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:25 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:25.130 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[a1725c8a-1f83-4c26-9f5a-bb5bae9b36b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost ovn_controller[152492]: 2026-02-01T09:57:25Z|00328|binding|INFO|Setting lport e39f41fe-affa-4b19-9367-50f3419a4c3a ovn-installed in OVS
Feb 1 04:57:25 localhost ovn_controller[152492]: 2026-02-01T09:57:25Z|00329|binding|INFO|Setting lport e39f41fe-affa-4b19-9367-50f3419a4c3a up in Southbound
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.148 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost journal[217584]: ethtool ioctl error on tape39f41fe-af: No such device
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.175 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.197 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost nova_compute[274651]: 2026-02-01 09:57:25.244 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:25 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:25.898 2 INFO neutron.agent.securitygroups_rpc [None req-4b206305-83e7-4f57-ba9f-2e24f96d5798 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:26 localhost podman[318047]:
Feb 1 04:57:26 localhost podman[318047]: 2026-02-01 09:57:26.123389554 +0000 UTC m=+0.084239762 container create 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:57:26 localhost systemd[1]: Started libpod-conmon-8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634.scope.
Feb 1 04:57:26 localhost podman[318047]: 2026-02-01 09:57:26.07771635 +0000 UTC m=+0.038566548 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:26 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64aac3b0feb8a204ba491f99f3315e0efc44ebf15df2f8e3fb0327cf77adbf8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:26 localhost podman[318047]: 2026-02-01 09:57:26.210495463 +0000 UTC m=+0.171345641 container init 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 1 04:57:26 localhost podman[318047]: 2026-02-01 09:57:26.224639398 +0000 UTC m=+0.185489576 container start 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:26 localhost dnsmasq[318065]: started, version 2.85 cachesize 150
Feb 1 04:57:26 localhost dnsmasq[318065]: DNS service limited to local subnets
Feb 1 04:57:26 localhost dnsmasq[318065]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:26 localhost dnsmasq[318065]: warning: no upstream servers configured
Feb 1 04:57:26 localhost dnsmasq[318065]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:26.262 259320 INFO neutron.agent.dhcp.agent [None req-64773145-59d3-483a-9edd-296e8621f8d9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:24Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c77e5242-98b4-4327-a949-1c5c9046a4fe, ip_allocation=immediate, mac_address=fa:16:3e:8d:eb:b8, name=tempest-NetworksTestDHCPv6-886087410, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['c60c5844-7e2a-4ce0-b46d-be03ccfbba35'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:21Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2177, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:24Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:57:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:26.365 259320 INFO neutron.agent.dhcp.agent [None req-13d960fc-735a-44e2-85be-e0b6a924d9f4 - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:26 localhost dnsmasq[318065]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses
Feb 1 04:57:26 localhost podman[318084]: 2026-02-01 09:57:26.452720004 +0000 UTC m=+0.055920991 container kill 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 04:57:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 1 04:57:26 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:26.712 259320 INFO neutron.agent.dhcp.agent [None req-c67f6e18-3abc-4597-817d-b28a850b5b30 - - - - - -] DHCP configuration for ports {'c77e5242-98b4-4327-a949-1c5c9046a4fe'} is completed
Feb 1 04:57:26 localhost podman[318122]: 2026-02-01 09:57:26.973838262 +0000 UTC m=+0.060541743 container kill 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:26 localhost dnsmasq[318065]: exiting on receipt of SIGTERM
Feb 1 04:57:26 localhost systemd[1]: libpod-8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634.scope: Deactivated successfully.
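The sequence above is consistent with the DHCP agent reloading dnsmasq rather than restarting it: the first `container kill` is followed by dnsmasq re-reading addn_hosts (now "1 addresses") with no restart banner, while only the second kill produces "exiting on receipt of SIGTERM". dnsmasq re-reads its host and option files on SIGHUP, so a reload amounts to signalling the process; the PID below is illustrative:

    import os
    import signal

    def reload_dnsmasq(pid: int) -> None:
        # SIGHUP makes dnsmasq re-read --addn-hosts / --dhcp-hostsfile /
        # --dhcp-optsfile without dropping leases; SIGTERM stops it,
        # matching the "exiting on receipt of SIGTERM" line.
        os.kill(pid, signal.SIGHUP)

    reload_dnsmasq(318065)  # illustrative PID taken from the records above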
Feb 1 04:57:27 localhost podman[318135]: 2026-02-01 09:57:27.050755258 +0000 UTC m=+0.058108848 container died 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:27 localhost podman[318135]: 2026-02-01 09:57:27.084132024 +0000 UTC m=+0.091485544 container cleanup 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:57:27 localhost systemd[1]: libpod-conmon-8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634.scope: Deactivated successfully.
Feb 1 04:57:27 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:27.094 2 INFO neutron.agent.securitygroups_rpc [None req-beee55c0-e969-4dc8-abc8-cdbdc16af93f 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 1 04:57:27 localhost podman[318136]: 2026-02-01 09:57:27.126487677 +0000 UTC m=+0.128511744 container remove 8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:57:27 localhost systemd[1]: tmp-crun.U4X1NI.mount: Deactivated successfully.
Feb 1 04:57:27 localhost systemd[1]: var-lib-containers-storage-overlay-64aac3b0feb8a204ba491f99f3315e0efc44ebf15df2f8e3fb0327cf77adbf8a-merged.mount: Deactivated successfully.
Feb 1 04:57:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8194c952d18ea70d649857dd745ec1a9b55b75bf2d2c0236530630872c922634-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:27 localhost nova_compute[274651]: 2026-02-01 09:57:27.144 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:27 localhost kernel: device tape39f41fe-af left promiscuous mode
Feb 1 04:57:27 localhost ovn_controller[152492]: 2026-02-01T09:57:27Z|00330|binding|INFO|Releasing lport e39f41fe-affa-4b19-9367-50f3419a4c3a from this chassis (sb_readonly=0)
Feb 1 04:57:27 localhost ovn_controller[152492]: 2026-02-01T09:57:27Z|00331|binding|INFO|Setting lport e39f41fe-affa-4b19-9367-50f3419a4c3a down in Southbound
Feb 1 04:57:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:27.157 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4e:55fe/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e39f41fe-affa-4b19-9367-50f3419a4c3a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:27.159 158365 INFO neutron.agent.ovn.metadata.agent [-] Port e39f41fe-affa-4b19-9367-50f3419a4c3a in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis
Feb 1 04:57:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:27.162 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:27.164 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[30fa0089-b27c-43b8-81d1-a856daa75375]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:27 localhost nova_compute[274651]: 2026-02-01 09:57:27.165 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:27 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully.
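The namespace teardown announced by the metadata and DHCP agents shows up to systemd as the run-netns mount unit going away: the DHCP agent's namespace for a network is qdhcp-<network_id>, bind-mounted under /run/netns. A small sketch of that correspondence:

    import os

    def qdhcp_namespace(network_id: str) -> str:
        # Namespace name as seen in the run-netns-qdhcp\x2d... mount units.
        return "qdhcp-" + network_id

    ns = qdhcp_namespace("cba39058-6a05-4f77-add1-57334b728a66")
    # Once the namespace is deleted, its bind mount under /run/netns
    # disappears and systemd logs the .mount unit as deactivated.
    print(os.path.exists("/run/netns/" + ns))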
Feb 1 04:57:27 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:27.527 259320 INFO neutron.agent.dhcp.agent [None req-11c20792-c6a8-47fb-a5c3-7dfec98a30e0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:27 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:27.528 259320 INFO neutron.agent.dhcp.agent [None req-11c20792-c6a8-47fb-a5c3-7dfec98a30e0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:28 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:28.139 2 INFO neutron.agent.securitygroups_rpc [None req-b32fd076-8c86-4555-94ea-b4066e09ed5c e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e159 do_prune osdmap full prune enabled
Feb 1 04:57:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e160 e160: 6 total, 6 up, 6 in
Feb 1 04:57:28 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in
Feb 1 04:57:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:28.557 259320 INFO neutron.agent.linux.ip_lib [None req-50ee5306-b39b-4189-912c-62f2cc624387 - - - - - -] Device tap9ccafd1c-ee cannot be used as it has no MAC address
Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.582 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:28 localhost kernel: device tap9ccafd1c-ee entered promiscuous mode
Feb 1 04:57:28 localhost NetworkManager[5964]: <info>  [1769939848.5895] manager: (tap9ccafd1c-ee): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.590 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:28 localhost ovn_controller[152492]: 2026-02-01T09:57:28Z|00332|binding|INFO|Claiming lport 9ccafd1c-ee51-4452-9d08-60f372986507 for this chassis.
Feb 1 04:57:28 localhost ovn_controller[152492]: 2026-02-01T09:57:28Z|00333|binding|INFO|9ccafd1c-ee51-4452-9d08-60f372986507: Claiming unknown
Feb 1 04:57:28 localhost systemd-udevd[318177]: Network interface NamePolicy= disabled on kernel command line.
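The ceph-mon lines keep repeating a compact osdmap summary ("osdmap e160: 6 total, 6 up, 6 in"). When scanning a capture like this, a small parser makes it easy to track the epoch and spot OSDs dropping out; the regex below is derived from the line shape seen here:

    import re

    # Sketch: extract epoch and counts from the recurring
    # "osdmap eNNN: X total, Y up, Z in" mon records above.
    OSDMAP_RE = re.compile(r"osdmap e(\d+): (\d+) total, (\d+) up, (\d+) in")

    line = "log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in"
    m = OSDMAP_RE.search(line)
    if m:
        epoch, total, up, osds_in = map(int, m.groups())
        # In this capture the cluster stays fully up/in across e157-e160.
        assert (total, up, osds_in) == (6, 6, 6) and epoch == 160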
Feb 1 04:57:28 localhost ovn_controller[152492]: 2026-02-01T09:57:28Z|00334|binding|INFO|Setting lport 9ccafd1c-ee51-4452-9d08-60f372986507 ovn-installed in OVS Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.604 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:28 localhost ovn_controller[152492]: 2026-02-01T09:57:28Z|00335|binding|INFO|Setting lport 9ccafd1c-ee51-4452-9d08-60f372986507 up in Southbound Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.608 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:28.613 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee8:9d71/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ccafd1c-ee51-4452-9d08-60f372986507) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:28.616 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 9ccafd1c-ee51-4452-9d08-60f372986507 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:57:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:28.619 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 0fd14ad5-0b37-42b4-a3d9-8aceaf718a0e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:57:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:28.620 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:28.620 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[423f288c-4751-4e55-86be-7ed63b1c685e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.629 274655 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost journal[217584]: ethtool ioctl error on tap9ccafd1c-ee: No such device Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:28 localhost nova_compute[274651]: 2026-02-01 09:57:28.698 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:29 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:29.095 2 INFO neutron.agent.securitygroups_rpc [None req-2d3763ba-9699-499d-861c-79c864912ba7 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:29 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:29.197 2 INFO neutron.agent.securitygroups_rpc [None req-2d3763ba-9699-499d-861c-79c864912ba7 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:29 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:29.433 2 INFO neutron.agent.securitygroups_rpc [None req-2958eafe-4222-42a5-8f58-07ed853ce57d e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:29 localhost podman[318248]: Feb 1 04:57:29 localhost podman[318248]: 2026-02-01 09:57:29.540359981 +0000 UTC m=+0.082339463 container create d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:57:29 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:29.557 2 INFO neutron.agent.securitygroups_rpc [None req-dc57259b-517e-469b-bbab-e176f8bbf6e4 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:29 localhost systemd[1]: Started libpod-conmon-d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f.scope. 
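The PortBindingUpdatedEvent matches logged above come from ovsdbapp's row-event machinery, which the OVN metadata agent uses to react to Port_Binding changes. A minimal sketch of such a handler follows, assuming only the stock ovsdbapp API from the paths in the log; the class name and the body of run() are illustrative, not the agent's actual code:

    # Illustrative ovsdbapp row event, modeled on the PortBindingUpdatedEvent
    # matches above (events=('update',), table='Port_Binding', conditions=None).
    from ovsdbapp.backend.ovs_idl import event as row_event

    class PortBindingChassisEvent(row_event.RowEvent):
        def __init__(self):
            # The same (events, table, conditions) tuple the agent logs
            # whenever a row matches.
            super().__init__((self.ROW_UPDATE,), 'Port_Binding', None)
            self.event_name = 'PortBindingChassisEvent'

        def run(self, event, row, old):
            # 'row' is the updated Port_Binding; 'old' carries only the
            # changed columns, e.g. old=Port_Binding(chassis=[]) above.
            print('lport %s up=%s' % (row.logical_port, row.up))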
Feb 1 04:57:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:29.576 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:29 localhost systemd[1]: Started libcrun container. Feb 1 04:57:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f633c7bbd2b713b5ae1d5657d0044052cd44aac3d0783ede0cb7cce7e0fc29d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:29 localhost podman[318248]: 2026-02-01 09:57:29.501558067 +0000 UTC m=+0.043537599 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:29 localhost podman[318248]: 2026-02-01 09:57:29.604682119 +0000 UTC m=+0.146661571 container init d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:29 localhost podman[318248]: 2026-02-01 09:57:29.609346593 +0000 UTC m=+0.151326035 container start d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:57:29 localhost dnsmasq[318267]: started, version 2.85 cachesize 150 Feb 1 04:57:29 localhost dnsmasq[318267]: DNS service limited to local subnets Feb 1 04:57:29 localhost dnsmasq[318267]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:29 localhost dnsmasq[318267]: warning: no upstream servers configured Feb 1 04:57:29 localhost dnsmasq-dhcp[318267]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:57:29 localhost dnsmasq[318267]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:29 localhost dnsmasq-dhcp[318267]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:29 localhost dnsmasq-dhcp[318267]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:29.650 259320 INFO neutron.agent.dhcp.agent [None req-50ee5306-b39b-4189-912c-62f2cc624387 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:27Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=83ca6cd4-04d7-4ee9-a5dd-95da15b72ca1, ip_allocation=immediate, mac_address=fa:16:3e:6b:76:77, 
name=tempest-NetworksTestDHCPv6-918705313, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=44, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['cd75cad0-5db1-47b2-806d-28236e9df89e'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:26Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2192, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:27Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:57:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:29.797 259320 INFO neutron.agent.dhcp.agent [None req-c794efbd-2595-4ecd-8f01-3d5dac30e89c - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:57:29 localhost dnsmasq[318267]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:57:29 localhost dnsmasq-dhcp[318267]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:29 localhost dnsmasq-dhcp[318267]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:29 localhost podman[318286]: 2026-02-01 09:57:29.828094081 +0000 UTC m=+0.070787349 container kill d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 1 04:57:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:30.131 259320 INFO neutron.agent.dhcp.agent [None req-1143f27b-1db5-4161-85d0-b89bde8d3951 - - - - - -] DHCP configuration for ports {'83ca6cd4-04d7-4ee9-a5dd-95da15b72ca1'} is completed#033[00m Feb 1 04:57:30 localhost nova_compute[274651]: 2026-02-01 09:57:30.289 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:30 localhost nova_compute[274651]: 2026-02-01 09:57:30.293 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e160 do_prune osdmap full prune enabled Feb 1 04:57:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e161 e161: 6 total, 6 up, 6 in Feb 1 04:57:30 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in Feb 1 
04:57:30 localhost dnsmasq[318267]: exiting on receipt of SIGTERM Feb 1 04:57:30 localhost podman[318321]: 2026-02-01 09:57:30.426590359 +0000 UTC m=+0.071711636 container kill d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:30 localhost systemd[1]: libpod-d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f.scope: Deactivated successfully. Feb 1 04:57:30 localhost podman[318335]: 2026-02-01 09:57:30.484850511 +0000 UTC m=+0.049223365 container died d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:30 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:30.493 2 INFO neutron.agent.securitygroups_rpc [None req-a9710b04-3715-469f-ada4-600e33182b7e 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:30 localhost systemd[1]: var-lib-containers-storage-overlay-f633c7bbd2b713b5ae1d5657d0044052cd44aac3d0783ede0cb7cce7e0fc29d6-merged.mount: Deactivated successfully. Feb 1 04:57:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f-userdata-shm.mount: Deactivated successfully. Feb 1 04:57:30 localhost podman[318335]: 2026-02-01 09:57:30.560730104 +0000 UTC m=+0.125102938 container cleanup d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:30 localhost systemd[1]: libpod-conmon-d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f.scope: Deactivated successfully. 
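Throughout this section dnsmasq re-reads the per-network addn_hosts, host and opts files that the Neutron DHCP agent renders under /var/lib/neutron/dhcp/<network-id>/. The entries below are invented placeholders, shown only to illustrate the standard dnsmasq file formats involved: the host file holds one static lease per line ("mac,name,address", IPv6 addresses in square brackets, matching the "static leases only" mode logged above):

    fa:16:3e:6b:76:77,host-2001-db8--1.openstacklocal,[2001:db8::1]

and the opts file carries tagged dhcp-option lines, for example:

    tag:subnet-cd75cad0,option6:dns-server,[2001:db8::2]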
Feb 1 04:57:30 localhost podman[318343]: 2026-02-01 09:57:30.586538878 +0000 UTC m=+0.137980585 container remove d76ea1699b76146b488f7c12b9f3c65c8846ceb058bfbfa2d963b6026795d04f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:57:30 localhost nova_compute[274651]: 2026-02-01 09:57:30.600 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:30 localhost kernel: device tap9ccafd1c-ee left promiscuous mode Feb 1 04:57:30 localhost ovn_controller[152492]: 2026-02-01T09:57:30Z|00336|binding|INFO|Releasing lport 9ccafd1c-ee51-4452-9d08-60f372986507 from this chassis (sb_readonly=0) Feb 1 04:57:30 localhost ovn_controller[152492]: 2026-02-01T09:57:30Z|00337|binding|INFO|Setting lport 9ccafd1c-ee51-4452-9d08-60f372986507 down in Southbound Feb 1 04:57:30 localhost nova_compute[274651]: 2026-02-01 09:57:30.625 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:30.627 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fee8:9d71/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ccafd1c-ee51-4452-9d08-60f372986507) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:30.629 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 9ccafd1c-ee51-4452-9d08-60f372986507 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:57:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:30.632 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:30 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:30.633 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d17ebf52-631a-45a4-a48f-ff9380cc43ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:31 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. Feb 1 04:57:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e161 do_prune osdmap full prune enabled Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0. Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.432044) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851432096, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2371, "num_deletes": 264, "total_data_size": 2688028, "memory_usage": 2750552, "flush_reason": "Manual Compaction"} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Feb 1 04:57:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e162 e162: 6 total, 6 up, 6 in Feb 1 04:57:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851446268, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2595391, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29077, "largest_seqno": 31447, "table_properties": {"data_size": 2585328, "index_size": 6440, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 21714, "raw_average_key_size": 21, "raw_value_size": 2564997, "raw_average_value_size": 2562, "num_data_blocks": 273, "num_entries": 1001, "num_filter_entries": 1001, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939696, "oldest_key_time": 1769939696, "file_creation_time": 1769939851, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 14265 microseconds, and 6500 cpu microseconds. 
Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.446314) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2595391 bytes OK Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.446338) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.448636) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.448656) EVENT_LOG_v1 {"time_micros": 1769939851448650, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.448678) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 2677995, prev total WAL file size 2678036, number of live WAL files 2. Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.449487) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. 
'7061786F73003132303439' seq:0, type:0; will stop at (end) Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2534KB)], [54(18MB)] Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851449568, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 21711985, "oldest_snapshot_seqno": -1} Feb 1 04:57:31 localhost openstack_network_exporter[239441]: ERROR 09:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:57:31 localhost openstack_network_exporter[239441]: Feb 1 04:57:31 localhost openstack_network_exporter[239441]: ERROR 09:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:57:31 localhost openstack_network_exporter[239441]: Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 12652 keys, 19460727 bytes, temperature: kUnknown Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851539873, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 19460727, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19389633, "index_size": 38353, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31685, "raw_key_size": 341134, "raw_average_key_size": 26, "raw_value_size": 19174987, "raw_average_value_size": 1515, "num_data_blocks": 1437, "num_entries": 12652, "num_filter_entries": 12652, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939851, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.540412) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 19460727 bytes Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.543600) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 239.5 rd, 214.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 18.2 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(15.9) write-amplify(7.5) OK, records in: 13195, records dropped: 543 output_compression: NoCompression Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.543622) EVENT_LOG_v1 {"time_micros": 1769939851543613, "job": 32, "event": "compaction_finished", "compaction_time_micros": 90661, "compaction_time_cpu_micros": 34725, "output_level": 6, "num_output_files": 1, "total_output_size": 19460727, "num_input_records": 13195, "num_output_records": 12652, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851543954, "job": 32, "event": "table_file_deletion", "file_number": 56} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851545918, "job": 32, "event": "table_file_deletion", "file_number": 54} Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.449382) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.546426) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.546436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.546441) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.546448) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:57:31 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:57:31.546453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:57:31 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:31.556 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 1 04:57:31 localhost 
neutron_sriov_agent[252126]: 2026-02-01 09:57:31.796 2 INFO neutron.agent.securitygroups_rpc [None req-c82c049d-951f-4fb2-91dc-f79774ee784c e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:31 localhost nova_compute[274651]: 2026-02-01 09:57:31.935 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:31.935 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:31 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:31.937 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:57:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:57:32 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:32.331 2 INFO neutron.agent.securitygroups_rpc [None req-ae637900-6d4f-4914-b5d3-eacda6bf763f e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:32 localhost systemd[1]: tmp-crun.vYbapM.mount: Deactivated successfully. 
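The RocksDB job-32 compaction summary above is internally consistent and can be reproduced from the EVENT_LOG_v1 numbers it prints (taking MB as 10^6 bytes, which is what the logged figures imply); a quick sanity check:

    # Sanity-check of the ceph-mon RocksDB compaction figures logged above.
    l0_in     = 2_595_391     # table #56, the level-0 input (flush output)
    total_in  = 21_711_985    # input_data_size: L0 + L6 inputs
    total_out = 19_460_727    # total_output_size: new L6 table #57
    secs      = 90_661e-6     # compaction_time_micros, as seconds

    print(total_out / l0_in)               # ~7.5   -> "write-amplify(7.5)"
    print((total_in + total_out) / l0_in)  # ~15.9  -> "read-write-amplify(15.9)"
    print(total_in / secs / 1e6)           # ~239.5 -> "MB/sec: 239.5 rd"
    print(total_out / secs / 1e6)          # ~214.7 -> "214.7 wr"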
Feb 1 04:57:32 localhost podman[318368]: 2026-02-01 09:57:32.423271661 +0000 UTC m=+0.087039578 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:57:32 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:32.434 259320 INFO neutron.agent.linux.ip_lib [None req-59e09553-a8bc-477e-8f97-db2b8bba7369 - - - - - -] Device tap390b69cd-dd cannot be used as it has no MAC address#033[00m Feb 1 04:57:32 localhost podman[318368]: 2026-02-01 09:57:32.436369175 +0000 UTC m=+0.100137122 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 04:57:32 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.460 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost kernel: device tap390b69cd-dd entered promiscuous mode Feb 1 04:57:32 localhost NetworkManager[5964]: [1769939852.4679] manager: (tap390b69cd-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/59) Feb 1 04:57:32 localhost ovn_controller[152492]: 2026-02-01T09:57:32Z|00338|binding|INFO|Claiming lport 390b69cd-dd37-4979-8a69-c659caca50f4 for this chassis. 
Feb 1 04:57:32 localhost ovn_controller[152492]: 2026-02-01T09:57:32Z|00339|binding|INFO|390b69cd-dd37-4979-8a69-c659caca50f4: Claiming unknown Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.471 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost systemd-udevd[318399]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:57:32 localhost ovn_controller[152492]: 2026-02-01T09:57:32Z|00340|binding|INFO|Setting lport 390b69cd-dd37-4979-8a69-c659caca50f4 ovn-installed in OVS Feb 1 04:57:32 localhost ovn_controller[152492]: 2026-02-01T09:57:32Z|00341|binding|INFO|Setting lport 390b69cd-dd37-4979-8a69-c659caca50f4 up in Southbound Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.478 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.480 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:32.477 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=390b69cd-dd37-4979-8a69-c659caca50f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:32.480 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 390b69cd-dd37-4979-8a69-c659caca50f4 in datapath cba39058-6a05-4f77-add1-57334b728a66 bound to our chassis#033[00m Feb 1 04:57:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:32.483 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6f7c1382-daf6-4dec-a5d4-3c7b7a8626eb IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:57:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:32.484 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:32 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:32.486 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[efc4e50b-e1d3-40f1-9bab-f1e55ef6bc38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.506 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.540 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost nova_compute[274651]: 2026-02-01 09:57:32.602 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:57:32 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1595370341' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:57:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:57:32 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1595370341' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:57:33 localhost podman[318454]: Feb 1 04:57:33 localhost podman[318454]: 2026-02-01 09:57:33.409300979 +0000 UTC m=+0.074730490 container create 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:57:33 localhost systemd[1]: Started libpod-conmon-461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6.scope. Feb 1 04:57:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e162 do_prune osdmap full prune enabled Feb 1 04:57:33 localhost systemd[1]: tmp-crun.1flKK3.mount: Deactivated successfully. Feb 1 04:57:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e163 e163: 6 total, 6 up, 6 in Feb 1 04:57:33 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in Feb 1 04:57:33 localhost systemd[1]: Started libcrun container. 
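The audit entries above show client.openstack (a Cinder client, judging by the "volumes" pool) issuing df and osd pool get-quota as JSON mon_commands. The same calls can be replayed with the rados Python binding; the conffile path and client name below are illustrative:

    # Replay the mon_commands seen in the ceph-mon audit log via librados.
    import json
    import rados

    cluster = rados.Rados(conffile='/etc/ceph/ceph.conf',
                          name='client.openstack')
    cluster.connect()
    for cmd in ({"prefix": "df", "format": "json"},
                {"prefix": "osd pool get-quota", "pool": "volumes",
                 "format": "json"}):
        ret, out, errs = cluster.mon_command(json.dumps(cmd), b'')
        print(cmd["prefix"], ret, out[:80])
    cluster.shutdown()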
Feb 1 04:57:33 localhost podman[318454]: 2026-02-01 09:57:33.371195246 +0000 UTC m=+0.036624787 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2195f112e4798c5bf35513963e5faefea0c859066ac2701daf8740e884d6c49f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:33 localhost podman[318454]: 2026-02-01 09:57:33.489232418 +0000 UTC m=+0.154661959 container init 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:33 localhost podman[318454]: 2026-02-01 09:57:33.500213435 +0000 UTC m=+0.165642986 container start 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:33 localhost dnsmasq[318472]: started, version 2.85 cachesize 150 Feb 1 04:57:33 localhost dnsmasq[318472]: DNS service limited to local subnets Feb 1 04:57:33 localhost dnsmasq[318472]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:33 localhost dnsmasq[318472]: warning: no upstream servers configured Feb 1 04:57:33 localhost dnsmasq-dhcp[318472]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:57:33 localhost dnsmasq[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:33 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:33 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:33 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:33.790 259320 INFO neutron.agent.dhcp.agent [None req-0ef0e619-a714-42c7-876e-93424f65fb9d - - - - - -] DHCP configuration for ports {'d4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:57:33 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:33.808 259320 INFO neutron.agent.linux.ip_lib [None req-874efc2a-896e-447e-b665-23119c8ad11a - - - - - -] Device tap2e5ad63a-c6 cannot be used as it has no MAC address#033[00m Feb 1 04:57:33 localhost nova_compute[274651]: 2026-02-01 09:57:33.861 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:33 localhost kernel: device tap2e5ad63a-c6 entered promiscuous mode Feb 1 04:57:33 localhost NetworkManager[5964]: [1769939853.8671] manager: (tap2e5ad63a-c6): new Generic 
device (/org/freedesktop/NetworkManager/Devices/60) Feb 1 04:57:33 localhost ovn_controller[152492]: 2026-02-01T09:57:33Z|00342|binding|INFO|Claiming lport 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c for this chassis. Feb 1 04:57:33 localhost ovn_controller[152492]: 2026-02-01T09:57:33Z|00343|binding|INFO|2e5ad63a-c625-47e9-9640-0e5b1fe7e73c: Claiming unknown Feb 1 04:57:33 localhost nova_compute[274651]: 2026-02-01 09:57:33.870 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:33.892 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-206ee553-cd65-4f77-8129-196aa5aa2858', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206ee553-cd65-4f77-8129-196aa5aa2858', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1845ff11-b6de-4916-9124-41d0eb3eeed7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2e5ad63a-c625-47e9-9640-0e5b1fe7e73c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:33.895 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c in datapath 206ee553-cd65-4f77-8129-196aa5aa2858 bound to our chassis#033[00m Feb 1 04:57:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:33.900 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 103498f0-64f3-4617-989d-484a16400d18 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:57:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:33.901 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 206ee553-cd65-4f77-8129-196aa5aa2858, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:57:33 localhost ovn_controller[152492]: 2026-02-01T09:57:33Z|00344|binding|INFO|Setting lport 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c ovn-installed in OVS Feb 1 04:57:33 localhost ovn_controller[152492]: 2026-02-01T09:57:33Z|00345|binding|INFO|Setting lport 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c up in Southbound Feb 1 04:57:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:33.902 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[072c1869-8895-47da-a5f3-520a86c3def1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:33 localhost nova_compute[274651]: 
2026-02-01 09:57:33.905 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:33 localhost nova_compute[274651]: 2026-02-01 09:57:33.941 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:33 localhost nova_compute[274651]: 2026-02-01 09:57:33.963 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:33 localhost dnsmasq[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:33 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:33 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:33 localhost podman[318507]: 2026-02-01 09:57:33.998188491 +0000 UTC m=+0.046330185 container kill 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:34.098 259320 INFO neutron.agent.dhcp.agent [None req-9316ef31-d075-4151-a5ef-a10f9046294b - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:31Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bdd25147-ae9d-4c38-9cd7-d303709fd872, ip_allocation=immediate, mac_address=fa:16:3e:4d:75:de, name=tempest-NetworksTestDHCPv6-2099454027, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['185f713c-d431-45ef-aad4-fe5fb8bcbd86'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:30Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2226, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:31Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:57:34 localhost dnsmasq[318472]: read 
/var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 1 addresses Feb 1 04:57:34 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:34 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:34 localhost podman[318556]: 2026-02-01 09:57:34.269342081 +0000 UTC m=+0.066578309 container kill 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:57:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:34.516 259320 INFO neutron.agent.dhcp.agent [None req-81c38ee2-8291-4135-965f-38bbfa816eb7 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:57:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:57:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2965659129' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:57:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:57:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2965659129' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:57:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:34.657 259320 INFO neutron.agent.dhcp.agent [None req-58e97748-b932-4c86-8650-7f229b67a5e9 - - - - - -] DHCP configuration for ports {'bdd25147-ae9d-4c38-9cd7-d303709fd872'} is completed#033[00m Feb 1 04:57:34 localhost dnsmasq[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:34 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:34 localhost podman[318607]: 2026-02-01 09:57:34.711698627 +0000 UTC m=+0.075856554 container kill 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:57:34 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:34 localhost podman[318646]: Feb 1 04:57:34 localhost podman[318646]: 2026-02-01 09:57:34.928781644 +0000 UTC m=+0.077202795 container create b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:34 localhost systemd[1]: Started libpod-conmon-b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31.scope. Feb 1 04:57:34 localhost podman[318646]: 2026-02-01 09:57:34.881565482 +0000 UTC m=+0.029986643 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:34 localhost systemd[1]: Started libcrun container. 
Feb 1 04:57:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4974c31cf4b950157ff5f0f1f244fbde87227d444e5094de9dfb7675012ed750/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:35 localhost podman[318646]: 2026-02-01 09:57:35.002664347 +0000 UTC m=+0.151085458 container init b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:57:35 localhost podman[318646]: 2026-02-01 09:57:35.011384175 +0000 UTC m=+0.159805296 container start b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:57:35 localhost dnsmasq[318665]: started, version 2.85 cachesize 150 Feb 1 04:57:35 localhost dnsmasq[318665]: DNS service limited to local subnets Feb 1 04:57:35 localhost dnsmasq[318665]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:35 localhost dnsmasq[318665]: warning: no upstream servers configured Feb 1 04:57:35 localhost dnsmasq-dhcp[318665]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:57:35 localhost dnsmasq[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/addn_hosts - 0 addresses Feb 1 04:57:35 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/host Feb 1 04:57:35 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/opts Feb 1 04:57:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:35.230 259320 INFO neutron.agent.dhcp.agent [None req-436698be-96c3-4896-898a-a64d7c62004d - - - - - -] DHCP configuration for ports {'e648f4de-83a6-4550-b2fe-bf7ca302fb07'} is completed#033[00m Feb 1 04:57:35 localhost nova_compute[274651]: 2026-02-01 09:57:35.292 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
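The recurring xfs remount messages note timestamp support "until 2038 (0x7fffffff)": that constant is the signed 32-bit time_t maximum, which decodes to the familiar Y2038 boundary:

    # 0x7fffffff seconds after the Unix epoch, per the xfs messages above.
    from datetime import datetime, timezone
    print(datetime.fromtimestamp(0x7fffffff, tz=timezone.utc))
    # 2038-01-19 03:14:07+00:00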
Feb 1 04:57:35 localhost podman[318666]: 2026-02-01 09:57:35.726114698 +0000 UTC m=+0.082620922 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:35 localhost podman[318666]: 2026-02-01 09:57:35.760427983 +0000 UTC m=+0.116934227 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0) Feb 1 04:57:35 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:57:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e163 do_prune osdmap full prune enabled Feb 1 04:57:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e164 e164: 6 total, 6 up, 6 in Feb 1 04:57:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in Feb 1 04:57:36 localhost systemd[1]: tmp-crun.vX4V2G.mount: Deactivated successfully. Feb 1 04:57:36 localhost dnsmasq[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:36 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:36 localhost dnsmasq-dhcp[318472]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:36 localhost podman[318701]: 2026-02-01 09:57:36.612254683 +0000 UTC m=+0.065841446 container kill 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:57:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:36.624 259320 INFO neutron.agent.linux.ip_lib [None req-cf4b14b6-6863-4e1b-9253-eb19f9243132 - - - - - -] Device tap255c4aaa-d3 cannot be used as it has no MAC address#033[00m Feb 1 04:57:36 localhost nova_compute[274651]: 2026-02-01 09:57:36.648 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:36 localhost kernel: device tap255c4aaa-d3 entered promiscuous mode Feb 1 04:57:36 localhost NetworkManager[5964]: [1769939856.6531] manager: (tap255c4aaa-d3): new Generic device (/org/freedesktop/NetworkManager/Devices/61) Feb 1 04:57:36 localhost nova_compute[274651]: 2026-02-01 09:57:36.653 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:36 localhost ovn_controller[152492]: 2026-02-01T09:57:36Z|00346|binding|INFO|Claiming lport 255c4aaa-d3b5-49f2-a806-fad4e551d461 for this chassis. 
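
ovn-controller reports each port binding in three steps that all appear in this window: "Claiming lport <uuid> for this chassis", "Setting lport <uuid> ovn-installed in OVS", and "Setting lport <uuid> up in Southbound". A sketch (step markers copied from these lines; journal.log is an assumed one-entry-per-line export) that flags any lport that starts the sequence but never reaches the final step:

    import re

    # The three binding steps, in the order ovn-controller logs them above.
    STEPS = ("Claiming lport", "ovn-installed in OVS", "up in Southbound")
    LPORT = re.compile(r"\|binding\|INFO\|.*?(?P<uuid>[0-9a-f]{8}-[0-9a-f-]{27})")

    progress = {}  # lport uuid -> index of the last step seen

    with open("journal.log") as fh:  # assumed one-entry-per-line export of this journal
        for line in fh:
            m = LPORT.search(line)
            if not m:
                continue
            for idx, marker in enumerate(STEPS):
                if marker in line:
                    progress[m["uuid"]] = idx

    for lport, idx in sorted(progress.items()):
        if idx < len(STEPS) - 1:
            print(f"{lport} stalled after: {STEPS[idx]}")
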
Feb 1 04:57:36 localhost ovn_controller[152492]: 2026-02-01T09:57:36Z|00347|binding|INFO|255c4aaa-d3b5-49f2-a806-fad4e551d461: Claiming unknown Feb 1 04:57:36 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:36.665 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0346851d-fcce-43f5-93ab-c0abe62c3f50, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=255c4aaa-d3b5-49f2-a806-fad4e551d461) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:36 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:36.666 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 255c4aaa-d3b5-49f2-a806-fad4e551d461 in datapath 1dd45b6e-c7d4-4daf-9d1e-f7981179cd48 bound to our chassis#033[00m Feb 1 04:57:36 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:36.667 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1dd45b6e-c7d4-4daf-9d1e-f7981179cd48 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:57:36 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:36.668 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[9e22be52-b4ba-4caa-bbdc-c95530262990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:36 localhost nova_compute[274651]: 2026-02-01 09:57:36.669 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:36 localhost ovn_controller[152492]: 2026-02-01T09:57:36Z|00348|binding|INFO|Setting lport 255c4aaa-d3b5-49f2-a806-fad4e551d461 ovn-installed in OVS Feb 1 04:57:36 localhost ovn_controller[152492]: 2026-02-01T09:57:36Z|00349|binding|INFO|Setting lport 255c4aaa-d3b5-49f2-a806-fad4e551d461 up in Southbound Feb 1 04:57:36 localhost nova_compute[274651]: 2026-02-01 09:57:36.675 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:57:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e164 do_prune osdmap full prune enabled Feb 1 04:57:36 localhost nova_compute[274651]: 
2026-02-01 09:57:36.693 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e165 e165: 6 total, 6 up, 6 in Feb 1 04:57:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in Feb 1 04:57:36 localhost nova_compute[274651]: 2026-02-01 09:57:36.733 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:36 localhost nova_compute[274651]: 2026-02-01 09:57:36.755 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:37 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:37.010 259320 INFO neutron.agent.dhcp.agent [None req-f690acfb-57b0-4581-90d4-3ed7c7ec1e82 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:57:37 localhost podman[318783]: Feb 1 04:57:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:57:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2288622633' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:57:37 localhost podman[318783]: 2026-02-01 09:57:37.677334852 +0000 UTC m=+0.089865826 container create 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 04:57:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:57:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2288622633' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:57:37 localhost podman[318783]: 2026-02-01 09:57:37.626105656 +0000 UTC m=+0.038636660 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:37 localhost systemd[1]: Started libpod-conmon-25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b.scope. Feb 1 04:57:37 localhost systemd[1]: Started libcrun container. 
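
The recurring kernel line "xfs filesystem being remounted at ... supports timestamps until 2038 (0x7fffffff)" fires once per fresh container overlay mount in this window. It is a warning, not an error: it most likely means the backing XFS filesystem was created without the bigtime feature, so inode timestamps cap at 2038. A sketch that collects the affected mount points (journal.log is an assumed one-entry-per-line export):

    import re

    XFS_2038 = re.compile(
        r"kernel: xfs filesystem being remounted at (?P<path>\S+) "
        r"supports timestamps until 2038"
    )

    capped = set()
    with open("journal.log") as fh:  # assumed one-entry-per-line export of this journal
        for line in fh:
            m = XFS_2038.search(line)
            if m:
                capped.add(m["path"])

    print(f"{len(capped)} overlay mounts report the 2038 timestamp cap:")
    for path in sorted(capped):
        print("  " + path)
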
Feb 1 04:57:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/316d1968581b430d9df905c860c1fa54c3ac33696fc592c9950b148ea6fddc0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:37 localhost podman[318783]: 2026-02-01 09:57:37.764432751 +0000 UTC m=+0.176963725 container init 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:57:37 localhost podman[318783]: 2026-02-01 09:57:37.77351974 +0000 UTC m=+0.186050714 container start 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:57:37 localhost dnsmasq[318836]: started, version 2.85 cachesize 150 Feb 1 04:57:37 localhost dnsmasq[318836]: DNS service limited to local subnets Feb 1 04:57:37 localhost dnsmasq[318836]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:37 localhost dnsmasq[318836]: warning: no upstream servers configured Feb 1 04:57:37 localhost dnsmasq-dhcp[318836]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:57:37 localhost dnsmasq[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/addn_hosts - 0 addresses Feb 1 04:57:37 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/host Feb 1 04:57:37 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/opts Feb 1 04:57:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:37.806 2 INFO neutron.agent.securitygroups_rpc [None req-0302dcee-a94e-448d-b9a1-97eb07e05bc2 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:38.038 259320 INFO neutron.agent.dhcp.agent [None req-49ee2b81-15e8-4075-aa2a-15702abdf4b0 - - - - - -] DHCP configuration for ports {'ea68799e-a797-45aa-a833-84afa6ff1a93'} is completed#033[00m Feb 1 04:57:38 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:38.251 2 INFO neutron.agent.securitygroups_rpc [None req-56c81e71-4e5b-4ed6-b56f-ca0ee463b60f 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']#033[00m Feb 1 04:57:38 localhost dnsmasq[318472]: exiting on receipt of SIGTERM Feb 1 04:57:38 localhost systemd[1]: 
libpod-461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6.scope: Deactivated successfully. Feb 1 04:57:38 localhost podman[318885]: 2026-02-01 09:57:38.446050015 +0000 UTC m=+0.045991785 container kill 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:57:38 localhost podman[318901]: 2026-02-01 09:57:38.502286605 +0000 UTC m=+0.042619532 container died 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:57:38 localhost podman[318901]: 2026-02-01 09:57:38.529122701 +0000 UTC m=+0.069455608 container cleanup 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:57:38 localhost systemd[1]: libpod-conmon-461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6.scope: Deactivated successfully. Feb 1 04:57:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:38.555 259320 INFO neutron.agent.linux.ip_lib [None req-9a0d6a49-7c25-4430-bfbb-b78b732de46c - - - - - -] Device tapb1198584-6c cannot be used as it has no MAC address#033[00m Feb 1 04:57:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:57:38 localhost nova_compute[274651]: 2026-02-01 09:57:38.574 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:57:38 localhost kernel: device tapb1198584-6c entered promiscuous mode Feb 1 04:57:38 localhost NetworkManager[5964]: [1769939858.5834] manager: (tapb1198584-6c): new Generic device (/org/freedesktop/NetworkManager/Devices/62) Feb 1 04:57:38 localhost ovn_controller[152492]: 2026-02-01T09:57:38Z|00350|binding|INFO|Claiming lport b1198584-6cdd-4525-82cb-83748a05c365 for this chassis. 
Feb 1 04:57:38 localhost ovn_controller[152492]: 2026-02-01T09:57:38Z|00351|binding|INFO|b1198584-6cdd-4525-82cb-83748a05c365: Claiming unknown Feb 1 04:57:38 localhost nova_compute[274651]: 2026-02-01 09:57:38.584 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:38 localhost podman[318903]: 2026-02-01 09:57:38.591496299 +0000 UTC m=+0.129546995 container remove 461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127) Feb 1 04:57:38 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:57:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:38.600 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-aa1930be-537d-42cd-9add-4ed7ae12f537', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa1930be-537d-42cd-9add-4ed7ae12f537', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a0163ab-36a7-43cd-9755-2c82aefcb8b7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b1198584-6cdd-4525-82cb-83748a05c365) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:57:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:38.602 158365 INFO neutron.agent.ovn.metadata.agent [-] Port b1198584-6cdd-4525-82cb-83748a05c365 in datapath aa1930be-537d-42cd-9add-4ed7ae12f537 bound to our chassis#033[00m Feb 1 04:57:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:38.603 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network aa1930be-537d-42cd-9add-4ed7ae12f537 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:57:38 localhost ovn_controller[152492]: 2026-02-01T09:57:38Z|00352|binding|INFO|Setting lport b1198584-6cdd-4525-82cb-83748a05c365 ovn-installed in OVS Feb 1 04:57:38 localhost ovn_controller[152492]: 2026-02-01T09:57:38Z|00353|binding|INFO|Setting lport 
b1198584-6cdd-4525-82cb-83748a05c365 up in Southbound Feb 1 04:57:38 localhost nova_compute[274651]: 2026-02-01 09:57:38.605 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:38 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:38.604 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[e26d1ee4-3464-490f-ab1a-70d4eb7b3631]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:38 localhost nova_compute[274651]: 2026-02-01 09:57:38.614 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:38 localhost nova_compute[274651]: 2026-02-01 09:57:38.652 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:38 localhost nova_compute[274651]: 2026-02-01 09:57:38.673 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:38 localhost systemd[1]: tmp-crun.OJkdyC.mount: Deactivated successfully. Feb 1 04:57:38 localhost systemd[1]: var-lib-containers-storage-overlay-2195f112e4798c5bf35513963e5faefea0c859066ac2701daf8740e884d6c49f-merged.mount: Deactivated successfully. Feb 1 04:57:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-461dd5f331fa693aaa6f48c58497de534d140444b4fc119d045e6887cc63ace6-userdata-shm.mount: Deactivated successfully. Feb 1 04:57:39 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:39.355 2 INFO neutron.agent.securitygroups_rpc [None req-e6670585-76cd-441d-8f68-6f14a6f35b07 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:39 localhost podman[319032]: Feb 1 04:57:39 localhost podman[319032]: 2026-02-01 09:57:39.539245209 +0000 UTC m=+0.090032900 container create f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aa1930be-537d-42cd-9add-4ed7ae12f537, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 1 04:57:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:57:39 localhost systemd[1]: Started libpod-conmon-f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225.scope. Feb 1 04:57:39 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:57:39 localhost podman[319032]: 2026-02-01 09:57:39.506164501 +0000 UTC m=+0.056952192 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:39 localhost systemd[1]: Started libcrun container. 
Feb 1 04:57:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c229a2e839d790424f080d319544b54e123dbed478734c64ddc837e6deb88e18/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:39 localhost podman[319053]: 2026-02-01 09:57:39.661351965 +0000 UTC m=+0.073955456 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:57:39 localhost podman[319053]: 2026-02-01 09:57:39.665549423 +0000 UTC m=+0.078152944 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:57:39 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
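
The transient units named after container IDs ("Started /usr/bin/podman healthcheck run 385648ad...", a health_status=healthy event, an exec_died event, then "Deactivated successfully") are systemd-driven health probes. The same probe can be run by hand; a sketch wrapping the exact command the log shows, with the container name taken from the health_status event above (exit status 0 means the check passed):

    import subprocess

    def healthcheck(container: str) -> bool:
        """Run the same command the transient systemd units above invoke.

        `podman healthcheck run` executes the container's configured
        healthcheck (here /openstack/healthcheck) and exits 0 on success.
        """
        result = subprocess.run(
            ["/usr/bin/podman", "healthcheck", "run", container],
            capture_output=True,
            text=True,
        )
        return result.returncode == 0

    # Name taken from the health_status event above.
    print("healthy" if healthcheck("ovn_metadata_agent") else "unhealthy")
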
Feb 1 04:57:39 localhost podman[319032]: 2026-02-01 09:57:39.690223693 +0000 UTC m=+0.241011394 container init f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aa1930be-537d-42cd-9add-4ed7ae12f537, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:57:39 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:39.694 2 INFO neutron.agent.securitygroups_rpc [None req-d6e67e7e-f7e2-4516-bf2c-7113ac674e15 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:39 localhost podman[319032]: 2026-02-01 09:57:39.697667952 +0000 UTC m=+0.248455643 container start f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aa1930be-537d-42cd-9add-4ed7ae12f537, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 1 04:57:39 localhost dnsmasq[319084]: started, version 2.85 cachesize 150 Feb 1 04:57:39 localhost dnsmasq[319084]: DNS service limited to local subnets Feb 1 04:57:39 localhost dnsmasq[319084]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:39 localhost dnsmasq[319084]: warning: no upstream servers configured Feb 1 04:57:39 localhost dnsmasq-dhcp[319084]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:57:39 localhost dnsmasq[319084]: read /var/lib/neutron/dhcp/aa1930be-537d-42cd-9add-4ed7ae12f537/addn_hosts - 0 addresses Feb 1 04:57:39 localhost dnsmasq-dhcp[319084]: read /var/lib/neutron/dhcp/aa1930be-537d-42cd-9add-4ed7ae12f537/host Feb 1 04:57:39 localhost dnsmasq-dhcp[319084]: read /var/lib/neutron/dhcp/aa1930be-537d-42cd-9add-4ed7ae12f537/opts Feb 1 04:57:39 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:39.817 259320 INFO neutron.agent.dhcp.agent [None req-8fe45551-13d6-4e9c-8a0b-e86d2d5ef2a2 - - - - - -] DHCP configuration for ports {'9df41d18-9f8d-41fa-b55b-a799e1167254'} is completed#033[00m Feb 1 04:57:40 localhost podman[319107]: Feb 1 04:57:40 localhost podman[319107]: 2026-02-01 09:57:40.058741018 +0000 UTC m=+0.092163746 container create f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:57:40 localhost 
neutron_sriov_agent[252126]: 2026-02-01 09:57:40.089 2 INFO neutron.agent.securitygroups_rpc [None req-1328bdc5-e5a5-40ba-b48b-34d65147d68f afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group rule updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']#033[00m Feb 1 04:57:40 localhost systemd[1]: Started libpod-conmon-f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78.scope. Feb 1 04:57:40 localhost podman[319107]: 2026-02-01 09:57:40.014825927 +0000 UTC m=+0.048248665 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:40 localhost systemd[1]: Started libcrun container. Feb 1 04:57:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13ac61d11e921e99b96ebfcddd3f3dd8289970251dc0bb4e0e552de48728930a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:57:40 localhost podman[319107]: 2026-02-01 09:57:40.134460256 +0000 UTC m=+0.167882984 container init f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:57:40 localhost podman[319107]: 2026-02-01 09:57:40.143936058 +0000 UTC m=+0.177358776 container start f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:57:40 localhost dnsmasq[319125]: started, version 2.85 cachesize 150 Feb 1 04:57:40 localhost dnsmasq[319125]: DNS service limited to local subnets Feb 1 04:57:40 localhost dnsmasq[319125]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:57:40 localhost dnsmasq[319125]: warning: no upstream servers configured Feb 1 04:57:40 localhost dnsmasq-dhcp[319125]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 1 04:57:40 localhost dnsmasq[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:40 localhost dnsmasq-dhcp[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:40 localhost dnsmasq-dhcp[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:40.205 259320 INFO neutron.agent.dhcp.agent [None req-453fadbc-7be1-46c9-885f-e562b48dc6db - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:38Z, 
description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=44042f6d-94e5-426d-9202-2174d3b0cf5f, ip_allocation=immediate, mac_address=fa:16:3e:9d:7a:82, name=tempest-NetworksTestDHCPv6-362094661, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['52bb3c3a-e250-4696-8796-1d8a7f15615c', 'e530585f-96ad-4cde-b6f7-a286f3968f91'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:37Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2274, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:39Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:57:40 localhost nova_compute[274651]: 2026-02-01 09:57:40.323 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:40.339 259320 INFO neutron.agent.dhcp.agent [None req-93def749-4505-48b2-ac24-f59c6b38fe85 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:57:40 localhost dnsmasq[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses Feb 1 04:57:40 localhost dnsmasq-dhcp[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:40 localhost dnsmasq-dhcp[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:40 localhost podman[319144]: 2026-02-01 09:57:40.437184287 +0000 UTC m=+0.060232854 container kill f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:40 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:40.515 2 INFO neutron.agent.securitygroups_rpc [None req-a0493fd0-d741-481f-ad73-307c48cc986a afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group rule updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']#033[00m Feb 1 04:57:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:40.610 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, 
allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:40Z, description=, device_id=d1894ea0-9c72-4dde-9d88-81ac177a3e3b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6b7a575f-0ba5-4865-9cf0-925996d946ef, ip_allocation=immediate, mac_address=fa:16:3e:32:f7:ee, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:32Z, description=, dns_domain=, id=1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1114537696, port_security_enabled=True, project_id=bdd313217db54b0aa18a483b1bae89ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11804, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2235, status=ACTIVE, subnets=['eb88342a-236c-44d1-85ca-6b3a59346fe9'], tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:35Z, vlan_transparent=None, network_id=1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, port_security_enabled=False, project_id=bdd313217db54b0aa18a483b1bae89ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2280, status=DOWN, tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:40Z on network 1dd45b6e-c7d4-4daf-9d1e-f7981179cd48#033[00m Feb 1 04:57:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:40.671 259320 INFO neutron.agent.dhcp.agent [None req-d3e51c09-b16b-4d1c-942f-6a94c0c09957 - - - - - -] DHCP configuration for ports {'44042f6d-94e5-426d-9202-2174d3b0cf5f'} is completed#033[00m Feb 1 04:57:40 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:40.784 2 INFO neutron.agent.securitygroups_rpc [None req-ffd61713-cff0-4f50-b6e4-2930ab2a8c56 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:40 localhost dnsmasq[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/addn_hosts - 1 addresses Feb 1 04:57:40 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/host Feb 1 04:57:40 localhost podman[319181]: 2026-02-01 09:57:40.848158437 +0000 UTC m=+0.076010449 container kill 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:40 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/opts Feb 1 04:57:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:57:40 localhost systemd[1]: tmp-crun.eW7xD7.mount: Deactivated successfully. 
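
Around each podman "container kill" record in this window there are two distinct outcomes: dnsmasq re-reads its addn_hosts/host/opts files and keeps running (a reload, presumably a HUP-style signal, though the signal itself is not logged), or it prints "exiting on receipt of SIGTERM" and the container is torn down. A heuristic sketch that classifies the kill records by scanning nearby lines; the window size, the journal.log filename, and the label ordering baked into the regex are all assumptions fitted to this log:

    import re

    # Label order matched to the podman 'container kill' records above.
    KILL = re.compile(r"container kill \S+ \(image=[^,]+, name=(?P<name>[^,)]+)")
    REREAD = re.compile(r"dnsmasq\[\d+\]: read \S+/addn_hosts")
    SIGTERM = "exiting on receipt of SIGTERM"

    def classify_kills(lines, window=6):
        """Yield (container name, kind) for each 'container kill' record.

        journald interleaving puts dnsmasq's reaction near (often just
        before) the kill record, so scan a few lines on both sides.
        """
        for i, line in enumerate(lines):
            m = KILL.search(line)
            if not m:
                continue
            nearby = lines[max(0, i - window):i + window + 1]
            if any(SIGTERM in l for l in nearby):
                yield m["name"], "teardown"
            elif any(REREAD.search(l) for l in nearby):
                yield m["name"], "reload"
            else:
                yield m["name"], "unknown"

    with open("journal.log") as fh:  # assumed one-entry-per-line export of this journal
        for name, kind in classify_kills(fh.read().splitlines()):
            print(kind, name)
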
Feb 1 04:57:40 localhost podman[319202]: 2026-02-01 09:57:40.981649083 +0000 UTC m=+0.110537650 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:57:41 localhost dnsmasq[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:41 localhost dnsmasq-dhcp[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:41 localhost dnsmasq-dhcp[319125]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:41 localhost podman[319224]: 2026-02-01 09:57:41.019051443 +0000 UTC m=+0.069941322 container kill f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 04:57:41 localhost podman[319202]: 2026-02-01 09:57:41.045433065 +0000 UTC m=+0.174321632 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 1 04:57:41 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:57:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:41.148 259320 INFO neutron.agent.dhcp.agent [None req-a11a14cc-fa70-4951-8e8f-972c9418e08d - - - - - -] DHCP configuration for ports {'6b7a575f-0ba5-4865-9cf0-925996d946ef'} is completed#033[00m Feb 1 04:57:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:57:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:57:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:57:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e165 do_prune osdmap full prune enabled Feb 1 04:57:41 localhost podman[319278]: 2026-02-01 09:57:41.699084299 +0000 UTC m=+0.065278728 container kill f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 1 04:57:41 localhost dnsmasq[319125]: exiting on receipt of SIGTERM Feb 1 04:57:41 localhost systemd[1]: libpod-f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78.scope: Deactivated successfully. 
Feb 1 04:57:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e166 e166: 6 total, 6 up, 6 in Feb 1 04:57:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in Feb 1 04:57:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:41.719 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:57:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:41.720 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:57:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:41.721 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:57:41 localhost podman[319292]: 2026-02-01 09:57:41.783180176 +0000 UTC m=+0.063717251 container died f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 1 04:57:41 localhost systemd[1]: tmp-crun.PyxSQW.mount: Deactivated successfully. Feb 1 04:57:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78-userdata-shm.mount: Deactivated successfully. Feb 1 04:57:41 localhost podman[319292]: 2026-02-01 09:57:41.834739772 +0000 UTC m=+0.115276757 container remove f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:57:41 localhost systemd[1]: libpod-conmon-f35886fe485f2addcb241fd33108ba1ba857cd2857899965a7c5bde9e7ceeb78.scope: Deactivated successfully. 
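
ceph-mon keeps advancing the osdmap epoch (e163 through e166 in this window) while the summary stays at "6 total, 6 up, 6 in". A sketch that parses those summaries and prints only the epochs where up or in drops below total (format copied from the log_channel lines above; journal.log is an assumed one-entry-per-line export):

    import re

    OSDMAP = re.compile(
        r"osdmap e(?P<epoch>\d+): (?P<total>\d+) total, "
        r"(?P<up>\d+) up, (?P<n_in>\d+) in"
    )

    with open("journal.log") as fh:  # assumed one-entry-per-line export of this journal
        for line in fh:
            m = OSDMAP.search(line)
            if not m:
                continue
            total, up, n_in = int(m["total"]), int(m["up"]), int(m["n_in"])
            if up < total or n_in < total:
                print(f"epoch {m['epoch']}: {up}/{total} up, {n_in}/{total} in")
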
Feb 1 04:57:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:41.939 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:57:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:42.265 2 INFO neutron.agent.securitygroups_rpc [None req-cf10ff71-4763-464e-9c85-a62c4de0813a 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']#033[00m Feb 1 04:57:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:42.512 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:40Z, description=, device_id=d1894ea0-9c72-4dde-9d88-81ac177a3e3b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6b7a575f-0ba5-4865-9cf0-925996d946ef, ip_allocation=immediate, mac_address=fa:16:3e:32:f7:ee, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:32Z, description=, dns_domain=, id=1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-router-network01--1114537696, port_security_enabled=True, project_id=bdd313217db54b0aa18a483b1bae89ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11804, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2235, status=ACTIVE, subnets=['eb88342a-236c-44d1-85ca-6b3a59346fe9'], tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:35Z, vlan_transparent=None, network_id=1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, port_security_enabled=False, project_id=bdd313217db54b0aa18a483b1bae89ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2280, status=DOWN, tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:40Z on network 1dd45b6e-c7d4-4daf-9d1e-f7981179cd48#033[00m Feb 1 04:57:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:57:42 localhost podman[319378]: Feb 1 04:57:42 localhost systemd[1]: var-lib-containers-storage-overlay-13ac61d11e921e99b96ebfcddd3f3dd8289970251dc0bb4e0e552de48728930a-merged.mount: Deactivated successfully. 
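
The "Trigger reload_allocations" entries above flatten an entire port object, with its network object nested inside, onto a single line. A sketch that pulls out a few port-level fields; the field list is restricted to keys whose first occurrence belongs to the port rather than the nested network, an ordering assumption checked only against the dumps visible here:

    import re

    # Flat fields whose first occurrence in these dumps belongs to the
    # port itself rather than the nested network object.
    FIELDS = ("id", "mac_address", "device_owner", "network_id")

    def port_summary(entry: str) -> dict:
        """Pull a few fields out of a 'Trigger reload_allocations' dump."""
        out = {}
        for field in FIELDS:
            # (?<!\w) keeps e.g. device_id= from matching the id= field.
            m = re.search(rf"(?<!\w){field}=([^,]*)", entry)
            if m:
                out[field] = m.group(1)
        return out

    with open("journal.log") as fh:  # assumed one-entry-per-line export of this journal
        for line in fh:
            if "Trigger reload_allocations" in line:
                print(port_summary(line))
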
Feb 1 04:57:42 localhost podman[319378]: 2026-02-01 09:57:42.697670634 +0000 UTC m=+0.084655816 container create 3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:57:42 localhost dnsmasq[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/addn_hosts - 1 addresses Feb 1 04:57:42 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/host Feb 1 04:57:42 localhost podman[319396]: 2026-02-01 09:57:42.726185621 +0000 UTC m=+0.071224483 container kill 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 1 04:57:42 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/opts Feb 1 04:57:42 localhost systemd[1]: tmp-crun.HhacOk.mount: Deactivated successfully. Feb 1 04:57:42 localhost systemd[1]: Started libpod-conmon-3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991.scope. Feb 1 04:57:42 localhost podman[319378]: 2026-02-01 09:57:42.647774839 +0000 UTC m=+0.034760091 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:57:42 localhost systemd[1]: Started libcrun container. 
Feb 1 04:57:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52c7a5dc66010afee155d13981264822d43da239b6f969d630f80e18b7e64348/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:42 localhost podman[319378]: 2026-02-01 09:57:42.767636946 +0000 UTC m=+0.154622168 container init 3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:42 localhost podman[319378]: 2026-02-01 09:57:42.77622304 +0000 UTC m=+0.163208252 container start 3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 1 04:57:42 localhost dnsmasq[319416]: started, version 2.85 cachesize 150
Feb 1 04:57:42 localhost dnsmasq[319416]: DNS service limited to local subnets
Feb 1 04:57:42 localhost dnsmasq[319416]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:42 localhost dnsmasq[319416]: warning: no upstream servers configured
Feb 1 04:57:42 localhost dnsmasq-dhcp[319416]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 1 04:57:42 localhost dnsmasq[319416]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:42 localhost dnsmasq-dhcp[319416]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:42 localhost dnsmasq-dhcp[319416]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:43.058 259320 INFO neutron.agent.dhcp.agent [None req-8cfd73dd-53f5-44ef-ab91-27fa9e2687af - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:43 localhost dnsmasq[319416]: exiting on receipt of SIGTERM
Feb 1 04:57:43 localhost podman[319439]: 2026-02-01 09:57:43.197419094 +0000 UTC m=+0.072576663 container kill 3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:57:43 localhost systemd[1]: libpod-3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991.scope: Deactivated successfully.
Feb 1 04:57:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:43.216 259320 INFO neutron.agent.dhcp.agent [None req-47c8f995-a3e3-425d-ac3b-8078b5ca18f8 - - - - - -] DHCP configuration for ports {'6b7a575f-0ba5-4865-9cf0-925996d946ef'} is completed
Feb 1 04:57:43 localhost podman[319454]: 2026-02-01 09:57:43.264287511 +0000 UTC m=+0.047110740 container died 3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 04:57:43 localhost podman[319454]: 2026-02-01 09:57:43.30456513 +0000 UTC m=+0.087388279 container remove 3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:43 localhost systemd[1]: libpod-conmon-3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991.scope: Deactivated successfully.
Feb 1 04:57:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:43.502 259320 INFO neutron.agent.linux.ip_lib [None req-44de0fa1-c770-4d0b-87a4-58b84c39eb30 - - - - - -] Device tapd18bcbbe-cf cannot be used as it has no MAC address
Feb 1 04:57:43 localhost nova_compute[274651]: 2026-02-01 09:57:43.534 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:43 localhost kernel: device tapd18bcbbe-cf entered promiscuous mode
Feb 1 04:57:43 localhost ovn_controller[152492]: 2026-02-01T09:57:43Z|00354|binding|INFO|Claiming lport d18bcbbe-cfd4-4b02-976c-d52a30b72940 for this chassis.
Feb 1 04:57:43 localhost NetworkManager[5964]: [1769939863.5425] manager: (tapd18bcbbe-cf): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Feb 1 04:57:43 localhost ovn_controller[152492]: 2026-02-01T09:57:43Z|00355|binding|INFO|d18bcbbe-cfd4-4b02-976c-d52a30b72940: Claiming unknown
Feb 1 04:57:43 localhost nova_compute[274651]: 2026-02-01 09:57:43.546 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:43 localhost systemd-udevd[319503]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:57:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:43.553 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-38c20446-2361-4928-a65c-3acdf0e0c2fb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38c20446-2361-4928-a65c-3acdf0e0c2fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f4729d0-f48c-4cda-8c67-1678f185dc7e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d18bcbbe-cfd4-4b02-976c-d52a30b72940) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:43.555 158365 INFO neutron.agent.ovn.metadata.agent [-] Port d18bcbbe-cfd4-4b02-976c-d52a30b72940 in datapath 38c20446-2361-4928-a65c-3acdf0e0c2fb bound to our chassis
Feb 1 04:57:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:43.558 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 38c20446-2361-4928-a65c-3acdf0e0c2fb or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:57:43 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:43.562 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a0d6b0-0064-4c54-b07a-cc79d7f985d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost nova_compute[274651]: 2026-02-01 09:57:43.573 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:43 localhost ovn_controller[152492]: 2026-02-01T09:57:43Z|00356|binding|INFO|Setting lport d18bcbbe-cfd4-4b02-976c-d52a30b72940 ovn-installed in OVS
Feb 1 04:57:43 localhost ovn_controller[152492]: 2026-02-01T09:57:43Z|00357|binding|INFO|Setting lport d18bcbbe-cfd4-4b02-976c-d52a30b72940 up in Southbound
Feb 1 04:57:43 localhost nova_compute[274651]: 2026-02-01 09:57:43.577 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost journal[217584]: ethtool ioctl error on tapd18bcbbe-cf: No such device
Feb 1 04:57:43 localhost nova_compute[274651]: 2026-02-01 09:57:43.619 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:43 localhost nova_compute[274651]: 2026-02-01 09:57:43.649 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:43 localhost systemd[1]: var-lib-containers-storage-overlay-52c7a5dc66010afee155d13981264822d43da239b6f969d630f80e18b7e64348-merged.mount: Deactivated successfully.
Feb 1 04:57:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ce951809a543b33f9543ba1ac858090f5dfdeab730452a099d69f1a2f2ab991-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:44 localhost dnsmasq[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/addn_hosts - 0 addresses
Feb 1 04:57:44 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/host
Feb 1 04:57:44 localhost dnsmasq-dhcp[318836]: read /var/lib/neutron/dhcp/1dd45b6e-c7d4-4daf-9d1e-f7981179cd48/opts
Feb 1 04:57:44 localhost podman[319576]: 2026-02-01 09:57:44.070520529 +0000 UTC m=+0.070767229 container kill 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:44 localhost podman[319617]:
Feb 1 04:57:44 localhost podman[319617]: 2026-02-01 09:57:44.187528377 +0000 UTC m=+0.064682391 container create bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 1 04:57:44 localhost systemd[1]: Started libpod-conmon-bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b.scope.
Feb 1 04:57:44 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dea27f4e57b98081827fef739f270097e7c64ef59c6e9c4e101cc1d21ecfbbe2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:44 localhost podman[319617]: 2026-02-01 09:57:44.248722309 +0000 UTC m=+0.125876353 container init bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 1 04:57:44 localhost podman[319617]: 2026-02-01 09:57:44.156546774 +0000 UTC m=+0.033700788 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:44 localhost podman[319617]: 2026-02-01 09:57:44.25622265 +0000 UTC m=+0.133376684 container start bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 1 04:57:44 localhost dnsmasq[319644]: started, version 2.85 cachesize 150
Feb 1 04:57:44 localhost dnsmasq[319644]: DNS service limited to local subnets
Feb 1 04:57:44 localhost dnsmasq[319644]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:44 localhost dnsmasq[319644]: warning: no upstream servers configured
Feb 1 04:57:44 localhost dnsmasq-dhcp[319644]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:44 localhost dnsmasq[319644]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:44 localhost dnsmasq-dhcp[319644]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:44 localhost dnsmasq-dhcp[319644]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:44 localhost kernel: device tap255c4aaa-d3 left promiscuous mode
Feb 1 04:57:44 localhost ovn_controller[152492]: 2026-02-01T09:57:44Z|00358|binding|INFO|Releasing lport 255c4aaa-d3b5-49f2-a806-fad4e551d461 from this chassis (sb_readonly=0)
Feb 1 04:57:44 localhost ovn_controller[152492]: 2026-02-01T09:57:44Z|00359|binding|INFO|Setting lport 255c4aaa-d3b5-49f2-a806-fad4e551d461 down in Southbound
Feb 1 04:57:44 localhost nova_compute[274651]: 2026-02-01 09:57:44.289 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:44.300 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0346851d-fcce-43f5-93ab-c0abe62c3f50, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=255c4aaa-d3b5-49f2-a806-fad4e551d461) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:44.301 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 255c4aaa-d3b5-49f2-a806-fad4e551d461 in datapath 1dd45b6e-c7d4-4daf-9d1e-f7981179cd48 unbound from our chassis
Feb 1 04:57:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:44.302 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:44.303 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[a5e2c44b-73b8-4ab7-8cb7-15df90c798b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:44 localhost nova_compute[274651]: 2026-02-01 09:57:44.309 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:44.496 259320 INFO neutron.agent.dhcp.agent [None req-31a77302-04fd-4a14-9265-5dcf09d2bdd5 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:44 localhost podman[319667]:
Feb 1 04:57:44 localhost podman[319667]: 2026-02-01 09:57:44.569850227 +0000 UTC m=+0.096435897 container create a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 1 04:57:44 localhost systemd[1]: Started libpod-conmon-a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5.scope.
Feb 1 04:57:44 localhost dnsmasq[319644]: exiting on receipt of SIGTERM
Feb 1 04:57:44 localhost podman[319697]: 2026-02-01 09:57:44.621901308 +0000 UTC m=+0.039884039 container kill bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 04:57:44 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:44 localhost systemd[1]: libpod-bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b.scope: Deactivated successfully.
Feb 1 04:57:44 localhost podman[319667]: 2026-02-01 09:57:44.525723199 +0000 UTC m=+0.052308879 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b202a61d64f874f48c4bc9f7234202fb39b020782b38262ebee94aead9cde87b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:44 localhost podman[319667]: 2026-02-01 09:57:44.635850227 +0000 UTC m=+0.162435877 container init a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 1 04:57:44 localhost podman[319667]: 2026-02-01 09:57:44.64213001 +0000 UTC m=+0.168715660 container start a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:57:44 localhost dnsmasq[319728]: started, version 2.85 cachesize 150
Feb 1 04:57:44 localhost dnsmasq[319728]: DNS service limited to local subnets
Feb 1 04:57:44 localhost dnsmasq[319728]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:44 localhost dnsmasq[319728]: warning: no upstream servers configured
Feb 1 04:57:44 localhost dnsmasq-dhcp[319728]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:57:44 localhost dnsmasq[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/addn_hosts - 0 addresses
Feb 1 04:57:44 localhost dnsmasq-dhcp[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/host
Feb 1 04:57:44 localhost dnsmasq-dhcp[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/opts
Feb 1 04:57:44 localhost podman[319716]: 2026-02-01 09:57:44.697042228 +0000 UTC m=+0.056039944 container died bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:44 localhost systemd[1]: var-lib-containers-storage-overlay-dea27f4e57b98081827fef739f270097e7c64ef59c6e9c4e101cc1d21ecfbbe2-merged.mount: Deactivated successfully.
Feb 1 04:57:44 localhost podman[319716]: 2026-02-01 09:57:44.741034132 +0000 UTC m=+0.100031838 container remove bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:44 localhost systemd[1]: libpod-conmon-bb08d9d72a2b49274f0a92324891d278db4e7c165d62c08ecda275cf7d7c343b.scope: Deactivated successfully.
Feb 1 04:57:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:44.756 259320 INFO neutron.agent.dhcp.agent [None req-134c16e6-dff8-4df7-a7da-76908fe98a0c - - - - - -] DHCP configuration for ports {'21ac26c9-c252-4f2b-b6b3-cd97f5847899'} is completed
Feb 1 04:57:45 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:45.127 2 INFO neutron.agent.securitygroups_rpc [None req-1b4acf80-44e4-4b72-89ee-2b772b9e0127 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:45 localhost nova_compute[274651]: 2026-02-01 09:57:45.353 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:45 localhost nova_compute[274651]: 2026-02-01 09:57:45.359 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:45 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:45.706 2 INFO neutron.agent.securitygroups_rpc [None req-83cc4c41-7e96-4a99-8728-f0acec1b6354 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:46 localhost podman[319793]:
Feb 1 04:57:46 localhost podman[319793]: 2026-02-01 09:57:46.147386677 +0000 UTC m=+0.091014630 container create 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 1 04:57:46 localhost systemd[1]: Started libpod-conmon-881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283.scope.
Feb 1 04:57:46 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e5ae219dedf6302106ee01751a6caf1f15ef7b63926d67a6318c40465b5b53c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:46 localhost podman[319793]: 2026-02-01 09:57:46.108180711 +0000 UTC m=+0.051808634 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:46 localhost podman[319793]: 2026-02-01 09:57:46.21118699 +0000 UTC m=+0.154814883 container init 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:57:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:46.220 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:45Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=45f11f3e-8e4a-4064-8576-ce3756ab73bd, ip_allocation=immediate, mac_address=fa:16:3e:9d:a7:99, name=tempest-PortsTestJSON-336881785, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:40Z, description=, dns_domain=, id=38c20446-2361-4928-a65c-3acdf0e0c2fb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-803552649, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23209, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2290, status=ACTIVE, subnets=['933eaa47-e09f-479b-97fe-f426831903dd'], tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:57:42Z, vlan_transparent=None, network_id=38c20446-2361-4928-a65c-3acdf0e0c2fb, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2314, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:57:45Z on network 38c20446-2361-4928-a65c-3acdf0e0c2fb
Feb 1 04:57:46 localhost podman[319793]: 2026-02-01 09:57:46.227427929 +0000 UTC m=+0.171055822 container start 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:46 localhost dnsmasq[319811]: started, version 2.85 cachesize 150
Feb 1 04:57:46 localhost dnsmasq[319811]: DNS service limited to local subnets
Feb 1 04:57:46 localhost dnsmasq[319811]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:46 localhost dnsmasq[319811]: warning: no upstream servers configured
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:46 localhost dnsmasq[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:46 localhost dnsmasq[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/addn_hosts - 1 addresses
Feb 1 04:57:46 localhost dnsmasq-dhcp[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/host
Feb 1 04:57:46 localhost dnsmasq-dhcp[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/opts
Feb 1 04:57:46 localhost podman[319829]: 2026-02-01 09:57:46.434314882 +0000 UTC m=+0.050552816 container kill a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:57:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:46.520 259320 INFO neutron.agent.dhcp.agent [None req-729e3ff8-77c0-475f-aa97-0ced614576ec - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:46 localhost dnsmasq[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:46 localhost podman[319865]: 2026-02-01 09:57:46.668033091 +0000 UTC m=+0.065586688 container kill 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:57:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:46.754 259320 INFO neutron.agent.dhcp.agent [None req-caf4a053-2aff-4ba1-a60e-a5022213a198 - - - - - -] DHCP configuration for ports {'45f11f3e-8e4a-4064-8576-ce3756ab73bd'} is completed
Feb 1 04:57:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:46.794 259320 INFO neutron.agent.dhcp.agent [None req-7f480caa-14e8-4dca-af42-71412bfe501a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:44Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=d311fa97-7b51-42a9-8443-8a5cdcea3aad, ip_allocation=immediate, mac_address=fa:16:3e:0a:fc:58, name=tempest-NetworksTestDHCPv6-1344505266, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['7c7257bc-8732-48f1-a140-33e5025d77ba', 'c2f637cf-1251-401c-b858-c17914bf6ed1'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:44Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2309, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:44Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:57:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:46.949 259320 INFO neutron.agent.dhcp.agent [None req-f3d96538-622c-487f-87dc-b1c1b6caf9b2 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:46 localhost dnsmasq[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:46 localhost dnsmasq-dhcp[319811]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:46 localhost podman[319905]: 2026-02-01 09:57:46.983983359 +0000 UTC m=+0.058248163 container kill 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:57:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:57:47 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:47.204 259320 INFO neutron.agent.dhcp.agent [None req-cc3822c4-21dd-4815-8a3e-bd1485f4a557 - - - - - -] DHCP configuration for ports {'d311fa97-7b51-42a9-8443-8a5cdcea3aad'} is completed
Feb 1 04:57:47 localhost podman[319927]: 2026-02-01 09:57:47.232761041 +0000 UTC m=+0.087532464 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, release=1769056855, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, version=9.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 1 04:57:47 localhost podman[319927]: 2026-02-01 09:57:47.269725137 +0000 UTC m=+0.124496550 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, version=9.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7)
Feb 1 04:57:47 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:57:47 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:47.311 2 INFO neutron.agent.securitygroups_rpc [req-501f4cee-4307-45db-b319-f1b9ce6bf1c9 req-e1bd36d6-c912-4d1e-9e97-4f42b13c68e6 afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group member updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']
Feb 1 04:57:47 localhost dnsmasq[319811]: exiting on receipt of SIGTERM
Feb 1 04:57:47 localhost podman[319974]: 2026-02-01 09:57:47.429071519 +0000 UTC m=+0.034517404 container kill 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:57:47 localhost systemd[1]: libpod-881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283.scope: Deactivated successfully.
Feb 1 04:57:47 localhost podman[319992]: 2026-02-01 09:57:47.467598903 +0000 UTC m=+0.031711166 container died 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:57:47 localhost podman[319992]: 2026-02-01 09:57:47.543824808 +0000 UTC m=+0.107937031 container cleanup 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:47 localhost systemd[1]: libpod-conmon-881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283.scope: Deactivated successfully.
Feb 1 04:57:47 localhost podman[319999]: 2026-02-01 09:57:47.597695545 +0000 UTC m=+0.148297343 container remove 881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:57:47 localhost dnsmasq[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/addn_hosts - 0 addresses
Feb 1 04:57:47 localhost dnsmasq-dhcp[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/host
Feb 1 04:57:47 localhost podman[320022]: 2026-02-01 09:57:47.647816946 +0000 UTC m=+0.073525533 container kill a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:47 localhost dnsmasq-dhcp[319728]: read /var/lib/neutron/dhcp/38c20446-2361-4928-a65c-3acdf0e0c2fb/opts
Feb 1 04:57:48 localhost systemd[1]: var-lib-containers-storage-overlay-2e5ae219dedf6302106ee01751a6caf1f15ef7b63926d67a6318c40465b5b53c-merged.mount: Deactivated successfully.
Feb 1 04:57:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-881683621b6039305ed5fbe125074d9f944a428c65476d9da31b12f7c499c283-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:48.258 259320 INFO neutron.agent.linux.ip_lib [None req-27d81417-d0a9-4319-9dd2-7a1d3f2b0d34 - - - - - -] Device tap929c16d7-41 cannot be used as it has no MAC address
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.288 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost kernel: device tap929c16d7-41 entered promiscuous mode
Feb 1 04:57:48 localhost ovn_controller[152492]: 2026-02-01T09:57:48Z|00360|binding|INFO|Claiming lport 929c16d7-415f-480f-bb46-d257660067a0 for this chassis.
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.296 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost ovn_controller[152492]: 2026-02-01T09:57:48Z|00361|binding|INFO|929c16d7-415f-480f-bb46-d257660067a0: Claiming unknown
Feb 1 04:57:48 localhost NetworkManager[5964]: [1769939868.3010] manager: (tap929c16d7-41): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Feb 1 04:57:48 localhost systemd-udevd[320112]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.312 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-b2c97c71-c24e-4ce0-97b0-09b52ce12d05', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c97c71-c24e-4ce0-97b0-09b52ce12d05', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ea8b9bc-1f0f-4e8e-806a-72eed665b7b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=929c16d7-415f-480f-bb46-d257660067a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.315 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 929c16d7-415f-480f-bb46-d257660067a0 in datapath b2c97c71-c24e-4ce0-97b0-09b52ce12d05 bound to our chassis
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.320 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port b68a5291-ff17-4172-b964-4dc7839bc880 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.321 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c97c71-c24e-4ce0-97b0-09b52ce12d05, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.322 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d15d2776-300d-4f53-a421-20e42abb3242]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:48 localhost podman[320103]:
Feb 1 04:57:48 localhost ovn_controller[152492]: 2026-02-01T09:57:48Z|00362|binding|INFO|Setting lport 929c16d7-415f-480f-bb46-d257660067a0 ovn-installed in OVS
Feb 1 04:57:48 localhost ovn_controller[152492]: 2026-02-01T09:57:48Z|00363|binding|INFO|Setting lport 929c16d7-415f-480f-bb46-d257660067a0 up in Southbound
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.339 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.340 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost podman[320103]: 2026-02-01 09:57:48.341585124 +0000 UTC m=+0.064232276 container create 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:48.352 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:48Z, description=, device_id=d1894ea0-9c72-4dde-9d88-81ac177a3e3b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=268b650e-797b-4fe3-8c47-893465976062, ip_allocation=immediate, mac_address=fa:16:3e:66:8c:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:29Z, description=, dns_domain=, id=206ee553-cd65-4f77-8129-196aa5aa2858, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-846400062, port_security_enabled=True, project_id=bdd313217db54b0aa18a483b1bae89ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=376, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2211, status=ACTIVE, subnets=['fb5b7d99-427d-430d-af99-4442f900c00b'], tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:31Z, vlan_transparent=None, network_id=206ee553-cd65-4f77-8129-196aa5aa2858, port_security_enabled=False, project_id=bdd313217db54b0aa18a483b1bae89ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2325, status=DOWN, tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:48Z on network 206ee553-cd65-4f77-8129-196aa5aa2858
Feb 1 04:57:48 localhost systemd[1]: Started libpod-conmon-23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e.scope.
Feb 1 04:57:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e166 do_prune osdmap full prune enabled
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.383 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e167 e167: 6 total, 6 up, 6 in
Feb 1 04:57:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03ed3016bfdffed4edb3acc1cf7124ed216053d997e4d7423a28d6071e48862e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:48 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in
Feb 1 04:57:48 localhost podman[320103]: 2026-02-01 09:57:48.399928299 +0000 UTC m=+0.122575471 container init 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:48 localhost podman[320103]: 2026-02-01 09:57:48.409940597 +0000 UTC m=+0.132587769 container start 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 1 04:57:48 localhost podman[320103]: 2026-02-01 09:57:48.315586435 +0000 UTC m=+0.038233587 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:48 localhost dnsmasq[320149]: started, version 2.85 cachesize 150
Feb 1 04:57:48 localhost dnsmasq[320149]: DNS service limited to local subnets
Feb 1 04:57:48 localhost dnsmasq[320149]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:48 localhost dnsmasq[320149]: warning: no upstream servers configured
Feb 1 04:57:48 localhost dnsmasq-dhcp[320149]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 1 04:57:48 localhost dnsmasq[320149]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:48 localhost dnsmasq-dhcp[320149]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:48 localhost dnsmasq-dhcp[320149]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.434 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost dnsmasq[319728]: exiting on receipt of SIGTERM
Feb 1 04:57:48 localhost podman[320166]: 2026-02-01 09:57:48.51572427 +0000 UTC m=+0.045163180 container kill a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:48 localhost systemd[1]: libpod-a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5.scope: Deactivated successfully.
Feb 1 04:57:48 localhost dnsmasq[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/addn_hosts - 1 addresses
Feb 1 04:57:48 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/host
Feb 1 04:57:48 localhost podman[320177]: 2026-02-01 09:57:48.546553598 +0000 UTC m=+0.063676069 container kill b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:48 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/opts
Feb 1 04:57:48 localhost podman[320198]: 2026-02-01 09:57:48.573662472 +0000 UTC m=+0.047092999 container died a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:48 localhost podman[320198]: 2026-02-01 09:57:48.605541943 +0000 UTC m=+0.078972450 container cleanup a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:48 localhost systemd[1]: libpod-conmon-a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5.scope: Deactivated successfully.
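[Annotation] Each neutron-dnsmasq-qdhcp-<network_id> container runs dnsmasq against three files the DHCP agent generates under /var/lib/neutron/dhcp/<network_id>/ (addn_hosts, host, opts), which is what the "read ..." records above show. A small sketch of how the "- N addresses" count in those records could be reproduced; the path layout is taken from the log, the one-entry-per-line format of addn_hosts is an assumption:

    from pathlib import Path

    def count_addn_hosts(network_id: str) -> int:
        # dnsmasq logs "read .../addn_hosts - N addresses"; N corresponds to
        # the number of non-empty entries in the additional-hosts file.
        path = Path("/var/lib/neutron/dhcp") / network_id / "addn_hosts"
        return sum(1 for line in path.read_text().splitlines() if line.strip())

    # For the records above this would return 1 for network
    # 206ee553-cd65-4f77-8129-196aa5aa2858 and 0 for cba39058-....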
Feb 1 04:57:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:48.632 259320 INFO neutron.agent.dhcp.agent [None req-a0cc341c-1b02-4324-92fb-b3a177b02034 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:48 localhost podman[320200]: 2026-02-01 09:57:48.642010565 +0000 UTC m=+0.111711747 container remove a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.658 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost kernel: device tapd18bcbbe-cf left promiscuous mode
Feb 1 04:57:48 localhost ovn_controller[152492]: 2026-02-01T09:57:48Z|00364|binding|INFO|Releasing lport d18bcbbe-cfd4-4b02-976c-d52a30b72940 from this chassis (sb_readonly=0)
Feb 1 04:57:48 localhost ovn_controller[152492]: 2026-02-01T09:57:48Z|00365|binding|INFO|Setting lport d18bcbbe-cfd4-4b02-976c-d52a30b72940 down in Southbound
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.673 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-38c20446-2361-4928-a65c-3acdf0e0c2fb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38c20446-2361-4928-a65c-3acdf0e0c2fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8f4729d0-f48c-4cda-8c67-1678f185dc7e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d18bcbbe-cfd4-4b02-976c-d52a30b72940) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.674 158365 INFO neutron.agent.ovn.metadata.agent [-] Port d18bcbbe-cfd4-4b02-976c-d52a30b72940 in datapath 38c20446-2361-4928-a65c-3acdf0e0c2fb unbound from our chassis
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.677 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38c20446-2361-4928-a65c-3acdf0e0c2fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:48 localhost nova_compute[274651]: 2026-02-01 09:57:48.678 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:48.678 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[8c04bdb1-1bc9-4d8b-a133-e50da2b1f2d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:48 localhost dnsmasq[320149]: exiting on receipt of SIGTERM
Feb 1 04:57:48 localhost systemd[1]: libpod-23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e.scope: Deactivated successfully.
Feb 1 04:57:48 localhost podman[320253]: 2026-02-01 09:57:48.749272984 +0000 UTC m=+0.050035680 container kill 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:48 localhost podman[320271]: 2026-02-01 09:57:48.803155121 +0000 UTC m=+0.046441039 container died 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:57:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:48.830 259320 INFO neutron.agent.dhcp.agent [None req-7c4504af-8788-4d6a-8a4f-07c4d5086937 - - - - - -] DHCP configuration for ports {'268b650e-797b-4fe3-8c47-893465976062'} is completed
Feb 1 04:57:48 localhost podman[320271]: 2026-02-01 09:57:48.879172289 +0000 UTC m=+0.122458187 container cleanup 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:48 localhost systemd[1]: libpod-conmon-23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e.scope: Deactivated successfully.
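[Annotation] The podman records around here trace complete container lifecycles (image pull, create, init, start, kill, died, cleanup, remove) for the short-lived dnsmasq containers, keyed by the 64-hex container ID. A sketch that reconstructs those lifecycles from log lines of this exact shape; the regex is tailored to the format shown in this log and is not a general podman parser:

    import re
    from collections import defaultdict

    # Matches e.g. "... podman[320103]: ... container create 23b695c4... (image=...)"
    EVENT_RE = re.compile(r"container (create|init|start|kill|died|cleanup|remove) ([0-9a-f]{64})")

    def lifecycles(log_lines):
        events = defaultdict(list)
        for line in log_lines:
            m = EVENT_RE.search(line)
            if m:
                events[m.group(2)].append(m.group(1))
        # e.g. {'23b695c4...': ['create', 'init', 'start', 'kill', 'died',
        #                       'cleanup', 'remove'], ...}
        return events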
Feb 1 04:57:48 localhost podman[320278]: 2026-02-01 09:57:48.900057502 +0000 UTC m=+0.129709401 container remove 23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:49.134 259320 INFO neutron.agent.dhcp.agent [None req-c4f79187-b59d-4e01-bff7-0316d886ccd4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:49.135 259320 INFO neutron.agent.dhcp.agent [None req-c4f79187-b59d-4e01-bff7-0316d886ccd4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:49 localhost systemd[1]: var-lib-containers-storage-overlay-03ed3016bfdffed4edb3acc1cf7124ed216053d997e4d7423a28d6071e48862e-merged.mount: Deactivated successfully.
Feb 1 04:57:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23b695c47e86f840107d9787ad8d33d95e81297bfb7cd6af600a025ea1427a1e-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:49 localhost systemd[1]: var-lib-containers-storage-overlay-b202a61d64f874f48c4bc9f7234202fb39b020782b38262ebee94aead9cde87b-merged.mount: Deactivated successfully.
Feb 1 04:57:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a33ff9b6521960adee3f6db9dcdbc10868b0b1c80cafd34811f923e9bdf33cf5-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:49 localhost systemd[1]: run-netns-qdhcp\x2d38c20446\x2d2361\x2d4928\x2da65c\x2d3acdf0e0c2fb.mount: Deactivated successfully.
Feb 1 04:57:49 localhost podman[320326]:
Feb 1 04:57:49 localhost podman[320326]: 2026-02-01 09:57:49.339646903 +0000 UTC m=+0.089869836 container create 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:57:49 localhost systemd[1]: Started libpod-conmon-0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1.scope.
Feb 1 04:57:49 localhost podman[320326]: 2026-02-01 09:57:49.300166428 +0000 UTC m=+0.050389381 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:49.422 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:49 localhost systemd[1]: Started libcrun container.
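[Annotation] The mount-unit names in the systemd records above ("overlay\x2dcontainers-...", "run-netns-qdhcp\x2d38c20446...") use systemd's unit-name escaping: "/" in a path becomes "-", and a literal "-" becomes "\x2d". A minimal sketch of reversing that to recover the path, equivalent in spirit to `systemd-escape --unescape --path` but handling only the \xXX sequences seen here (drop the ".mount" suffix first in real use):

    import re

    def unescape_unit(name: str) -> str:
        # Unit "-" separators encode "/"; map those first, then decode the
        # \xXX hex escapes back to their literal bytes.
        path = name.replace("-", "/")
        return "/" + re.sub(r"\\x([0-9a-f]{2})",
                            lambda m: chr(int(m.group(1), 16)), path)

    print(unescape_unit(r"run-netns-qdhcp\x2d38c20446\x2d2361\x2d4928\x2da65c\x2d3acdf0e0c2fb"))
    # -> /run/netns/qdhcp-38c20446-2361-4928-a65c-3acdf0e0c2fb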
Feb 1 04:57:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ee58a1a126f1927908c191fa5b87f38dbc1048c4a70a75c25eb7805d9888b73/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:49 localhost podman[320326]: 2026-02-01 09:57:49.443501687 +0000 UTC m=+0.193724580 container init 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 04:57:49 localhost podman[320326]: 2026-02-01 09:57:49.454300739 +0000 UTC m=+0.204523632 container start 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:49 localhost dnsmasq[320350]: started, version 2.85 cachesize 150
Feb 1 04:57:49 localhost dnsmasq[320350]: DNS service limited to local subnets
Feb 1 04:57:49 localhost dnsmasq[320350]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:49 localhost dnsmasq[320350]: warning: no upstream servers configured
Feb 1 04:57:49 localhost dnsmasq-dhcp[320350]: DHCP, static leases only on 10.101.0.0, lease time 1d
Feb 1 04:57:49 localhost dnsmasq[320350]: read /var/lib/neutron/dhcp/b2c97c71-c24e-4ce0-97b0-09b52ce12d05/addn_hosts - 0 addresses
Feb 1 04:57:49 localhost dnsmasq-dhcp[320350]: read /var/lib/neutron/dhcp/b2c97c71-c24e-4ce0-97b0-09b52ce12d05/host
Feb 1 04:57:49 localhost dnsmasq-dhcp[320350]: read /var/lib/neutron/dhcp/b2c97c71-c24e-4ce0-97b0-09b52ce12d05/opts
Feb 1 04:57:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:49.585 259320 INFO neutron.agent.dhcp.agent [None req-9a201cc8-8ca1-4e86-a939-df8f48b16bd2 - - - - - -] DHCP configuration for ports {'fd1b2935-830f-4d71-8ef2-d7eadef90520'} is completed
Feb 1 04:57:50 localhost ovn_controller[152492]: 2026-02-01T09:57:50Z|00366|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:57:50 localhost nova_compute[274651]: 2026-02-01 09:57:50.101 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:50.148 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:48Z, description=, device_id=d1894ea0-9c72-4dde-9d88-81ac177a3e3b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=268b650e-797b-4fe3-8c47-893465976062, ip_allocation=immediate, mac_address=fa:16:3e:66:8c:f2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:29Z, description=, dns_domain=, id=206ee553-cd65-4f77-8129-196aa5aa2858, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-846400062, port_security_enabled=True, project_id=bdd313217db54b0aa18a483b1bae89ba, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=376, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2211, status=ACTIVE, subnets=['fb5b7d99-427d-430d-af99-4442f900c00b'], tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:31Z, vlan_transparent=None, network_id=206ee553-cd65-4f77-8129-196aa5aa2858, port_security_enabled=False, project_id=bdd313217db54b0aa18a483b1bae89ba, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2325, status=DOWN, tags=[], tenant_id=bdd313217db54b0aa18a483b1bae89ba, updated_at=2026-02-01T09:57:48Z on network 206ee553-cd65-4f77-8129-196aa5aa2858
Feb 1 04:57:50 localhost podman[320398]:
Feb 1 04:57:50 localhost podman[320398]: 2026-02-01 09:57:50.273907947 +0000 UTC m=+0.081682403 container create aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:50 localhost systemd[1]: Started libpod-conmon-aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc.scope.
Feb 1 04:57:50 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:50 localhost podman[320398]: 2026-02-01 09:57:50.221361531 +0000 UTC m=+0.029136047 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09413c59ce35c436791e45fa45e50381ad45e00b899590b7f083911710f2a913/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:50 localhost dnsmasq[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/addn_hosts - 1 addresses
Feb 1 04:57:50 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/host
Feb 1 04:57:50 localhost podman[320427]: 2026-02-01 09:57:50.328152825 +0000 UTC m=+0.046768958 container kill b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:50 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/opts
Feb 1 04:57:50 localhost podman[320398]: 2026-02-01 09:57:50.333107488 +0000 UTC m=+0.140881954 container init aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:50 localhost podman[320398]: 2026-02-01 09:57:50.339092102 +0000 UTC m=+0.146866558 container start aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:57:50 localhost dnsmasq[320446]: started, version 2.85 cachesize 150
Feb 1 04:57:50 localhost dnsmasq[320446]: DNS service limited to local subnets
Feb 1 04:57:50 localhost dnsmasq[320446]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:50 localhost dnsmasq[320446]: warning: no upstream servers configured
Feb 1 04:57:50 localhost dnsmasq-dhcp[320446]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:50 localhost dnsmasq[320446]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:50 localhost dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:50 localhost dnsmasq-dhcp[320446]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:50 localhost nova_compute[274651]: 2026-02-01 09:57:50.382 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e167 do_prune osdmap full prune enabled
Feb 1 04:57:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e168 e168: 6 total, 6 up, 6 in
Feb 1 04:57:50 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in
Feb 1 04:57:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:50.717 259320 INFO neutron.agent.dhcp.agent [None req-e855e815-4940-4e81-8660-53334153d0f1 - - - - - -] DHCP configuration for ports {'268b650e-797b-4fe3-8c47-893465976062', '390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:50.812 2 INFO neutron.agent.securitygroups_rpc [None req-c7462549-dc03-49f9-bdc5-6b707b180a08 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 1 04:57:51 localhost systemd[1]: tmp-crun.FFEtlV.mount: Deactivated successfully.
Feb 1 04:57:51 localhost nova_compute[274651]: 2026-02-01 09:57:51.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:51 localhost dnsmasq[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/addn_hosts - 0 addresses
Feb 1 04:57:51 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/host
Feb 1 04:57:51 localhost podman[320472]: 2026-02-01 09:57:51.355069861 +0000 UTC m=+0.042107386 container kill b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 1 04:57:51 localhost dnsmasq-dhcp[318665]: read /var/lib/neutron/dhcp/206ee553-cd65-4f77-8129-196aa5aa2858/opts
Feb 1 04:57:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e168 do_prune osdmap full prune enabled
Feb 1 04:57:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e169 e169: 6 total, 6 up, 6 in
Feb 1 04:57:51 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in
Feb 1 04:57:51 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:51.491 2 INFO neutron.agent.securitygroups_rpc [None req-f8e6f518-1a93-44d5-bed3-15bc3df0d353 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 1 04:57:51 localhost kernel: device tap2e5ad63a-c6 left promiscuous mode
Feb 1 04:57:51 localhost ovn_controller[152492]: 2026-02-01T09:57:51Z|00367|binding|INFO|Releasing lport 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c from this chassis (sb_readonly=0)
Feb 1 04:57:51 localhost ovn_controller[152492]: 2026-02-01T09:57:51Z|00368|binding|INFO|Setting lport 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c down in Southbound
Feb 1 04:57:51 localhost nova_compute[274651]: 2026-02-01 09:57:51.514 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.534 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-206ee553-cd65-4f77-8129-196aa5aa2858', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-206ee553-cd65-4f77-8129-196aa5aa2858', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1845ff11-b6de-4916-9124-41d0eb3eeed7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2e5ad63a-c625-47e9-9640-0e5b1fe7e73c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.536 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 2e5ad63a-c625-47e9-9640-0e5b1fe7e73c in datapath 206ee553-cd65-4f77-8129-196aa5aa2858 unbound from our chassis
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.541 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 206ee553-cd65-4f77-8129-196aa5aa2858, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.542 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[9a15cc4e-26ea-43fb-8733-8cdc4ab902e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:51 localhost nova_compute[274651]: 2026-02-01 09:57:51.544 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.762 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8:0:1:f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.764 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.767 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 6f7c1382-daf6-4dec-a5d4-3c7b7a8626eb IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.768 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:51 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:51.769 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[bf816c29-ef9a-4007-8127-39b3976cfd4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.350 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.351 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.351 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.351 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 04:57:52 localhost systemd[1]: tmp-crun.cjSFcm.mount: Deactivated successfully.
Feb 1 04:57:52 localhost dnsmasq[320446]: exiting on receipt of SIGTERM
Feb 1 04:57:52 localhost podman[320510]: 2026-02-01 09:57:52.49559309 +0000 UTC m=+0.057950604 container kill aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:52 localhost systemd[1]: libpod-aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc.scope: Deactivated successfully.
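[Annotation] The "_get_port_ips" DEBUG records above fail because the Port_Binding MAC column holds the literal string 'unknown' rather than the usual "MAC addr1 addr2 ..." form (compare the localport row earlier, mac=['fa:16:3e:29:3c:bc 2001:db8:0:1:f816:3eff:fe29:3cbc']). A sketch of that extraction under the same assumption about the column format; this is an illustration, not neutron's _get_port_ips:

    def port_ips(mac_column):
        # Each entry is "MAC addr1 addr2 ..." or the literal "unknown";
        # everything after the MAC token is an IP address.
        ips = []
        for entry in mac_column:
            parts = entry.split()
            if len(parts) > 1:   # "unknown" (or a bare MAC) yields nothing
                ips.extend(parts[1:])
        return ips

    print(port_ips(["unknown"]))                                             # []
    print(port_ips(["fa:16:3e:29:3c:bc 2001:db8:0:1:f816:3eff:fe29:3cbc"]))  # one IPv6 address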
Feb 1 04:57:52 localhost podman[320524]: 2026-02-01 09:57:52.557726161 +0000 UTC m=+0.050618918 container died aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:52 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:52.586 2 INFO neutron.agent.securitygroups_rpc [None req-78a64591-c841-4e75-af76-a7af0cedc758 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 1 04:57:52 localhost podman[320524]: 2026-02-01 09:57:52.594310326 +0000 UTC m=+0.087203063 container cleanup aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:57:52 localhost systemd[1]: libpod-conmon-aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc.scope: Deactivated successfully.
Feb 1 04:57:52 localhost podman[320531]: 2026-02-01 09:57:52.612006931 +0000 UTC m=+0.088246466 container remove aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:52 localhost dnsmasq[320350]: exiting on receipt of SIGTERM
Feb 1 04:57:52 localhost podman[320576]: 2026-02-01 09:57:52.761710575 +0000 UTC m=+0.047125291 container kill 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:57:52 localhost systemd[1]: libpod-0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1.scope: Deactivated successfully.
Feb 1 04:57:52 localhost podman[320592]: 2026-02-01 09:57:52.806052499 +0000 UTC m=+0.036808153 container died 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 1 04:57:52 localhost podman[320592]: 2026-02-01 09:57:52.851163216 +0000 UTC m=+0.081918860 container cleanup 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:57:52 localhost systemd[1]: libpod-conmon-0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1.scope: Deactivated successfully.
Feb 1 04:57:52 localhost podman[320599]: 2026-02-01 09:57:52.900873236 +0000 UTC m=+0.122457448 container remove 0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2c97c71-c24e-4ce0-97b0-09b52ce12d05, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.943 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:52 localhost ovn_controller[152492]: 2026-02-01T09:57:52Z|00369|binding|INFO|Releasing lport 929c16d7-415f-480f-bb46-d257660067a0 from this chassis (sb_readonly=0)
Feb 1 04:57:52 localhost kernel: device tap929c16d7-41 left promiscuous mode
Feb 1 04:57:52 localhost ovn_controller[152492]: 2026-02-01T09:57:52Z|00370|binding|INFO|Setting lport 929c16d7-415f-480f-bb46-d257660067a0 down in Southbound
Feb 1 04:57:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:52.954 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-b2c97c71-c24e-4ce0-97b0-09b52ce12d05', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2c97c71-c24e-4ce0-97b0-09b52ce12d05', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ea8b9bc-1f0f-4e8e-806a-72eed665b7b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=929c16d7-415f-480f-bb46-d257660067a0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:52.956 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 929c16d7-415f-480f-bb46-d257660067a0 in datapath b2c97c71-c24e-4ce0-97b0-09b52ce12d05 unbound from our chassis
Feb 1 04:57:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:52.960 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2c97c71-c24e-4ce0-97b0-09b52ce12d05, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:52.961 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[408dae8c-58d5-4613-bfd3-b873c4fbe62b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:52 localhost nova_compute[274651]: 2026-02-01 09:57:52.963 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:52.982 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:53 localhost podman[320663]:
Feb 1 04:57:53 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:53.409 2 INFO neutron.agent.securitygroups_rpc [None req-ff2643e0-e24a-4f9b-a883-5340b9397f69 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 1 04:57:53 localhost podman[320663]: 2026-02-01 09:57:53.416383971 +0000 UTC m=+0.096576432 container create 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 1 04:57:53 localhost systemd[1]: Started libpod-conmon-0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b.scope.
Feb 1 04:57:53 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23dc6480c68f50e9a114e1e1cd40ee270535ca9884e8a87982dcc32d82643af7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:53 localhost podman[320663]: 2026-02-01 09:57:53.367057814 +0000 UTC m=+0.047250305 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:53 localhost podman[320663]: 2026-02-01 09:57:53.472493437 +0000 UTC m=+0.152685908 container init 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 1 04:57:53 localhost podman[320663]: 2026-02-01 09:57:53.481171284 +0000 UTC m=+0.161363755 container start 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:57:53 localhost dnsmasq[320681]: started, version 2.85 cachesize 150
Feb 1 04:57:53 localhost dnsmasq[320681]: DNS service limited to local subnets
Feb 1 04:57:53 localhost dnsmasq[320681]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:53 localhost dnsmasq[320681]: warning: no upstream servers configured
Feb 1 04:57:53 localhost dnsmasq-dhcp[320681]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:53 localhost dnsmasq[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:53 localhost dnsmasq-dhcp[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:53 localhost dnsmasq-dhcp[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:53 localhost systemd[1]: tmp-crun.z6U0jH.mount: Deactivated successfully.
Feb 1 04:57:53 localhost systemd[1]: var-lib-containers-storage-overlay-09413c59ce35c436791e45fa45e50381ad45e00b899590b7f083911710f2a913-merged.mount: Deactivated successfully.
Feb 1 04:57:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa9847cc8294db2ce37880e8643594a3f7f4e87fd9bcb186e9b0ddcffc223ecc-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:53 localhost systemd[1]: var-lib-containers-storage-overlay-9ee58a1a126f1927908c191fa5b87f38dbc1048c4a70a75c25eb7805d9888b73-merged.mount: Deactivated successfully.
Feb 1 04:57:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0984b047fac2e8ec4d53aa7e96bbcca27851a751486e55324331ce23d01492b1-userdata-shm.mount: Deactivated successfully.
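[Annotation] The recurring kernel notice "supports timestamps until 2038 (0x7fffffff)" refers to the classic 32-bit time_t limit: 0x7fffffff seconds after the Unix epoch is the last second a signed 32-bit counter can represent. A quick check:

    from datetime import datetime, timezone

    # 0x7fffffff == 2147483647, the largest signed 32-bit second count.
    print(datetime.fromtimestamp(0x7FFFFFFF, tz=timezone.utc))
    # -> 2038-01-19 03:14:07+00:00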
Feb 1 04:57:53 localhost systemd[1]: run-netns-qdhcp\x2db2c97c71\x2dc24e\x2d4ce0\x2d97b0\x2d09b52ce12d05.mount: Deactivated successfully.
Feb 1 04:57:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:53.557 259320 INFO neutron.agent.dhcp.agent [None req-29a04fb0-e8de-4d41-884a-5fc2736ed47c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:52Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a4cf3f5b-b0bc-4477-8a13-d2f0ea3fe535, ip_allocation=immediate, mac_address=fa:16:3e:0d:f8:7e, name=tempest-NetworksTestDHCPv6-1814718948, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['d3d4b138-5e19-4fed-8ae2-bf17688139a3', 'ef9adb12-6d3b-4f50-be06-8a55011c537d'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:49Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2338, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:52Z on network cba39058-6a05-4f77-add1-57334b728a66
Feb 1 04:57:53 localhost nova_compute[274651]: 2026-02-01 09:57:53.581 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 04:57:53 localhost nova_compute[274651]: 2026-02-01 09:57:53.595 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 04:57:53 localhost nova_compute[274651]: 2026-02-01 09:57:53.595 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 04:57:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e169 do_prune osdmap full prune enabled
Feb 1 04:57:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e170 e170: 6 total, 6 up, 6 in
Feb 1 04:57:53 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e170: 6 total, 6 up, 6 in
Feb 1 04:57:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:53.755 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:53 localhost systemd[1]: tmp-crun.h4Tnot.mount: Deactivated successfully.
Feb 1 04:57:53 localhost dnsmasq[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses
Feb 1 04:57:53 localhost dnsmasq-dhcp[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:53 localhost dnsmasq-dhcp[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:53 localhost podman[320700]: 2026-02-01 09:57:53.785120302 +0000 UTC m=+0.087655477 container kill 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 04:57:53 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:53.794 259320 INFO neutron.agent.dhcp.agent [None req-18f89026-7898-41ce-93ce-03ebf0aa2723 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:53 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:53.819 2 INFO neutron.agent.securitygroups_rpc [None req-865e7db1-f37b-4e27-b7e7-fae9537a70ac 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 1 04:57:53 localhost podman[236886]: time="2026-02-01T09:57:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:57:53 localhost podman[236886]: @ - - [01/Feb/2026:09:57:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 163826 "" "Go-http-client/1.1"
Feb 1 04:57:54 localhost podman[236886]: @ - - [01/Feb/2026:09:57:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20724 "" "Go-http-client/1.1"
Feb 1 04:57:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:54.027 259320 INFO neutron.agent.dhcp.agent [None req-4d5a40c4-e2f9-46a5-a8d0-14f9d9c1fb3c - - - - - -] DHCP configuration for ports {'a4cf3f5b-b0bc-4477-8a13-d2f0ea3fe535'} is completed
Feb 1 04:57:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:54.074 259320 INFO neutron.agent.linux.ip_lib [None req-76126f43-e706-4aba-8800-c2b0ddf9a2e0 - - - - - -] Device tap7d1173c0-b7 cannot be used as it has no MAC address
Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.143 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:54 localhost kernel: device tap7d1173c0-b7 entered promiscuous mode
Feb 1 04:57:54 localhost NetworkManager[5964]: [1769939874.1510] manager: (tap7d1173c0-b7): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Feb 1 04:57:54 localhost ovn_controller[152492]: 2026-02-01T09:57:54Z|00371|binding|INFO|Claiming lport 7d1173c0-b73c-486a-901d-8b7c17d9c38f for this chassis.
Feb 1 04:57:54 localhost ovn_controller[152492]: 2026-02-01T09:57:54Z|00372|binding|INFO|7d1173c0-b73c-486a-901d-8b7c17d9c38f: Claiming unknown
Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.152 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:54 localhost systemd-udevd[320731]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:57:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:54.164 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1ade4bed-9167-40b7-80f8-11d83d247043', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ade4bed-9167-40b7-80f8-11d83d247043', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '674a59d5810c453484339f60db55c64e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c226c59-57e1-4cf2-89f6-50962cc638b4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7d1173c0-b73c-486a-901d-8b7c17d9c38f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:54 localhost ovn_controller[152492]: 2026-02-01T09:57:54Z|00373|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:57:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:54.166 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7d1173c0-b73c-486a-901d-8b7c17d9c38f in datapath 1ade4bed-9167-40b7-80f8-11d83d247043 bound to our chassis
Feb 1 04:57:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:54.168 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1ade4bed-9167-40b7-80f8-11d83d247043 or it
has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:57:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:54.169 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[9b379194-fd45-4fef-b6bc-02c79bcbb623]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.192 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.194 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:54.222 2 INFO neutron.agent.securitygroups_rpc [None req-ba46717f-c9e1-458a-88f8-c050502ffc34 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:54 localhost journal[217584]: ethtool ioctl error on tap7d1173c0-b7: No such device Feb 1 04:57:54 localhost ovn_controller[152492]: 2026-02-01T09:57:54Z|00374|binding|INFO|Setting lport 7d1173c0-b73c-486a-901d-8b7c17d9c38f ovn-installed in OVS Feb 1 04:57:54 localhost ovn_controller[152492]: 2026-02-01T09:57:54Z|00375|binding|INFO|Setting lport 7d1173c0-b73c-486a-901d-8b7c17d9c38f up in Southbound Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.232 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.237 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:54 localhost nova_compute[274651]: 2026-02-01 09:57:54.264 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:57:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:57:54 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3300270245' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:57:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:54.584 2 INFO neutron.agent.securitygroups_rpc [None req-6fa3fe0c-2f0b-4fce-bb11-9a1bc41c0c58 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:57:54 localhost dnsmasq[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:57:54 localhost dnsmasq-dhcp[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:57:54 localhost dnsmasq-dhcp[320681]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:57:54 localhost podman[320797]: 2026-02-01 09:57:54.802212576 +0000 UTC m=+0.058667586 container kill 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:57:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:57:54 localhost podman[320809]: 2026-02-01 09:57:54.914102316 +0000 UTC m=+0.090661149 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:57:55 localhost podman[320809]: 2026-02-01 09:57:55.001305579 +0000 UTC m=+0.177864402 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:57:55 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:55.005 2 INFO neutron.agent.securitygroups_rpc [None req-021b4e45-6986-46a7-9869-b4d11b35b6ad 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:57:55 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 04:57:55 localhost podman[320854]: Feb 1 04:57:55 localhost podman[320854]: 2026-02-01 09:57:55.072844619 +0000 UTC m=+0.088235904 container create 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:57:55 localhost systemd[1]: Started libpod-conmon-68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8.scope. Feb 1 04:57:55 localhost systemd[1]: Started libcrun container. 
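The ceilometer_agent_compute entries above come from a transient systemd unit wrapping `podman healthcheck run <container-id>`; health_status=healthy reflects the exit status of the test command in config_data ('/openstack/healthcheck compute'). A minimal sketch of running the same probe by hand from Python, assuming only that the podman CLI seen in the log is on PATH:

    import subprocess

    # Container ID copied from the health_status/exec_died entries above.
    cid = "2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691"
    # `podman healthcheck run` exits 0 when the container's test command passes.
    rc = subprocess.run(["podman", "healthcheck", "run", cid]).returncode
    print("healthy" if rc == 0 else f"unhealthy (rc={rc})")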
Feb 1 04:57:55 localhost podman[320854]: 2026-02-01 09:57:55.027724771 +0000 UTC m=+0.043116106 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50c082f3c4c233ec7aadfe3151c01eec77d689c77830c0ed26de887dc3563c75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:55 localhost podman[320854]: 2026-02-01 09:57:55.138437167 +0000 UTC m=+0.153828422 container init 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:55 localhost podman[320854]: 2026-02-01 09:57:55.147618449 +0000 UTC m=+0.163009704 container start 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:55 localhost dnsmasq[320887]: started, version 2.85 cachesize 150
Feb 1 04:57:55 localhost dnsmasq[320887]: DNS service limited to local subnets
Feb 1 04:57:55 localhost dnsmasq[320887]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:55 localhost dnsmasq[320887]: warning: no upstream servers configured
Feb 1 04:57:55 localhost dnsmasq-dhcp[320887]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:55 localhost dnsmasq[320887]: read /var/lib/neutron/dhcp/1ade4bed-9167-40b7-80f8-11d83d247043/addn_hosts - 0 addresses
Feb 1 04:57:55 localhost dnsmasq-dhcp[320887]: read /var/lib/neutron/dhcp/1ade4bed-9167-40b7-80f8-11d83d247043/host
Feb 1 04:57:55 localhost dnsmasq-dhcp[320887]: read /var/lib/neutron/dhcp/1ade4bed-9167-40b7-80f8-11d83d247043/opts
Feb 1 04:57:55 localhost dnsmasq[319084]: exiting on receipt of SIGTERM
Feb 1 04:57:55 localhost podman[320891]: 2026-02-01 09:57:55.242877989 +0000 UTC m=+0.048945537 container kill f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aa1930be-537d-42cd-9add-4ed7ae12f537, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:57:55 localhost systemd[1]: libpod-f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225.scope: Deactivated successfully.
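The dnsmasq "read .../addn_hosts - N addresses" lines above track the per-network host files Neutron writes under /var/lib/neutron/dhcp/<network-id>/. A hedged sketch of reproducing that count, assuming addn_hosts uses the usual /etc/hosts-style one-entry-per-line layout (the path layout is taken from the log; the exact file format is dnsmasq's and is not guaranteed here):

    from pathlib import Path

    def count_addn_hosts(network_id: str) -> int:
        # Path layout copied from the dnsmasq log lines above.
        path = Path("/var/lib/neutron/dhcp") / network_id / "addn_hosts"
        lines = path.read_text().splitlines()
        # Count non-blank, non-comment lines; approximates dnsmasq's address count.
        return sum(1 for l in lines if l.strip() and not l.lstrip().startswith("#"))

    print(count_addn_hosts("cba39058-6a05-4f77-add1-57334b728a66"))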
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 04:57:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:55.280 259320 INFO neutron.agent.dhcp.agent [None req-d22eec19-2ec5-4ebf-be52-890c0a5bb7b5 - - - - - -] DHCP configuration for ports {'e548ed59-1d51-4c3e-832a-c259547cfd8e'} is completed
Feb 1 04:57:55 localhost podman[320907]: 2026-02-01 09:57:55.302321457 +0000 UTC m=+0.037979718 container died f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aa1930be-537d-42cd-9add-4ed7ae12f537, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:55 localhost podman[320907]: 2026-02-01 09:57:55.348648023 +0000 UTC m=+0.084306274 container remove f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-aa1930be-537d-42cd-9add-4ed7ae12f537, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.392 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:55 localhost ovn_controller[152492]: 2026-02-01T09:57:55Z|00376|binding|INFO|Releasing lport b1198584-6cdd-4525-82cb-83748a05c365 from this chassis (sb_readonly=0)
Feb 1 04:57:55 localhost kernel: device tapb1198584-6c left promiscuous mode
Feb 1 04:57:55 localhost ovn_controller[152492]: 2026-02-01T09:57:55Z|00377|binding|INFO|Setting lport b1198584-6cdd-4525-82cb-83748a05c365 down in Southbound
Feb 1 04:57:55 localhost systemd[1]: libpod-conmon-f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225.scope: Deactivated successfully.
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.408 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-aa1930be-537d-42cd-9add-4ed7ae12f537', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-aa1930be-537d-42cd-9add-4ed7ae12f537', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bdd313217db54b0aa18a483b1bae89ba', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a0163ab-36a7-43cd-9755-2c82aefcb8b7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b1198584-6cdd-4525-82cb-83748a05c365) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.410 158365 INFO neutron.agent.ovn.metadata.agent [-] Port b1198584-6cdd-4525-82cb-83748a05c365 in datapath aa1930be-537d-42cd-9add-4ed7ae12f537 unbound from our chassis
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.410 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.413 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network aa1930be-537d-42cd-9add-4ed7ae12f537, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.414 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ea62c99b-d1b7-411b-a9ff-6c46a7eac262]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:55 localhost dnsmasq[320887]: exiting on receipt of SIGTERM
Feb 1 04:57:55 localhost podman[320963]: 2026-02-01 09:57:55.534927082 +0000 UTC m=+0.058371127 container kill 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 04:57:55 localhost systemd[1]: libpod-68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8.scope: Deactivated successfully.
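The Port_Binding rows the metadata agent matches above carry Neutron's port metadata in OVN external_ids under a "neutron:" prefix (cidrs, device_owner, network_name, and so on). A minimal sketch of pulling those fields out of such a row, using a plain dict with values copied from the event above rather than the real ovsdbapp row object:

    external_ids = {
        "neutron:cidrs": "10.100.0.2/28",
        "neutron:device_owner": "network:dhcp",
        "neutron:network_name": "neutron-aa1930be-537d-42cd-9add-4ed7ae12f537",
        "neutron:host_id": "np0005604212.localdomain",
    }
    # Strip the "neutron:" prefix to recover the plain attribute names.
    neutron_meta = {k.split(":", 1)[1]: v
                    for k, v in external_ids.items() if k.startswith("neutron:")}
    print(neutron_meta["network_name"])  # neutron-aa1930be-537d-42cd-9add-4ed7ae12f537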
Feb 1 04:57:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:55.571 259320 INFO neutron.agent.dhcp.agent [None req-cf718e59-12fc-4644-93ba-93913b70e7a7 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:55 localhost dnsmasq[318836]: exiting on receipt of SIGTERM
Feb 1 04:57:55 localhost podman[320980]: 2026-02-01 09:57:55.573593421 +0000 UTC m=+0.060196873 container kill 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:57:55 localhost systemd[1]: libpod-25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b.scope: Deactivated successfully.
Feb 1 04:57:55 localhost podman[320991]: 2026-02-01 09:57:55.595633049 +0000 UTC m=+0.050115683 container died 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:55 localhost podman[321020]: 2026-02-01 09:57:55.648854976 +0000 UTC m=+0.050480684 container died 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:55 localhost podman[321020]: 2026-02-01 09:57:55.688375641 +0000 UTC m=+0.090001329 container remove 25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1dd45b6e-c7d4-4daf-9d1e-f7981179cd48, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:57:55 localhost systemd[1]: libpod-conmon-25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b.scope: Deactivated successfully.
Feb 1 04:57:55 localhost podman[320991]: 2026-02-01 09:57:55.727968969 +0000 UTC m=+0.182451553 container cleanup 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 04:57:55 localhost systemd[1]: libpod-conmon-68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8.scope: Deactivated successfully.
Feb 1 04:57:55 localhost podman[320999]: 2026-02-01 09:57:55.75142902 +0000 UTC m=+0.192812190 container remove 68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.762 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:55 localhost kernel: device tap7d1173c0-b7 left promiscuous mode
Feb 1 04:57:55 localhost ovn_controller[152492]: 2026-02-01T09:57:55Z|00378|binding|INFO|Releasing lport 7d1173c0-b73c-486a-901d-8b7c17d9c38f from this chassis (sb_readonly=0)
Feb 1 04:57:55 localhost ovn_controller[152492]: 2026-02-01T09:57:55Z|00379|binding|INFO|Setting lport 7d1173c0-b73c-486a-901d-8b7c17d9c38f down in Southbound
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.771 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1ade4bed-9167-40b7-80f8-11d83d247043', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1ade4bed-9167-40b7-80f8-11d83d247043', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '674a59d5810c453484339f60db55c64e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5c226c59-57e1-4cf2-89f6-50962cc638b4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7d1173c0-b73c-486a-901d-8b7c17d9c38f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.773 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7d1173c0-b73c-486a-901d-8b7c17d9c38f in datapath 1ade4bed-9167-40b7-80f8-11d83d247043 unbound from our chassis
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.774 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1ade4bed-9167-40b7-80f8-11d83d247043 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:57:55 localhost ovn_metadata_agent[158360]: 2026-02-01 09:57:55.775 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[536ce3f0-a0e5-4cab-a3f4-f9fbd8661ef3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:57:55 localhost nova_compute[274651]: 2026-02-01 09:57:55.784 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay-50c082f3c4c233ec7aadfe3151c01eec77d689c77830c0ed26de887dc3563c75-merged.mount: Deactivated successfully.
Feb 1 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68ebf9150d9396f2daf9608cc65c607b4068d5d94f869e717c27f76cf87e4dc8-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay-c229a2e839d790424f080d319544b54e123dbed478734c64ddc837e6deb88e18-merged.mount: Deactivated successfully.
Feb 1 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f712c6aa083892e4725086f40397b11384a94e36aedd015413132cbadc4c4225-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:55 localhost systemd[1]: run-netns-qdhcp\x2daa1930be\x2d537d\x2d42cd\x2d9add\x2d4ed7ae12f537.mount: Deactivated successfully.
Feb 1 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay-316d1968581b430d9df905c860c1fa54c3ac33696fc592c9950b148ea6fddc0b-merged.mount: Deactivated successfully.
Feb 1 04:57:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25175f22f4e6f59ba017eeff3afd854ef71427e4a97dad847d78f6abcd874b6b-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:56 localhost systemd[1]: run-netns-qdhcp\x2d1dd45b6e\x2dc7d4\x2d4daf\x2d9d1e\x2df7981179cd48.mount: Deactivated successfully.
Feb 1 04:57:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:56.007 259320 INFO neutron.agent.dhcp.agent [None req-5f2c30db-2e39-46d7-9868-5e2894015805 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:56 localhost ovn_controller[152492]: 2026-02-01T09:57:56Z|00380|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:57:56 localhost systemd[1]: run-netns-qdhcp\x2d1ade4bed\x2d9167\x2d40b7\x2d80f8\x2d11d83d247043.mount: Deactivated successfully.
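The mount-unit names above encode the '-' characters of network UUIDs as \x2d, which is systemd's unit-name escaping. A small sketch for decoding such names back to the underlying path (equivalent to `systemd-escape --unescape`; the regex handles only the \xNN form seen in these lines):

    import re

    def systemd_unescape(name: str) -> str:
        # Turn every \xNN escape back into its character.
        return re.sub(r"\\x([0-9a-fA-F]{2})",
                      lambda m: chr(int(m.group(1), 16)), name)

    unit = r"run-netns-qdhcp\x2d1ade4bed\x2d9167\x2d40b7\x2d80f8\x2d11d83d247043.mount"
    print(systemd_unescape(unit))
    # run-netns-qdhcp-1ade4bed-9167-40b7-80f8-11d83d247043.mount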
Feb 1 04:57:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:56.080 259320 INFO neutron.agent.dhcp.agent [None req-c4680cf3-7737-4f85-bcbc-0cf2478b87a6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:56.081 259320 INFO neutron.agent.dhcp.agent [None req-c4680cf3-7737-4f85-bcbc-0cf2478b87a6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:56.082 259320 INFO neutron.agent.dhcp.agent [None req-c4680cf3-7737-4f85-bcbc-0cf2478b87a6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:56 localhost nova_compute[274651]: 2026-02-01 09:57:56.112 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:57:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:56.169 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:56 localhost neutron_sriov_agent[252126]: 2026-02-01 09:57:56.229 2 INFO neutron.agent.securitygroups_rpc [None req-bf304f40-e466-4d37-a7c0-f4cca9d82926 c808dfb9cb284e60ac814aa25eae5d58 3e1ea1a33e554968ba8ebaf6753c9c5d - - default default] Security group member updated ['7af9328f-e889-4487-9888-9c5f8b1745d9']
Feb 1 04:57:56 localhost nova_compute[274651]: 2026-02-01 09:57:56.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:56 localhost podman[321068]: 2026-02-01 09:57:56.286080685 +0000 UTC m=+0.050244247 container kill 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 1 04:57:56 localhost dnsmasq[320681]: exiting on receipt of SIGTERM
Feb 1 04:57:56 localhost systemd[1]: libpod-0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b.scope: Deactivated successfully.
Feb 1 04:57:56 localhost podman[321081]: 2026-02-01 09:57:56.360337339 +0000 UTC m=+0.061235314 container died 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:57:56 localhost podman[321081]: 2026-02-01 09:57:56.389110124 +0000 UTC m=+0.090008039 container cleanup 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:56 localhost systemd[1]: libpod-conmon-0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b.scope: Deactivated successfully.
Feb 1 04:57:56 localhost podman[321083]: 2026-02-01 09:57:56.443396693 +0000 UTC m=+0.135643142 container remove 0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 1 04:57:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:57:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e170 do_prune osdmap full prune enabled
Feb 1 04:57:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e171 e171: 6 total, 6 up, 6 in
Feb 1 04:57:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in
Feb 1 04:57:56 localhost systemd[1]: var-lib-containers-storage-overlay-23dc6480c68f50e9a114e1e1cd40ee270535ca9884e8a87982dcc32d82643af7-merged.mount: Deactivated successfully.
Feb 1 04:57:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ab2b927448b405d86c6fb59d73e72c36300f3488f6808ac2afe98694b9c104b-userdata-shm.mount: Deactivated successfully.
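The ceph-mon lines above summarize each new osdmap epoch as "eNNN: X total, Y up, Z in". A small sketch for scraping those counters out of a captured log stream (regex written against the exact shape of the lines above):

    import re

    line = "mon.np0005604212@0(leader).osd e171 e171: 6 total, 6 up, 6 in"
    m = re.search(r"e(\d+): (\d+) total, (\d+) up, (\d+) in", line)
    epoch, total, up, osds_in = map(int, m.groups())
    # All OSDs up and in at this epoch: the map churn here is administrative,
    # not an OSD failure.
    assert (total, up, osds_in) == (6, 6, 6)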
Feb 1 04:57:56 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:56.899 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:57 localhost nova_compute[274651]: 2026-02-01 09:57:57.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:57 localhost podman[321161]:
Feb 1 04:57:57 localhost podman[321161]: 2026-02-01 09:57:57.398164409 +0000 UTC m=+0.093895058 container create cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 1 04:57:57 localhost systemd[1]: Started libpod-conmon-cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9.scope.
Feb 1 04:57:57 localhost podman[321161]: 2026-02-01 09:57:57.350979808 +0000 UTC m=+0.046710447 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:57 localhost systemd[1]: tmp-crun.fgDGGw.mount: Deactivated successfully.
Feb 1 04:57:57 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/051918edecd5449a6e38e088c20a7410199210141805246be3210d0f3d089b85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:57 localhost podman[321161]: 2026-02-01 09:57:57.488028703 +0000 UTC m=+0.183759342 container init cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 1 04:57:57 localhost podman[321161]: 2026-02-01 09:57:57.497253198 +0000 UTC m=+0.192983817 container start cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:57:57 localhost dnsmasq[321179]: started, version 2.85 cachesize 150
Feb 1 04:57:57 localhost dnsmasq[321179]: DNS service limited to local subnets
Feb 1 04:57:57 localhost dnsmasq[321179]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:57 localhost dnsmasq[321179]: warning: no upstream servers configured
Feb 1 04:57:57 localhost dnsmasq[321179]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:57.713 259320 INFO neutron.agent.dhcp.agent [None req-e54e9895-8dd4-4af4-9aea-936db4ddf48e - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:57.726 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:57:57 localhost dnsmasq[321179]: exiting on receipt of SIGTERM
Feb 1 04:57:57 localhost podman[321197]: 2026-02-01 09:57:57.806103407 +0000 UTC m=+0.061619286 container kill cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:57 localhost systemd[1]: libpod-cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9.scope: Deactivated successfully.
Feb 1 04:57:57 localhost podman[321212]: 2026-02-01 09:57:57.879798753 +0000 UTC m=+0.057042215 container died cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:57:57 localhost podman[321212]: 2026-02-01 09:57:57.914817751 +0000 UTC m=+0.092061163 container cleanup cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:57:57 localhost systemd[1]: libpod-conmon-cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9.scope: Deactivated successfully.
Feb 1 04:57:57 localhost podman[321213]: 2026-02-01 09:57:57.96551357 +0000 UTC m=+0.130827875 container remove cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:57:58 localhost systemd[1]: var-lib-containers-storage-overlay-051918edecd5449a6e38e088c20a7410199210141805246be3210d0f3d089b85-merged.mount: Deactivated successfully.
Feb 1 04:57:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cda888792e317b4fe0e308c357120087d59c5dc72095f5887df7d4af5e5e73f9-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.291 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.292 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.292 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.293 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.293 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:57:59 localhost podman[321298]:
Feb 1 04:57:59 localhost podman[321298]: 2026-02-01 09:57:59.4890496 +0000 UTC m=+0.080127986 container create c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 1 04:57:59 localhost systemd[1]: Started libpod-conmon-c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0.scope.
Feb 1 04:57:59 localhost systemd[1]: Started libcrun container.
Feb 1 04:57:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87700eab5c58c8b32d26ca85c7c686ae520a05c22e965ef08793bd6d95c468d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:57:59 localhost podman[321298]: 2026-02-01 09:57:59.549299202 +0000 UTC m=+0.140377568 container init c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:57:59 localhost podman[321298]: 2026-02-01 09:57:59.450578166 +0000 UTC m=+0.041656512 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:57:59 localhost podman[321298]: 2026-02-01 09:57:59.556069771 +0000 UTC m=+0.147148137 container start c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:59 localhost dnsmasq[321357]: started, version 2.85 cachesize 150
Feb 1 04:57:59 localhost dnsmasq[321357]: DNS service limited to local subnets
Feb 1 04:57:59 localhost dnsmasq[321357]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:57:59 localhost dnsmasq[321357]: warning: no upstream servers configured
Feb 1 04:57:59 localhost dnsmasq-dhcp[321357]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:57:59 localhost dnsmasq[321357]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses
Feb 1 04:57:59 localhost dnsmasq-dhcp[321357]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host
Feb 1 04:57:59 localhost dnsmasq-dhcp[321357]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts
Feb 1 04:57:59 localhost dnsmasq[318665]: exiting on receipt of SIGTERM
Feb 1 04:57:59 localhost podman[321340]: 2026-02-01 09:57:59.581367199 +0000 UTC m=+0.073727049 container kill b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:57:59 localhost systemd[1]: libpod-b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31.scope: Deactivated successfully.
Feb 1 04:57:59 localhost podman[321362]: 2026-02-01 09:57:59.634351358 +0000 UTC m=+0.033841632 container died b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:59 localhost podman[321362]: 2026-02-01 09:57:59.665309161 +0000 UTC m=+0.064799455 container remove b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-206ee553-cd65-4f77-8129-196aa5aa2858, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:57:59 localhost systemd[1]: libpod-conmon-b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31.scope: Deactivated successfully.
Feb 1 04:57:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:57:59 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1532095624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:57:59 localhost systemd[1]: var-lib-containers-storage-overlay-4974c31cf4b950157ff5f0f1f244fbde87227d444e5094de9dfb7675012ed750-merged.mount: Deactivated successfully.
Feb 1 04:57:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b52c0faaa81a77a07ccdb972eab9d0bb3c0d8ab45af6cffd1f675d53b6ef9c31-userdata-shm.mount: Deactivated successfully.
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.818 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:57:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:59.869 259320 INFO neutron.agent.dhcp.agent [None req-4ccef589-ded8-4921-90f2-ab91de3d1b36 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.897 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:57:59 localhost nova_compute[274651]: 2026-02-01 09:57:59.898 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:57:59 localhost dnsmasq[321357]: exiting on receipt of SIGTERM
Feb 1 04:57:59 localhost podman[321407]: 2026-02-01 09:57:59.918498008 +0000 UTC m=+0.071928184 container kill c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:57:59 localhost systemd[1]: tmp-crun.Utv69t.mount: Deactivated successfully.
Feb 1 04:57:59 localhost systemd[1]: libpod-c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0.scope: Deactivated successfully.
Feb 1 04:57:59 localhost systemd[1]: run-netns-qdhcp\x2d206ee553\x2dcd65\x2d4f77\x2d8129\x2d196aa5aa2858.mount: Deactivated successfully.
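Nova's resource tracker above shells out to `ceph df --format=json` (via oslo_concurrency.processutils) to audit Ceph-backed disk capacity, and the mon audit log shows the matching "prefix": "df" command arriving. A hedged sketch of the same probe with the Python stdlib; the command line is copied from the log, but the stats field names are assumptions and can differ across Ceph releases:

    import json
    import subprocess

    # Exact command line copied from the nova_compute DEBUG entry above.
    cmd = ["ceph", "df", "--format=json", "--id", "openstack",
           "--conf", "/etc/ceph/ceph.conf"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    # `ceph df -f json` reports cluster-wide totals under "stats" (assumed keys).
    stats = json.loads(out).get("stats", {})
    print("total_bytes:", stats.get("total_bytes"),
          "avail:", stats.get("total_avail_bytes"))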
Feb 1 04:57:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:57:59.965 259320 INFO neutron.agent.dhcp.agent [None req-65d260bd-770c-4116-bae1-90c2a32fdfd5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:57:59 localhost podman[321421]: 2026-02-01 09:57:59.995251708 +0000 UTC m=+0.055850338 container died c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:00 localhost podman[321421]: 2026-02-01 09:58:00.031669649 +0000 UTC m=+0.092268279 container remove c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:58:00 localhost systemd[1]: libpod-conmon-c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0.scope: Deactivated successfully. Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.088 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.090 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11252MB free_disk=41.77421951293945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.090 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.091 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:58:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:00.171 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.206 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively 
managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.207 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.207 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.242 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.434 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:00.621 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 04:58:00 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/783780452' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.696 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.703 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.730 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.733 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 04:58:00 localhost nova_compute[274651]: 2026-02-01 09:58:00.734 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:58:00 localhost systemd[1]: var-lib-containers-storage-overlay-87700eab5c58c8b32d26ca85c7c686ae520a05c22e965ef08793bd6d95c468d0-merged.mount: Deactivated successfully. Feb 1 04:58:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c48a3ec1ebfd9341e934db0bdb1bc970e53dba2c5a98a8a90b3b0e7562e0d1e0-userdata-shm.mount: Deactivated successfully. 
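[Annotation] The resource-tracker cycle above ends with placement inventory reported as unchanged and the "compute_resources" lock held for 0.643s. Placement derives usable capacity from such a payload as capacity = (total - reserved) * allocation_ratio; a sketch reproducing the logged figures:

    # Inventory exactly as logged by nova.scheduler.client.report above.
    inventory = {
        "VCPU":      {"total": 8,     "reserved": 0,   "allocation_ratio": 16.0},
        "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 41,    "reserved": 1,   "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: capacity={capacity:g}")
    # VCPU: capacity=128        (8 vCPUs oversubscribed 16x)
    # MEMORY_MB: capacity=15226 (phys_ram 15738MB minus 512MB reserved)
    # DISK_GB: capacity=40

This is consistent with the "Final resource view" entry: 1 of 8 vCPUs and 1024MB of 15738MB allocated, well inside the derived capacity.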
Feb 1 04:58:01 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:01.242 2 INFO neutron.agent.securitygroups_rpc [None req-9c92fd3b-2244-4a02-b891-f277532d3dc4 c808dfb9cb284e60ac814aa25eae5d58 3e1ea1a33e554968ba8ebaf6753c9c5d - - default default] Security group member updated ['7af9328f-e889-4487-9888-9c5f8b1745d9']#033[00m Feb 1 04:58:01 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:01.412 259320 INFO neutron.agent.linux.ip_lib [None req-88d498e4-45fd-412b-bb72-9a4fd495998c - - - - - -] Device tap1addee7e-ee cannot be used as it has no MAC address#033[00m Feb 1 04:58:01 localhost nova_compute[274651]: 2026-02-01 09:58:01.449 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:01 localhost kernel: device tap1addee7e-ee entered promiscuous mode Feb 1 04:58:01 localhost NetworkManager[5964]: [1769939881.4566] manager: (tap1addee7e-ee): new Generic device (/org/freedesktop/NetworkManager/Devices/66) Feb 1 04:58:01 localhost systemd-udevd[321507]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:58:01 localhost ovn_controller[152492]: 2026-02-01T09:58:01Z|00381|binding|INFO|Claiming lport 1addee7e-ee31-446c-9f2e-3e19811e2013 for this chassis. Feb 1 04:58:01 localhost ovn_controller[152492]: 2026-02-01T09:58:01Z|00382|binding|INFO|1addee7e-ee31-446c-9f2e-3e19811e2013: Claiming unknown Feb 1 04:58:01 localhost nova_compute[274651]: 2026-02-01 09:58:01.469 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:01.479 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe61:b0bd/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-7023075f-9d1b-4f40-a4e3-2cfa182f6205', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7023075f-9d1b-4f40-a4e3-2cfa182f6205', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '674a59d5810c453484339f60db55c64e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=60d29c20-04be-416a-b40c-2fd9418bff3a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1addee7e-ee31-446c-9f2e-3e19811e2013) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:01.480 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 1addee7e-ee31-446c-9f2e-3e19811e2013 in datapath 7023075f-9d1b-4f40-a4e3-2cfa182f6205 bound to our chassis#033[00m Feb 1 04:58:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:01.482 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7023075f-9d1b-4f40-a4e3-2cfa182f6205 or it has 
no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:58:01 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:01.483 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[fb32af84-2da7-4d43-9aff-df4ba7a1e890]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:01 localhost ovn_controller[152492]: 2026-02-01T09:58:01Z|00383|binding|INFO|Setting lport 1addee7e-ee31-446c-9f2e-3e19811e2013 ovn-installed in OVS Feb 1 04:58:01 localhost ovn_controller[152492]: 2026-02-01T09:58:01Z|00384|binding|INFO|Setting lport 1addee7e-ee31-446c-9f2e-3e19811e2013 up in Southbound Feb 1 04:58:01 localhost nova_compute[274651]: 2026-02-01 09:58:01.511 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:01 localhost openstack_network_exporter[239441]: ERROR 09:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:58:01 localhost openstack_network_exporter[239441]: Feb 1 04:58:01 localhost openstack_network_exporter[239441]: ERROR 09:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 04:58:01 localhost openstack_network_exporter[239441]: Feb 1 04:58:01 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:01.550 2 INFO neutron.agent.securitygroups_rpc [None req-eadca791-9490-47ec-9527-60ebe2a9b958 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:58:01 localhost nova_compute[274651]: 2026-02-01 09:58:01.558 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:01 localhost nova_compute[274651]: 2026-02-01 09:58:01.581 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:58:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e171 do_prune osdmap full prune enabled Feb 1 04:58:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e172 e172: 6 total, 6 up, 6 in Feb 1 04:58:01 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in Feb 1 04:58:01 localhost podman[321543]: Feb 1 04:58:01 localhost podman[321543]: 2026-02-01 09:58:01.792359673 +0000 UTC m=+0.087731350 container create bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:58:01 localhost systemd[1]: Started libpod-conmon-bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d.scope. 
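[Annotation] The ovn_metadata_agent entries above show ovsdbapp matching a Port_Binding UPDATE against a registered event class, after which the agent decides whether the datapath still needs a metadata namespace. A minimal sketch of that event pattern, assuming the RowEvent interface exactly as printed in the matched-event repr (the handler body and the provision_datapath call are illustrative, not neutron's verbatim code):

    from ovsdbapp.backend.ovs_idl import event as row_event

    class PortBindingUpdatedEvent(row_event.RowEvent):
        """Fires on the 'Matched UPDATE ... Port_Binding' lines above."""

        def __init__(self, agent):
            self.agent = agent
            # events=('update',), table='Port_Binding', conditions=None,
            # matching the repr in the log.
            super().__init__((self.ROW_UPDATE,), "Port_Binding", None)

        def run(self, event, row, old):
            # The agent then provisions or tears down the metadata
            # namespace for row.datapath (illustrative call).
            self.agent.provision_datapath(row)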
Feb 1 04:58:01 localhost systemd[1]: Started libcrun container. Feb 1 04:58:01 localhost podman[321543]: 2026-02-01 09:58:01.755869711 +0000 UTC m=+0.051241438 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:58:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1e5c26e754246572fdc76ddda337c502c16cf1e90c38df52ad061393a611c83/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:58:01 localhost podman[321543]: 2026-02-01 09:58:01.86154267 +0000 UTC m=+0.156914347 container init bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 04:58:01 localhost systemd[1]: tmp-crun.Qg2AGm.mount: Deactivated successfully. Feb 1 04:58:01 localhost podman[321543]: 2026-02-01 09:58:01.87648128 +0000 UTC m=+0.171852957 container start bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 1 04:58:01 localhost dnsmasq[321571]: started, version 2.85 cachesize 150 Feb 1 04:58:01 localhost dnsmasq[321571]: DNS service limited to local subnets Feb 1 04:58:01 localhost dnsmasq[321571]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:58:01 localhost dnsmasq[321571]: warning: no upstream servers configured Feb 1 04:58:01 localhost dnsmasq-dhcp[321571]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:58:01 localhost dnsmasq-dhcp[321571]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 1 04:58:01 localhost dnsmasq[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:58:01 localhost dnsmasq-dhcp[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:58:01 localhost dnsmasq-dhcp[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:58:01 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:01.994 2 INFO neutron.agent.securitygroups_rpc [None req-825e2b67-68b3-4de9-af8c-c04099a8e61e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']#033[00m Feb 1 04:58:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:02.129 259320 INFO neutron.agent.dhcp.agent [None req-5b311fd1-f442-45db-ba98-69a5ede9ce13 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:58:02 
localhost dnsmasq[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 0 addresses Feb 1 04:58:02 localhost dnsmasq-dhcp[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:58:02 localhost podman[321596]: 2026-02-01 09:58:02.236260826 +0000 UTC m=+0.064214896 container kill bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 1 04:58:02 localhost dnsmasq-dhcp[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:58:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:02.407 259320 INFO neutron.agent.dhcp.agent [None req-fba3909e-e669-409a-af37-16b7b5f030bd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:01Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=124b618e-1e8d-4046-ba46-65913a204e03, ip_allocation=immediate, mac_address=fa:16:3e:22:33:e7, name=tempest-NetworksTestDHCPv6-415659594, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:28Z, description=, dns_domain=, id=cba39058-6a05-4f77-add1-57334b728a66, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-2131762369, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49155, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=1596, status=ACTIVE, subnets=['9c2cdf9d-cfe2-42bc-9d6b-8df892179a19', 'bd88ef87-731d-46ff-8667-538b21bf747c'], tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:57:59Z, vlan_transparent=None, network_id=cba39058-6a05-4f77-add1-57334b728a66, port_security_enabled=True, project_id=fe5c9037c1c44846b3c840cd81d7f177, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['3438fec4-12ca-4b88-8e3d-decadab8f7bf'], standard_attr_id=2361, status=DOWN, tags=[], tenant_id=fe5c9037c1c44846b3c840cd81d7f177, updated_at=2026-02-01T09:58:01Z on network cba39058-6a05-4f77-add1-57334b728a66#033[00m Feb 1 04:58:02 localhost podman[321634]: Feb 1 04:58:02 localhost podman[321634]: 2026-02-01 09:58:02.431303105 +0000 UTC m=+0.100713239 container create 47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7023075f-9d1b-4f40-a4e3-2cfa182f6205, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:02 localhost systemd[1]: Started libpod-conmon-47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea.scope. Feb 1 04:58:02 localhost podman[321634]: 2026-02-01 09:58:02.3797962 +0000 UTC m=+0.049206394 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:58:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 04:58:02 localhost ovn_controller[152492]: 2026-02-01T09:58:02Z|00385|binding|INFO|Removing iface tap1addee7e-ee ovn-installed in OVS Feb 1 04:58:02 localhost ovn_controller[152492]: 2026-02-01T09:58:02Z|00386|binding|INFO|Removing lport 1addee7e-ee31-446c-9f2e-3e19811e2013 ovn-installed in OVS Feb 1 04:58:02 localhost nova_compute[274651]: 2026-02-01 09:58:02.485 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.489 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 452d457b-a9df-4d8f-844c-77c6c8e31c8f with type ""#033[00m Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.489 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe61:b0bd/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-7023075f-9d1b-4f40-a4e3-2cfa182f6205', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7023075f-9d1b-4f40-a4e3-2cfa182f6205', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '674a59d5810c453484339f60db55c64e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=60d29c20-04be-416a-b40c-2fd9418bff3a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1addee7e-ee31-446c-9f2e-3e19811e2013) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.490 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 1addee7e-ee31-446c-9f2e-3e19811e2013 in datapath 7023075f-9d1b-4f40-a4e3-2cfa182f6205 unbound from our chassis#033[00m Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.491 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 7023075f-9d1b-4f40-a4e3-2cfa182f6205 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:58:02 localhost nova_compute[274651]: 2026-02-01 09:58:02.494 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 
04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.495 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ed69f4b5-99ab-4c78-8428-39108d017bc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:02 localhost systemd[1]: Started libcrun container. Feb 1 04:58:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b65e442533965d4d9c0f5493f1a0361196906e47383760982a3074804ab3e7e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:58:02 localhost podman[321634]: 2026-02-01 09:58:02.525253484 +0000 UTC m=+0.194663658 container init 47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7023075f-9d1b-4f40-a4e3-2cfa182f6205, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 1 04:58:02 localhost podman[321634]: 2026-02-01 09:58:02.536657065 +0000 UTC m=+0.206067199 container start 47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7023075f-9d1b-4f40-a4e3-2cfa182f6205, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:02 localhost dnsmasq[321681]: started, version 2.85 cachesize 150 Feb 1 04:58:02 localhost dnsmasq[321681]: DNS service limited to local subnets Feb 1 04:58:02 localhost dnsmasq[321681]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:58:02 localhost dnsmasq[321681]: warning: no upstream servers configured Feb 1 04:58:02 localhost dnsmasq-dhcp[321681]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:58:02 localhost dnsmasq[321681]: read /var/lib/neutron/dhcp/7023075f-9d1b-4f40-a4e3-2cfa182f6205/addn_hosts - 0 addresses Feb 1 04:58:02 localhost dnsmasq-dhcp[321681]: read /var/lib/neutron/dhcp/7023075f-9d1b-4f40-a4e3-2cfa182f6205/host Feb 1 04:58:02 localhost dnsmasq-dhcp[321681]: read /var/lib/neutron/dhcp/7023075f-9d1b-4f40-a4e3-2cfa182f6205/opts Feb 1 04:58:02 localhost podman[321659]: 2026-02-01 09:58:02.616870782 +0000 UTC m=+0.125609394 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:58:02 localhost podman[321659]: 2026-02-01 09:58:02.628278423 +0000 UTC m=+0.137016995 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:58:02 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.660 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:be:37 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b17a2b9-5e93-4788-90e6-3eea4883a111, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a21a9b5e-c616-4953-aa12-b45630ee9601) old=Port_Binding(mac=['fa:16:3e:f3:be:37 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:02 localhost 
ovn_metadata_agent[158360]: 2026-02-01 09:58:02.662 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a21a9b5e-c616-4953-aa12-b45630ee9601 in datapath f90b2d3c-17ac-4074-8e52-3a58738705b1 updated#033[00m Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.665 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f90b2d3c-17ac-4074-8e52-3a58738705b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:58:02 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:02.666 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f453c010-60eb-4460-91fb-e3bb8da4dc8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:02 localhost dnsmasq[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/addn_hosts - 2 addresses Feb 1 04:58:02 localhost podman[321687]: 2026-02-01 09:58:02.67563997 +0000 UTC m=+0.071089217 container kill bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 1 04:58:02 localhost dnsmasq-dhcp[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/host Feb 1 04:58:02 localhost dnsmasq-dhcp[321571]: read /var/lib/neutron/dhcp/cba39058-6a05-4f77-add1-57334b728a66/opts Feb 1 04:58:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:02.702 259320 INFO neutron.agent.dhcp.agent [None req-5f064b32-f25b-4eb3-8379-e91c7164f905 - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:58:02 localhost ovn_controller[152492]: 2026-02-01T09:58:02Z|00387|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:58:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:02.816 259320 INFO neutron.agent.dhcp.agent [None req-c77f5993-cb41-49a0-96ca-270fdc16a63c - - - - - -] DHCP configuration for ports {'c3768957-4a34-412e-a89b-703ca8a25910'} is completed#033[00m Feb 1 04:58:02 localhost nova_compute[274651]: 2026-02-01 09:58:02.843 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:02 localhost dnsmasq[321681]: exiting on receipt of SIGTERM Feb 1 04:58:02 localhost podman[321734]: 2026-02-01 09:58:02.910129733 +0000 UTC m=+0.046599784 container kill 47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7023075f-9d1b-4f40-a4e3-2cfa182f6205, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:58:02 
localhost systemd[1]: libpod-47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea.scope: Deactivated successfully. Feb 1 04:58:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:02.940 259320 INFO neutron.agent.dhcp.agent [None req-721e77c0-4d73-45e3-8410-afc12af76afd - - - - - -] DHCP configuration for ports {'124b618e-1e8d-4046-ba46-65913a204e03'} is completed#033[00m Feb 1 04:58:02 localhost podman[321749]: 2026-02-01 09:58:02.967467046 +0000 UTC m=+0.037373490 container died 47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7023075f-9d1b-4f40-a4e3-2cfa182f6205, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:58:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea-userdata-shm.mount: Deactivated successfully. Feb 1 04:58:03 localhost podman[321749]: 2026-02-01 09:58:03.071564557 +0000 UTC m=+0.141471011 container remove 47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7023075f-9d1b-4f40-a4e3-2cfa182f6205, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:03 localhost systemd[1]: libpod-conmon-47b0a747285073fb807e4c89062359d199fd834a40dbf0f93dded6d2c57b57ea.scope: Deactivated successfully. 
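[Annotation] Each qdhcp sidecar above is a per-network dnsmasq instance fed three generated files under /var/lib/neutron/dhcp/<network_id>/ (addn_hosts, host, opts), re-read on every port change and stopped with SIGTERM when the network goes away. A sketch of the file-to-flag mapping, assuming stock dnsmasq option names; the full argv neutron builds carries more flags than shown:

    # Network id taken from the 'read ...' lines above.
    net = "cba39058-6a05-4f77-add1-57334b728a66"
    conf_dir = f"/var/lib/neutron/dhcp/{net}"

    dnsmasq_argv = [
        "dnsmasq",
        "--no-hosts",
        f"--dhcp-hostsfile={conf_dir}/host",    # per-port static leases
        f"--addn-hosts={conf_dir}/addn_hosts",  # extra DNS records
        f"--dhcp-optsfile={conf_dir}/opts",     # per-port DHCP options
        "--local-service",  # yields "DNS service limited to local subnets"
    ]

The "static leases only" and "warning: no upstream servers configured" entries follow from this style of configuration: leases come only from the hostsfile, and DNS recursion is deliberately not set up.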
Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.088 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:03 localhost kernel: device tap1addee7e-ee left promiscuous mode Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.108 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:03.124 259320 INFO neutron.agent.dhcp.agent [None req-bc7a5da7-56a1-4603-b458-40b38729ab02 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:03.125 259320 INFO neutron.agent.dhcp.agent [None req-bc7a5da7-56a1-4603-b458-40b38729ab02 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:03 localhost dnsmasq[321571]: exiting on receipt of SIGTERM Feb 1 04:58:03 localhost podman[321790]: 2026-02-01 09:58:03.139630442 +0000 UTC m=+0.068676174 container kill bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 1 04:58:03 localhost systemd[1]: libpod-bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d.scope: Deactivated successfully. Feb 1 04:58:03 localhost podman[321801]: 2026-02-01 09:58:03.214433912 +0000 UTC m=+0.059928634 container died bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:58:03 localhost podman[321801]: 2026-02-01 09:58:03.244920659 +0000 UTC m=+0.090415351 container cleanup bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 1 04:58:03 localhost systemd[1]: libpod-conmon-bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d.scope: Deactivated successfully. 
Feb 1 04:58:03 localhost podman[321803]: 2026-02-01 09:58:03.29566367 +0000 UTC m=+0.134689384 container remove bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cba39058-6a05-4f77-add1-57334b728a66, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.521 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:03 localhost ovn_controller[152492]: 2026-02-01T09:58:03Z|00388|binding|INFO|Releasing lport 390b69cd-dd37-4979-8a69-c659caca50f4 from this chassis (sb_readonly=0) Feb 1 04:58:03 localhost kernel: device tap390b69cd-dd left promiscuous mode Feb 1 04:58:03 localhost ovn_controller[152492]: 2026-02-01T09:58:03Z|00389|binding|INFO|Setting lport 390b69cd-dd37-4979-8a69-c659caca50f4 down in Southbound Feb 1 04:58:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:03.532 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6c:10d2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=390b69cd-dd37-4979-8a69-c659caca50f4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:03.534 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 390b69cd-dd37-4979-8a69-c659caca50f4 in datapath cba39058-6a05-4f77-add1-57334b728a66 unbound from our chassis#033[00m Feb 1 04:58:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:03.537 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:58:03 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:03.538 158526 DEBUG oslo.privsep.daemon [-] privsep: 
reply[e31da3ce-e5fd-4405-8407-a0270a775220]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.549 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.551 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:03.563 259320 INFO neutron.agent.dhcp.agent [None req-e4435c2b-8ef4-4306-b841-6e68fd498f6e - - - - - -] DHCP configuration for ports {'390b69cd-dd37-4979-8a69-c659caca50f4', 'd4bc4012-7c81-4a7f-9a67-f9545d549873'} is completed#033[00m Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.731 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e172 do_prune osdmap full prune enabled Feb 1 04:58:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e173 e173: 6 total, 6 up, 6 in Feb 1 04:58:03 localhost nova_compute[274651]: 2026-02-01 09:58:03.750 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:03 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in Feb 1 04:58:03 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:03.767 2 INFO neutron.agent.securitygroups_rpc [None req-6f36bda7-f8b8-46c7-a4c3-95b983979dc7 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']#033[00m Feb 1 04:58:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:03.799 259320 INFO neutron.agent.dhcp.agent [None req-0cda149a-d596-4486-903b-ee2a8e580c60 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:03.799 259320 INFO neutron.agent.dhcp.agent [None req-0cda149a-d596-4486-903b-ee2a8e580c60 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:03 localhost systemd[1]: var-lib-containers-storage-overlay-1b65e442533965d4d9c0f5493f1a0361196906e47383760982a3074804ab3e7e-merged.mount: Deactivated successfully. Feb 1 04:58:03 localhost systemd[1]: var-lib-containers-storage-overlay-c1e5c26e754246572fdc76ddda337c502c16cf1e90c38df52ad061393a611c83-merged.mount: Deactivated successfully. Feb 1 04:58:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfc3162cd3021f011a9de12c09abaae29ddd55c51b39f40b30bc1caae9c4aa3d-userdata-shm.mount: Deactivated successfully. Feb 1 04:58:03 localhost systemd[1]: run-netns-qdhcp\x2d7023075f\x2d9d1b\x2d4f40\x2da4e3\x2d2cfa182f6205.mount: Deactivated successfully. Feb 1 04:58:03 localhost systemd[1]: run-netns-qdhcp\x2dcba39058\x2d6a05\x2d4f77\x2dadd1\x2d57334b728a66.mount: Deactivated successfully. 
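[Annotation] The run-netns-qdhcp\x2d... mount units above are systemd's escaped rendering ("\x2d" is "-") of the qdhcp-<network_id> namespaces the DHCP agent tears down together with each network. A small sketch of the naming convention and a liveness check, assuming iproute2 is available; the helper name is ours:

    import subprocess

    def qdhcp_netns_exists(network_id: str) -> bool:
        # Neutron names the namespace qdhcp-<network_id>.
        ns = f"qdhcp-{network_id}"
        out = subprocess.run(["ip", "netns", "list"],
                             capture_output=True, text=True,
                             check=True).stdout
        # Lines look like "qdhcp-<uuid> (id: 3)"; compare the first token.
        return any(line.split()[0] == ns
                   for line in out.splitlines() if line)

    qdhcp_netns_exists("cba39058-6a05-4f77-add1-57334b728a66")
    # Expected False once the cleanup above has completed.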
Feb 1 04:58:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:04.021 2 INFO neutron.agent.securitygroups_rpc [None req-752af62a-19da-4b3f-a3e8-9a1412f9f50e 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e173 do_prune osdmap full prune enabled Feb 1 04:58:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e174 e174: 6 total, 6 up, 6 in Feb 1 04:58:04 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in Feb 1 04:58:04 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:04.929 259320 INFO neutron.agent.linux.ip_lib [None req-3dcaa2ee-b97c-406f-88b5-4fbc09501aa6 - - - - - -] Device tapcc2645da-15 cannot be used as it has no MAC address#033[00m Feb 1 04:58:04 localhost nova_compute[274651]: 2026-02-01 09:58:04.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:04.998 2 INFO neutron.agent.securitygroups_rpc [None req-dcf8ea11-57cc-44a8-b32b-d084e8cc9746 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:05 localhost kernel: device tapcc2645da-15 entered promiscuous mode Feb 1 04:58:05 localhost NetworkManager[5964]: [1769939885.0061] manager: (tapcc2645da-15): new Generic device (/org/freedesktop/NetworkManager/Devices/67) Feb 1 04:58:05 localhost ovn_controller[152492]: 2026-02-01T09:58:05Z|00390|binding|INFO|Claiming lport cc2645da-1526-400b-a11b-14c2d68c2597 for this chassis. Feb 1 04:58:05 localhost ovn_controller[152492]: 2026-02-01T09:58:05Z|00391|binding|INFO|cc2645da-1526-400b-a11b-14c2d68c2597: Claiming unknown Feb 1 04:58:05 localhost systemd-udevd[321840]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 04:58:05 localhost nova_compute[274651]: 2026-02-01 09:58:05.012 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:05 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:05.022 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acfd0302-fb91-4776-9c9f-52873cc3b466, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc2645da-1526-400b-a11b-14c2d68c2597) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:05 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:05.025 158365 INFO neutron.agent.ovn.metadata.agent [-] Port cc2645da-1526-400b-a11b-14c2d68c2597 in datapath b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9 bound to our chassis#033[00m Feb 1 04:58:05 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:05.026 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:58:05 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:05.027 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[88ab0063-6d8e-4383-a0c9-fdc63665c45b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost ovn_controller[152492]: 2026-02-01T09:58:05Z|00392|binding|INFO|Setting lport cc2645da-1526-400b-a11b-14c2d68c2597 ovn-installed in OVS Feb 1 04:58:05 localhost ovn_controller[152492]: 2026-02-01T09:58:05Z|00393|binding|INFO|Setting lport cc2645da-1526-400b-a11b-14c2d68c2597 up in Southbound Feb 1 04:58:05 localhost nova_compute[274651]: 2026-02-01 09:58:05.037 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:05.040 2 INFO neutron.agent.securitygroups_rpc [None req-6f8cb305-6336-427d-a0d9-37ed7bed8449 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated 
['41d73aa2-6075-4985-b34c-e67fa66518ee']#033[00m Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost journal[217584]: ethtool ioctl error on tapcc2645da-15: No such device Feb 1 04:58:05 localhost nova_compute[274651]: 2026-02-01 09:58:05.072 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:05 localhost nova_compute[274651]: 2026-02-01 09:58:05.101 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 04:58:05 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 04:58:05 localhost nova_compute[274651]: 2026-02-01 09:58:05.437 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e174 do_prune osdmap full prune enabled Feb 1 04:58:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e175 e175: 6 total, 6 up, 6 in Feb 1 04:58:05 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e59: np0005604215.uhhqtv(active, since 8m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh Feb 1 04:58:05 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in Feb 1 04:58:05 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:05.875 2 INFO neutron.agent.securitygroups_rpc [None req-cf4e6267-50f2-41a1-bdc4-48a2e39e61cf 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:06 localhost podman[321910]: Feb 1 04:58:06 localhost podman[321910]: 2026-02-01 09:58:06.047313324 +0000 UTC m=+0.096968044 container create 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:58:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
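[Annotation] The "Started /usr/bin/podman healthcheck run <id>" unit above is how podman drives container healthchecks under systemd: a transient unit runs the check, and the health_status/exec_died entries that follow record the result. A sketch of invoking the same check by hand; the exit-code mapping is per podman's documented behavior and should be treated as an assumption for your podman version:

    import subprocess

    def healthcheck(container: str) -> str:
        # podman healthcheck run exits 0 when the check passes and
        # non-zero when it fails or no healthcheck is defined.
        rc = subprocess.run(
            ["podman", "healthcheck", "run", container]).returncode
        return "healthy" if rc == 0 else "unhealthy"

    healthcheck("ovn_metadata_agent")  # container_name from the log above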
Feb 1 04:58:06 localhost podman[321910]: 2026-02-01 09:58:06.004810296 +0000 UTC m=+0.054465056 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:58:06 localhost systemd[1]: Started libpod-conmon-5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f.scope. Feb 1 04:58:06 localhost systemd[1]: Started libcrun container. Feb 1 04:58:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a020d34d6f9e5fcce55f6ff250f64679f6b1dc3007eedd71d4a67ee47e78444/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:58:06 localhost podman[321910]: 2026-02-01 09:58:06.137285261 +0000 UTC m=+0.186939971 container init 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:58:06 localhost podman[321910]: 2026-02-01 09:58:06.146872386 +0000 UTC m=+0.196527096 container start 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:06 localhost dnsmasq[321938]: started, version 2.85 cachesize 150 Feb 1 04:58:06 localhost dnsmasq[321938]: DNS service limited to local subnets Feb 1 04:58:06 localhost dnsmasq[321938]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:58:06 localhost dnsmasq[321938]: warning: no upstream servers configured Feb 1 04:58:06 localhost dnsmasq-dhcp[321938]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:58:06 localhost dnsmasq[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/addn_hosts - 0 addresses Feb 1 04:58:06 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/host Feb 1 04:58:06 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/opts Feb 1 04:58:06 localhost podman[321923]: 2026-02-01 09:58:06.226176636 +0000 UTC m=+0.143631880 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 1 04:58:06 localhost podman[321923]: 2026-02-01 09:58:06.260327766 +0000 UTC m=+0.177782950 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:06 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
Feb 1 04:58:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:06.324 259320 INFO neutron.agent.dhcp.agent [None req-28091f04-9808-41a6-b58b-ab4bf7a2bfb2 - - - - - -] DHCP configuration for ports {'968c45e3-8f5d-459e-8753-4f716c16aed5'} is completed#033[00m Feb 1 04:58:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:06.360 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:06 localhost ovn_controller[152492]: 2026-02-01T09:58:06Z|00394|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:58:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:58:06 localhost nova_compute[274651]: 2026-02-01 09:58:06.724 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:06 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:06.799 2 INFO neutron.agent.securitygroups_rpc [None req-b303317c-4287-410b-804b-7e395b86e859 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:07.405 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:06Z, description=, device_id=92bb5c83-dc88-48ea-923f-a2455fa17d5c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f4ed6683-133b-4535-b1a4-9c41a9f9e89d, ip_allocation=immediate, mac_address=fa:16:3e:39:bb:ed, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:03Z, description=, dns_domain=, id=b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1651485707, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46975, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2363, status=ACTIVE, subnets=['e539a1bd-b6a4-4995-a5b2-d55c1b14a360'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:04Z, vlan_transparent=None, network_id=b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2375, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:07Z on network b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9#033[00m Feb 1 04:58:07 localhost systemd[1]: tmp-crun.EKC0bI.mount: Deactivated successfully. 
Feb 1 04:58:07 localhost dnsmasq[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/addn_hosts - 1 addresses Feb 1 04:58:07 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/host Feb 1 04:58:07 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/opts Feb 1 04:58:07 localhost podman[321959]: 2026-02-01 09:58:07.605104877 +0000 UTC m=+0.061726429 container kill 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:58:07 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Feb 1 04:58:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e175 do_prune osdmap full prune enabled Feb 1 04:58:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e176 e176: 6 total, 6 up, 6 in Feb 1 04:58:07 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in Feb 1 04:58:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:07.876 259320 INFO neutron.agent.dhcp.agent [None req-f48d2ea0-31e6-4397-aedf-d480595d395a - - - - - -] DHCP configuration for ports {'f4ed6683-133b-4535-b1a4-9c41a9f9e89d'} is completed#033[00m Feb 1 04:58:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e176 do_prune osdmap full prune enabled Feb 1 04:58:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:08.886 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:06Z, description=, device_id=92bb5c83-dc88-48ea-923f-a2455fa17d5c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f4ed6683-133b-4535-b1a4-9c41a9f9e89d, ip_allocation=immediate, mac_address=fa:16:3e:39:bb:ed, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:03Z, description=, dns_domain=, id=b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1651485707, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46975, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2363, status=ACTIVE, subnets=['e539a1bd-b6a4-4995-a5b2-d55c1b14a360'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:04Z, vlan_transparent=None, network_id=b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2375, status=DOWN, tags=[], 
tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:07Z on network b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9#033[00m Feb 1 04:58:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e177 e177: 6 total, 6 up, 6 in Feb 1 04:58:08 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in Feb 1 04:58:09 localhost dnsmasq[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/addn_hosts - 1 addresses Feb 1 04:58:09 localhost podman[321998]: 2026-02-01 09:58:09.093769514 +0000 UTC m=+0.058912013 container kill 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:58:09 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/host Feb 1 04:58:09 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/opts Feb 1 04:58:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:09.329 259320 INFO neutron.agent.dhcp.agent [None req-267fbbe0-1056-4b47-a7f2-024df94ca676 - - - - - -] DHCP configuration for ports {'f4ed6683-133b-4535-b1a4-9c41a9f9e89d'} is completed#033[00m Feb 1 04:58:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e177 do_prune osdmap full prune enabled Feb 1 04:58:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e178 e178: 6 total, 6 up, 6 in Feb 1 04:58:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e60: np0005604215.uhhqtv(active, since 8m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh Feb 1 04:58:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in Feb 1 04:58:10 localhost nova_compute[274651]: 2026-02-01 09:58:10.496 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:10 localhost dnsmasq[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/addn_hosts - 0 addresses Feb 1 04:58:10 localhost podman[322035]: 2026-02-01 09:58:10.570308118 +0000 UTC m=+0.063111541 container kill 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:58:10 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/host Feb 1 04:58:10 localhost dnsmasq-dhcp[321938]: read /var/lib/neutron/dhcp/b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9/opts Feb 1 04:58:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
Feb 1 04:58:10 localhost systemd[1]: tmp-crun.6GMhmk.mount: Deactivated successfully. Feb 1 04:58:10 localhost podman[322048]: 2026-02-01 09:58:10.704080523 +0000 UTC m=+0.107127226 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:58:10 localhost podman[322048]: 2026-02-01 09:58:10.710936153 +0000 UTC m=+0.113982816 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 04:58:10 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 04:58:10 localhost kernel: device tapcc2645da-15 left promiscuous mode Feb 1 04:58:10 localhost nova_compute[274651]: 2026-02-01 09:58:10.778 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:10 localhost ovn_controller[152492]: 2026-02-01T09:58:10Z|00395|binding|INFO|Releasing lport cc2645da-1526-400b-a11b-14c2d68c2597 from this chassis (sb_readonly=0) Feb 1 04:58:10 localhost ovn_controller[152492]: 2026-02-01T09:58:10Z|00396|binding|INFO|Setting lport cc2645da-1526-400b-a11b-14c2d68c2597 down in Southbound Feb 1 04:58:10 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:10.790 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=acfd0302-fb91-4776-9c9f-52873cc3b466, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc2645da-1526-400b-a11b-14c2d68c2597) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:10 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:10.791 158365 INFO neutron.agent.ovn.metadata.agent [-] Port cc2645da-1526-400b-a11b-14c2d68c2597 in datapath b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9 unbound from our chassis#033[00m Feb 1 04:58:10 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:10.793 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:58:10 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:10.798 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f6d8f5cf-49ba-48ee-a3f8-d5c2e8553b72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:10 localhost nova_compute[274651]: 2026-02-01 09:58:10.804 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:11 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:11.274 2 INFO neutron.agent.securitygroups_rpc [None req-2e636825-6352-4de3-92a6-2082180ce0f9 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:11 localhost 
neutron_sriov_agent[252126]: 2026-02-01 09:58:11.436 2 INFO neutron.agent.securitygroups_rpc [None req-0b3ddafe-7058-48e9-ade9-6122faaa4a98 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:11 localhost dnsmasq[321938]: exiting on receipt of SIGTERM Feb 1 04:58:11 localhost podman[322096]: 2026-02-01 09:58:11.478829322 +0000 UTC m=+0.036956478 container kill 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0) Feb 1 04:58:11 localhost systemd[1]: libpod-5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f.scope: Deactivated successfully. Feb 1 04:58:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:58:11 localhost podman[322108]: 2026-02-01 09:58:11.548345269 +0000 UTC m=+0.059297784 container died 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:58:11 localhost podman[322116]: 2026-02-01 09:58:11.588693941 +0000 UTC m=+0.081496247 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 
04:58:11 localhost podman[322116]: 2026-02-01 09:58:11.621123168 +0000 UTC m=+0.113925474 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS) Feb 1 04:58:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f-userdata-shm.mount: Deactivated successfully. Feb 1 04:58:11 localhost systemd[1]: var-lib-containers-storage-overlay-2a020d34d6f9e5fcce55f6ff250f64679f6b1dc3007eedd71d4a67ee47e78444-merged.mount: Deactivated successfully. Feb 1 04:58:11 localhost podman[322108]: 2026-02-01 09:58:11.63191073 +0000 UTC m=+0.142863235 container cleanup 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:11 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 04:58:11 localhost systemd[1]: libpod-conmon-5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f.scope: Deactivated successfully. 
Feb 1 04:58:11 localhost podman[322115]: 2026-02-01 09:58:11.70181918 +0000 UTC m=+0.196705071 container remove 5448f7e878b2d6a55ae30f9e0a670e2e852a0af5400febe251f2b2546dbd838f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b2a3ca0a-9a42-4279-bfb7-8b7b74af84c9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:58:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:58:11 localhost systemd[1]: run-netns-qdhcp\x2db2a3ca0a\x2d9a42\x2d4279\x2dbfb7\x2d8b7b74af84c9.mount: Deactivated successfully. Feb 1 04:58:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:11.727 259320 INFO neutron.agent.dhcp.agent [None req-9a52ee24-0c27-49de-95e7-ab066706ac9e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:11.877 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:12.021 2 INFO neutron.agent.securitygroups_rpc [None req-4918b331-88ae-4de3-8570-b8490451d4d3 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:12 localhost ovn_controller[152492]: 2026-02-01T09:58:12Z|00397|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:58:12 localhost nova_compute[274651]: 2026-02-01 09:58:12.233 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:12.837 2 INFO neutron.agent.securitygroups_rpc [None req-c4f766e3-6d2a-4be1-a25c-795440958939 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e178 do_prune osdmap full prune enabled Feb 1 04:58:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e179 e179: 6 total, 6 up, 6 in Feb 1 04:58:12 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in Feb 1 04:58:13 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:13.345 2 INFO neutron.agent.securitygroups_rpc [None req-07f0d5e0-07b1-4081-9883-e9d75a91fc18 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:13 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:13.610 2 INFO neutron.agent.securitygroups_rpc [None req-07f0d5e0-07b1-4081-9883-e9d75a91fc18 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:13 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:13.856 2 INFO 
neutron.agent.securitygroups_rpc [None req-b64013eb-118b-4ee2-9fc4-6cc3b98c578e 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:14 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:14.498 2 INFO neutron.agent.securitygroups_rpc [None req-ff9a1527-b1c3-4b84-beeb-30758949e010 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e179 do_prune osdmap full prune enabled Feb 1 04:58:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e180 e180: 6 total, 6 up, 6 in Feb 1 04:58:15 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in Feb 1 04:58:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:15.305 2 INFO neutron.agent.securitygroups_rpc [None req-dea63ea4-98ed-4833-b1d5-beae69081804 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:15 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:15.353 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:15 localhost nova_compute[274651]: 2026-02-01 09:58:15.526 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:58:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e180 do_prune osdmap full prune enabled Feb 1 04:58:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e181 e181: 6 total, 6 up, 6 in Feb 1 04:58:16 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in Feb 1 04:58:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:58:17 localhost podman[322162]: 2026-02-01 09:58:17.726256426 +0000 UTC m=+0.084671286 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1769056855, managed_by=edpm_ansible, container_name=openstack_network_exporter) Feb 1 04:58:17 localhost podman[322162]: 2026-02-01 09:58:17.770466355 +0000 UTC m=+0.128881205 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=9.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1769056855) Feb 1 04:58:17 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:58:17 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:17.833 259320 INFO neutron.agent.linux.ip_lib [None req-0ecb70ce-b33b-4548-8360-467f6f0067ad - - - - - -] Device tap512991a4-a1 cannot be used as it has no MAC address#033[00m Feb 1 04:58:17 localhost nova_compute[274651]: 2026-02-01 09:58:17.891 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:17.893 2 INFO neutron.agent.securitygroups_rpc [None req-38946330-c139-4cbe-adf2-c14c00d51ca6 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:17 localhost kernel: device tap512991a4-a1 entered promiscuous mode Feb 1 04:58:17 localhost NetworkManager[5964]: [1769939897.9005] manager: (tap512991a4-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/68) Feb 1 04:58:17 localhost nova_compute[274651]: 2026-02-01 09:58:17.899 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:17 localhost ovn_controller[152492]: 2026-02-01T09:58:17Z|00398|binding|INFO|Claiming lport 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e for this chassis. 
Feb 1 04:58:17 localhost ovn_controller[152492]: 2026-02-01T09:58:17Z|00399|binding|INFO|512991a4-a1f2-414a-8f9c-b8d2ef4dc84e: Claiming unknown Feb 1 04:58:17 localhost systemd-udevd[322192]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost ovn_controller[152492]: 2026-02-01T09:58:17Z|00400|binding|INFO|Setting lport 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e ovn-installed in OVS Feb 1 04:58:17 localhost nova_compute[274651]: 2026-02-01 09:58:17.929 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost ovn_controller[152492]: 2026-02-01T09:58:17Z|00401|binding|INFO|Setting lport 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e up in Southbound Feb 1 04:58:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:17.959 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-33888865-dbfe-4079-ba91-4b4c7cf03f1e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33888865-dbfe-4079-ba91-4b4c7cf03f1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=25e36a02-c310-4fbc-a1ea-afe6b40009b3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=512991a4-a1f2-414a-8f9c-b8d2ef4dc84e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:17.960 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e in datapath 33888865-dbfe-4079-ba91-4b4c7cf03f1e bound to our chassis#033[00m Feb 1 04:58:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:17.961 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 74327aab-1985-4357-b83e-a97e050ed6ac IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 04:58:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:17.961 
158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33888865-dbfe-4079-ba91-4b4c7cf03f1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:58:17 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:17.962 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[35402875-cbff-4761-8269-e0c1a6772c7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:17 localhost journal[217584]: ethtool ioctl error on tap512991a4-a1: No such device Feb 1 04:58:17 localhost nova_compute[274651]: 2026-02-01 09:58:17.969 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:18 localhost nova_compute[274651]: 2026-02-01 09:58:18.003 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e181 do_prune osdmap full prune enabled Feb 1 04:58:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e182 e182: 6 total, 6 up, 6 in Feb 1 04:58:18 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in Feb 1 04:58:18 localhost podman[322263]: Feb 1 04:58:19 localhost podman[322263]: 2026-02-01 09:58:19.008289997 +0000 UTC m=+0.098333815 container create f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:58:19 localhost systemd[1]: Started libpod-conmon-f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15.scope. Feb 1 04:58:19 localhost podman[322263]: 2026-02-01 09:58:18.962048205 +0000 UTC m=+0.052092063 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:58:19 localhost systemd[1]: Started libcrun container. 
Feb 1 04:58:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a726aca79861c1874f2f31635f79b4ec4e5b993185268b98342eaabaf2445a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:58:19 localhost podman[322263]: 2026-02-01 09:58:19.098594515 +0000 UTC m=+0.188638323 container init f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:58:19 localhost podman[322263]: 2026-02-01 09:58:19.106150797 +0000 UTC m=+0.196194615 container start f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 1 04:58:19 localhost dnsmasq[322281]: started, version 2.85 cachesize 150 Feb 1 04:58:19 localhost dnsmasq[322281]: DNS service limited to local subnets Feb 1 04:58:19 localhost dnsmasq[322281]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:58:19 localhost dnsmasq[322281]: warning: no upstream servers configured Feb 1 04:58:19 localhost dnsmasq-dhcp[322281]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:58:19 localhost dnsmasq[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/addn_hosts - 0 addresses Feb 1 04:58:19 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/host Feb 1 04:58:19 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/opts Feb 1 04:58:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:19.176 259320 INFO neutron.agent.dhcp.agent [None req-43f46ac8-eeb2-4abf-990e-840167440d77 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:17Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ca3d906-4fb0-4890-85d2-1327968978e8, ip_allocation=immediate, mac_address=fa:16:3e:fa:54:66, name=tempest-PortsTestJSON-1601369144, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:15Z, description=, dns_domain=, id=33888865-dbfe-4079-ba91-4b4c7cf03f1e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1466538445, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=62593, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2431, status=ACTIVE, subnets=['899c4b57-96f6-4925-986b-8b902cc8fc25'], tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:15Z, vlan_transparent=None, network_id=33888865-dbfe-4079-ba91-4b4c7cf03f1e, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['277d73b7-d267-437d-b5df-bd560d180a7a'], standard_attr_id=2457, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:17Z on network 33888865-dbfe-4079-ba91-4b4c7cf03f1e#033[00m Feb 1 04:58:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:19.281 259320 INFO neutron.agent.dhcp.agent [None req-61553430-03ae-4a5e-ba3f-18c6d61c4c5d - - - - - -] DHCP configuration for ports {'8d5a5032-66c5-4e35-878b-aa8cab8e91fd'} is completed#033[00m Feb 1 04:58:19 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:19.408 2 INFO neutron.agent.securitygroups_rpc [None req-dc033b5b-d88f-40f8-9b27-1242de33844a d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:19 localhost dnsmasq[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/addn_hosts - 1 addresses Feb 1 04:58:19 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/host Feb 1 04:58:19 localhost podman[322299]: 2026-02-01 09:58:19.432540536 +0000 UTC m=+0.067416034 container kill f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 1 04:58:19 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/opts Feb 1 04:58:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:19.747 259320 INFO neutron.agent.dhcp.agent [None req-bd6c89ee-ee5c-4a22-953d-c6fafe73e590 - - - - - -] DHCP configuration for ports {'1ca3d906-4fb0-4890-85d2-1327968978e8'} is completed#033[00m Feb 1 04:58:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:19.800 259320 INFO neutron.agent.linux.ip_lib [None req-529f88ea-339b-412b-8dc8-425dc71e3ab7 - - - - - -] Device tap9cffcb83-bd cannot be used as it has no MAC address#033[00m Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.826 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:19 localhost kernel: device tap9cffcb83-bd entered promiscuous mode Feb 1 04:58:19 localhost systemd-udevd[322194]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.833 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:19 localhost NetworkManager[5964]: [1769939899.8338] manager: (tap9cffcb83-bd): new Generic device (/org/freedesktop/NetworkManager/Devices/69)
Feb 1 04:58:19 localhost ovn_controller[152492]: 2026-02-01T09:58:19Z|00402|binding|INFO|Claiming lport 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 for this chassis.
Feb 1 04:58:19 localhost ovn_controller[152492]: 2026-02-01T09:58:19Z|00403|binding|INFO|9cffcb83-bd08-487f-bdb0-c3f67dc8f850: Claiming unknown
Feb 1 04:58:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:19.848 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-75d12671-b626-4593-bbf3-e0e6d78cb68e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75d12671-b626-4593-bbf3-e0e6d78cb68e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c4191b-8d66-4b0b-81a5-876a8bdd10e7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9cffcb83-bd08-487f-bdb0-c3f67dc8f850) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:19.850 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 in datapath 75d12671-b626-4593-bbf3-e0e6d78cb68e bound to our chassis
Feb 1 04:58:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:19.851 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 75d12671-b626-4593-bbf3-e0e6d78cb68e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:58:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:19.853 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[0285e575-77b9-45c4-9a61-49dc606e2b4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.869 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:19 localhost ovn_controller[152492]: 2026-02-01T09:58:19Z|00404|binding|INFO|Setting lport 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 ovn-installed in OVS
Feb 1 04:58:19 localhost ovn_controller[152492]: 2026-02-01T09:58:19Z|00405|binding|INFO|Setting lport 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 up in Southbound
Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.871 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.874 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.910 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:19 localhost nova_compute[274651]: 2026-02-01 09:58:19.941 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:20 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:20.021 2 INFO neutron.agent.securitygroups_rpc [None req-26640e7d-8ddf-4cc3-b0ea-945d98bbd76e d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e182 do_prune osdmap full prune enabled
Feb 1 04:58:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e183 e183: 6 total, 6 up, 6 in
Feb 1 04:58:20 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in
Feb 1 04:58:20 localhost nova_compute[274651]: 2026-02-01 09:58:20.570 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:20 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:20.871 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:17Z, description=, device_id=1943267a-fa18-43f0-89cd-898c454b2dee, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ca3d906-4fb0-4890-85d2-1327968978e8, ip_allocation=immediate, mac_address=fa:16:3e:fa:54:66, name=tempest-PortsTestJSON-1601369144, network_id=33888865-dbfe-4079-ba91-4b4c7cf03f1e, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['277d73b7-d267-437d-b5df-bd560d180a7a'], standard_attr_id=2457, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:18Z on network 33888865-dbfe-4079-ba91-4b4c7cf03f1e
Feb 1 04:58:20 localhost podman[322382]:
Feb 1 04:58:20 localhost podman[322382]: 2026-02-01 09:58:20.947296685 +0000 UTC m=+0.096691924 container create 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 04:58:20 localhost systemd[1]: Started libpod-conmon-45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39.scope.
Feb 1 04:58:20 localhost systemd[1]: Started libcrun container.
Feb 1 04:58:21 localhost podman[322382]: 2026-02-01 09:58:20.900057273 +0000 UTC m=+0.049452572 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:58:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9eac99705920dc2e8ebccfa393efb0abec1161b2192dfd704ad9842b2d17956c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:58:21 localhost podman[322382]: 2026-02-01 09:58:21.011732667 +0000 UTC m=+0.161127896 container init 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 1 04:58:21 localhost dnsmasq[322414]: started, version 2.85 cachesize 150
Feb 1 04:58:21 localhost dnsmasq[322414]: DNS service limited to local subnets
Feb 1 04:58:21 localhost dnsmasq[322414]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:58:21 localhost dnsmasq[322414]: warning: no upstream servers configured
Feb 1 04:58:21 localhost dnsmasq-dhcp[322414]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:58:21 localhost dnsmasq[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/addn_hosts - 0 addresses
Feb 1 04:58:21 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/host
Feb 1 04:58:21 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/opts
Feb 1 04:58:21 localhost podman[322382]: 2026-02-01 09:58:21.072687972 +0000 UTC m=+0.222083201 container start 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 1 04:58:21 localhost systemd[1]: tmp-crun.cMD6R0.mount: Deactivated successfully.
Feb 1 04:58:21 localhost dnsmasq[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/addn_hosts - 1 addresses
Feb 1 04:58:21 localhost podman[322418]: 2026-02-01 09:58:21.126664572 +0000 UTC m=+0.069204999 container kill f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:58:21 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/host
Feb 1 04:58:21 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/opts
Feb 1 04:58:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:21.223 259320 INFO neutron.agent.dhcp.agent [None req-d821884d-9cdf-4e12-bf45-52ca3fbbcec3 - - - - - -] DHCP configuration for ports {'267e2ca3-0e91-44ed-a966-125db3001651'} is completed
Feb 1 04:58:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:21.420 259320 INFO neutron.agent.dhcp.agent [None req-46bc6be9-cf97-4d07-b2bb-8df666d6d9d3 - - - - - -] DHCP configuration for ports {'1ca3d906-4fb0-4890-85d2-1327968978e8'} is completed
Feb 1 04:58:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:58:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e183 do_prune osdmap full prune enabled
Feb 1 04:58:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e184 e184: 6 total, 6 up, 6 in
Feb 1 04:58:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in
Feb 1 04:58:22 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:22.597 2 INFO neutron.agent.securitygroups_rpc [None req-4dc119ae-7dfd-4794-add0-4c162f41d887 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 1 04:58:22 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:22.602 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:22Z, description=, device_id=a8a08e8b-2551-43c9-96bf-919f189d818d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=85188534-1d0d-4c80-baf9-5843d241ca0b, ip_allocation=immediate, mac_address=fa:16:3e:de:7f:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:17Z, description=, dns_domain=, id=75d12671-b626-4593-bbf3-e0e6d78cb68e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-833424952, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48917, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2455, status=ACTIVE, subnets=['49580449-97b3-4489-868a-d4f6b42a6a61'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:18Z, vlan_transparent=None, network_id=75d12671-b626-4593-bbf3-e0e6d78cb68e, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2481, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:22Z on network 75d12671-b626-4593-bbf3-e0e6d78cb68e
Feb 1 04:58:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e184 do_prune osdmap full prune enabled
Feb 1 04:58:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e185 e185: 6 total, 6 up, 6 in
Feb 1 04:58:22 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in
Feb 1 04:58:22 localhost systemd[1]: tmp-crun.WTHhWs.mount: Deactivated successfully.
Feb 1 04:58:22 localhost dnsmasq[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/addn_hosts - 1 addresses
Feb 1 04:58:22 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/host
Feb 1 04:58:22 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/opts
Feb 1 04:58:22 localhost podman[322470]: 2026-02-01 09:58:22.838649579 +0000 UTC m=+0.061504843 container kill 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:58:22 localhost podman[322478]: 2026-02-01 09:58:22.851910606 +0000 UTC m=+0.050501854 container kill f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:58:22 localhost dnsmasq[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/addn_hosts - 0 addresses
Feb 1 04:58:22 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/host
Feb 1 04:58:22 localhost dnsmasq-dhcp[322281]: read /var/lib/neutron/dhcp/33888865-dbfe-4079-ba91-4b4c7cf03f1e/opts
Feb 1 04:58:23 localhost ovn_controller[152492]: 2026-02-01T09:58:23Z|00406|binding|INFO|Releasing lport 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e from this chassis (sb_readonly=0)
Feb 1 04:58:23 localhost ovn_controller[152492]: 2026-02-01T09:58:23Z|00407|binding|INFO|Setting lport 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e down in Southbound
Feb 1 04:58:23 localhost kernel: device tap512991a4-a1 left promiscuous mode
Feb 1 04:58:23 localhost nova_compute[274651]: 2026-02-01 09:58:23.110 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:23 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:23.122 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-33888865-dbfe-4079-ba91-4b4c7cf03f1e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33888865-dbfe-4079-ba91-4b4c7cf03f1e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=25e36a02-c310-4fbc-a1ea-afe6b40009b3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=512991a4-a1f2-414a-8f9c-b8d2ef4dc84e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:23 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:23.124 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 512991a4-a1f2-414a-8f9c-b8d2ef4dc84e in datapath 33888865-dbfe-4079-ba91-4b4c7cf03f1e unbound from our chassis
Feb 1 04:58:23 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:23.126 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33888865-dbfe-4079-ba91-4b4c7cf03f1e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:23 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:23.128 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[0c07f19f-6d8f-437b-980e-a6bc9ef3d84f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:23 localhost nova_compute[274651]: 2026-02-01 09:58:23.131 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:23 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:23.190 259320 INFO neutron.agent.dhcp.agent [None req-bb4f3a65-59b9-4eb5-88c7-94f7100e718b - - - - - -] DHCP configuration for ports {'85188534-1d0d-4c80-baf9-5843d241ca0b'} is completed
Feb 1 04:58:23 localhost podman[236886]: time="2026-02-01T09:58:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:58:23 localhost podman[236886]: @ - - [01/Feb/2026:09:58:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160185 "" "Go-http-client/1.1"
Feb 1 04:58:24 localhost podman[236886]: @ - - [01/Feb/2026:09:58:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19784 "" "Go-http-client/1.1"
Feb 1 04:58:24 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:24.351 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:22Z, description=, device_id=a8a08e8b-2551-43c9-96bf-919f189d818d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=85188534-1d0d-4c80-baf9-5843d241ca0b, ip_allocation=immediate, mac_address=fa:16:3e:de:7f:4a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:17Z, description=, dns_domain=, id=75d12671-b626-4593-bbf3-e0e6d78cb68e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-833424952, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48917, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2455, status=ACTIVE, subnets=['49580449-97b3-4489-868a-d4f6b42a6a61'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:18Z, vlan_transparent=None, network_id=75d12671-b626-4593-bbf3-e0e6d78cb68e, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2481, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:22Z on network 75d12671-b626-4593-bbf3-e0e6d78cb68e
Feb 1 04:58:24 localhost systemd[1]: tmp-crun.bmYuit.mount: Deactivated successfully.
Feb 1 04:58:24 localhost dnsmasq[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/addn_hosts - 1 addresses
Feb 1 04:58:24 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/host
Feb 1 04:58:24 localhost podman[322546]: 2026-02-01 09:58:24.580336918 +0000 UTC m=+0.075977518 container kill 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:58:24 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/opts
Feb 1 04:58:24 localhost dnsmasq[322281]: exiting on receipt of SIGTERM
Feb 1 04:58:24 localhost podman[322560]: 2026-02-01 09:58:24.637818736 +0000 UTC m=+0.079002210 container kill f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:58:24 localhost systemd[1]: libpod-f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15.scope: Deactivated successfully.
Feb 1 04:58:24 localhost podman[322580]: 2026-02-01 09:58:24.715543697 +0000 UTC m=+0.056483449 container died f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:58:24 localhost podman[322580]: 2026-02-01 09:58:24.759513339 +0000 UTC m=+0.100453061 container remove f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33888865-dbfe-4079-ba91-4b4c7cf03f1e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 1 04:58:24 localhost systemd[1]: libpod-conmon-f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15.scope: Deactivated successfully.
Feb 1 04:58:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e185 do_prune osdmap full prune enabled
Feb 1 04:58:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e186 e186: 6 total, 6 up, 6 in
Feb 1 04:58:24 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in
Feb 1 04:58:24 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:24.920 259320 INFO neutron.agent.dhcp.agent [None req-7c52fc4a-e7f3-4fb1-a3e7-065fb5a64790 - - - - - -] DHCP configuration for ports {'85188534-1d0d-4c80-baf9-5843d241ca0b'} is completed
Feb 1 04:58:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:25.132 259320 INFO neutron.agent.dhcp.agent [None req-75dc25c4-36fa-4bdd-bdaa-5aa22dc645e3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:25.133 259320 INFO neutron.agent.dhcp.agent [None req-75dc25c4-36fa-4bdd-bdaa-5aa22dc645e3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:58:25 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:25.439 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:25 localhost podman[322610]: 2026-02-01 09:58:25.473688095 +0000 UTC m=+0.085047697 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 1 04:58:25 localhost podman[322610]: 2026-02-01 09:58:25.492533725 +0000 UTC m=+0.103893307 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:58:25 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:58:25 localhost systemd[1]: tmp-crun.NoqMv4.mount: Deactivated successfully.
Feb 1 04:58:25 localhost systemd[1]: var-lib-containers-storage-overlay-84a726aca79861c1874f2f31635f79b4ec4e5b993185268b98342eaabaf2445a-merged.mount: Deactivated successfully.
Feb 1 04:58:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f2329ab5ae467c9894bdbc5a66ac4231f903552d55bcdd714be6b3da2084ed15-userdata-shm.mount: Deactivated successfully.
Feb 1 04:58:25 localhost systemd[1]: run-netns-qdhcp\x2d33888865\x2ddbfe\x2d4079\x2dba91\x2d4b4c7cf03f1e.mount: Deactivated successfully.
Feb 1 04:58:25 localhost nova_compute[274651]: 2026-02-01 09:58:25.617 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:25 localhost ovn_controller[152492]: 2026-02-01T09:58:25Z|00408|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:58:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e186 do_prune osdmap full prune enabled
Feb 1 04:58:25 localhost nova_compute[274651]: 2026-02-01 09:58:25.804 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e187 e187: 6 total, 6 up, 6 in
Feb 1 04:58:25 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in
Feb 1 04:58:26 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:26.399 2 INFO neutron.agent.securitygroups_rpc [None req-b01da263-4ebd-4a0d-81ec-4ece56b6a941 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:58:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e187 do_prune osdmap full prune enabled
Feb 1 04:58:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e188 e188: 6 total, 6 up, 6 in
Feb 1 04:58:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in
Feb 1 04:58:27 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:27.355 2 INFO neutron.agent.securitygroups_rpc [None req-6b687509-4ccb-4206-8762-0407780a338c d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:27 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:27.390 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:27 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:27.407 259320 INFO neutron.agent.linux.ip_lib [None req-cdc037f8-375f-457e-9a1b-13d84982f50f - - - - - -] Device tapae5f52d1-77 cannot be used as it has no MAC address
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.439 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost kernel: device tapae5f52d1-77 entered promiscuous mode
Feb 1 04:58:27 localhost NetworkManager[5964]: [1769939907.4465] manager: (tapae5f52d1-77): new Generic device (/org/freedesktop/NetworkManager/Devices/70)
Feb 1 04:58:27 localhost ovn_controller[152492]: 2026-02-01T09:58:27Z|00409|binding|INFO|Claiming lport ae5f52d1-7777-4ecc-9292-53de2abde06b for this chassis.
Feb 1 04:58:27 localhost systemd-udevd[322638]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:58:27 localhost ovn_controller[152492]: 2026-02-01T09:58:27Z|00410|binding|INFO|ae5f52d1-7777-4ecc-9292-53de2abde06b: Claiming unknown
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.451 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:27.469 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-88572dd7-5fdb-4f6c-9282-3f1605b07176', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88572dd7-5fdb-4f6c-9282-3f1605b07176', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64ed0b11-edb8-4792-9d9c-729b1ce88a71, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ae5f52d1-7777-4ecc-9292-53de2abde06b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:27.471 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ae5f52d1-7777-4ecc-9292-53de2abde06b in datapath 88572dd7-5fdb-4f6c-9282-3f1605b07176 bound to our chassis
Feb 1 04:58:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:27.472 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88572dd7-5fdb-4f6c-9282-3f1605b07176 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:58:27 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:27.473 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[d053d59d-65cb-481c-b478-d44027d7ef53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.488 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost ovn_controller[152492]: 2026-02-01T09:58:27Z|00411|binding|INFO|Setting lport ae5f52d1-7777-4ecc-9292-53de2abde06b ovn-installed in OVS
Feb 1 04:58:27 localhost ovn_controller[152492]: 2026-02-01T09:58:27Z|00412|binding|INFO|Setting lport ae5f52d1-7777-4ecc-9292-53de2abde06b up in Southbound
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.494 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.496 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.536 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost nova_compute[274651]: 2026-02-01 09:58:27.567 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:27 localhost dnsmasq[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/addn_hosts - 0 addresses
Feb 1 04:58:27 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/host
Feb 1 04:58:27 localhost dnsmasq-dhcp[322414]: read /var/lib/neutron/dhcp/75d12671-b626-4593-bbf3-e0e6d78cb68e/opts
Feb 1 04:58:27 localhost podman[322668]: 2026-02-01 09:58:27.726820194 +0000 UTC m=+0.046925263 container kill 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 1 04:58:27 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:27.852 2 INFO neutron.agent.securitygroups_rpc [None req-8578e386-0bfd-4f36-a9ea-ecec11376e1e d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:28 localhost nova_compute[274651]: 2026-02-01 09:58:28.026 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:28 localhost ovn_controller[152492]: 2026-02-01T09:58:28Z|00413|binding|INFO|Releasing lport 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 from this chassis (sb_readonly=0)
Feb 1 04:58:28 localhost ovn_controller[152492]: 2026-02-01T09:58:28Z|00414|binding|INFO|Setting lport 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 down in Southbound
Feb 1 04:58:28 localhost kernel: device tap9cffcb83-bd left promiscuous mode
Feb 1 04:58:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:28.037 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-75d12671-b626-4593-bbf3-e0e6d78cb68e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-75d12671-b626-4593-bbf3-e0e6d78cb68e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c4191b-8d66-4b0b-81a5-876a8bdd10e7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9cffcb83-bd08-487f-bdb0-c3f67dc8f850) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:28.039 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 9cffcb83-bd08-487f-bdb0-c3f67dc8f850 in datapath 75d12671-b626-4593-bbf3-e0e6d78cb68e unbound from our chassis
Feb 1 04:58:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:28.042 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 75d12671-b626-4593-bbf3-e0e6d78cb68e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:28.043 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[8a23849b-271b-494f-b8a2-9cf366e39f27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:28 localhost nova_compute[274651]: 2026-02-01 09:58:28.048 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e188 do_prune osdmap full prune enabled
Feb 1 04:58:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e189 e189: 6 total, 6 up, 6 in
Feb 1 04:58:28 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in
Feb 1 04:58:28 localhost podman[322731]:
Feb 1 04:58:28 localhost podman[322731]: 2026-02-01 09:58:28.596240186 +0000 UTC m=+0.084537381 container create 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:58:28 localhost systemd[1]: Started libpod-conmon-0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a.scope.
Feb 1 04:58:28 localhost systemd[1]: Started libcrun container.
Feb 1 04:58:28 localhost podman[322731]: 2026-02-01 09:58:28.555636087 +0000 UTC m=+0.043933312 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:58:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb629f11489785a2537ca369a0e62b4334e75e871708469c9430f8481c27c6ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:58:28 localhost podman[322731]: 2026-02-01 09:58:28.667425925 +0000 UTC m=+0.155723110 container init 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 1 04:58:28 localhost podman[322731]: 2026-02-01 09:58:28.67406518 +0000 UTC m=+0.162362365 container start 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 1 04:58:28 localhost dnsmasq[322749]: started, version 2.85 cachesize 150
Feb 1 04:58:28 localhost dnsmasq[322749]: DNS service limited to local subnets
Feb 1 04:58:28 localhost dnsmasq[322749]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:58:28 localhost dnsmasq[322749]: warning: no upstream servers configured
Feb 1 04:58:28 localhost dnsmasq-dhcp[322749]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:58:28 localhost dnsmasq[322749]: read /var/lib/neutron/dhcp/88572dd7-5fdb-4f6c-9282-3f1605b07176/addn_hosts - 0 addresses
Feb 1 04:58:28 localhost dnsmasq-dhcp[322749]: read /var/lib/neutron/dhcp/88572dd7-5fdb-4f6c-9282-3f1605b07176/host
Feb 1 04:58:28 localhost dnsmasq-dhcp[322749]: read /var/lib/neutron/dhcp/88572dd7-5fdb-4f6c-9282-3f1605b07176/opts
Feb 1 04:58:28 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:28.788 2 INFO neutron.agent.securitygroups_rpc [None req-c10ec1b0-3019-415a-9281-06c26de3609b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:28.826 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:28.835 259320 INFO neutron.agent.dhcp.agent [None req-b55783df-f812-46d3-a082-025724ce66ca - - - - - -] DHCP configuration for ports {'e9089d4e-c9d4-47c1-a3f4-62bb78fb488d'} is completed
Feb 1 04:58:28 localhost dnsmasq[322414]: exiting on receipt of SIGTERM
Feb 1 04:58:28 localhost podman[322767]: 2026-02-01 09:58:28.856816441 +0000 UTC m=+0.046963496 container kill 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:58:28 localhost systemd[1]: libpod-45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39.scope: Deactivated successfully.
Feb 1 04:58:28 localhost podman[322780]: 2026-02-01 09:58:28.915940859 +0000 UTC m=+0.044862871 container died 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 04:58:28 localhost podman[322780]: 2026-02-01 09:58:28.946277412 +0000 UTC m=+0.075199404 container cleanup 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:58:28 localhost systemd[1]: libpod-conmon-45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39.scope: Deactivated successfully.
Feb 1 04:58:29 localhost podman[322782]: 2026-02-01 09:58:29.002147701 +0000 UTC m=+0.125038128 container remove 45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-75d12671-b626-4593-bbf3-e0e6d78cb68e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:58:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:29.205 259320 INFO neutron.agent.dhcp.agent [None req-6a7afacb-1ead-4e2a-a996-c8229ffb9c70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:29.275 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:29 localhost systemd[1]: var-lib-containers-storage-overlay-9eac99705920dc2e8ebccfa393efb0abec1161b2192dfd704ad9842b2d17956c-merged.mount: Deactivated successfully.
Feb 1 04:58:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45fcefba934a31e34f1d5c444463acd9d7b5883e9f28d388dcb4bf6bd6f5fb39-userdata-shm.mount: Deactivated successfully.
Feb 1 04:58:29 localhost systemd[1]: run-netns-qdhcp\x2d75d12671\x2db626\x2d4593\x2dbbf3\x2de0e6d78cb68e.mount: Deactivated successfully.
Feb 1 04:58:29 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:29.636 2 INFO neutron.agent.securitygroups_rpc [None req-53541160-40c7-461f-a788-c0d63f1c152b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:29 localhost ovn_controller[152492]: 2026-02-01T09:58:29Z|00415|binding|INFO|Removing iface tapae5f52d1-77 ovn-installed in OVS
Feb 1 04:58:29 localhost ovn_controller[152492]: 2026-02-01T09:58:29Z|00416|binding|INFO|Removing lport ae5f52d1-7777-4ecc-9292-53de2abde06b ovn-installed in OVS
Feb 1 04:58:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:29.976 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 9b34e989-e917-46e2-8354-24f8a08a3a40 with type ""
Feb 1 04:58:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:29.977 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-88572dd7-5fdb-4f6c-9282-3f1605b07176', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88572dd7-5fdb-4f6c-9282-3f1605b07176', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64ed0b11-edb8-4792-9d9c-729b1ce88a71, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ae5f52d1-7777-4ecc-9292-53de2abde06b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:29 localhost nova_compute[274651]: 2026-02-01 09:58:29.978 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:29.980 158365 INFO neutron.agent.ovn.metadata.agent [-] Port ae5f52d1-7777-4ecc-9292-53de2abde06b in datapath 88572dd7-5fdb-4f6c-9282-3f1605b07176 unbound from our chassis
Feb 1 04:58:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:29.982 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88572dd7-5fdb-4f6c-9282-3f1605b07176, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:29 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:29.983 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0d0b8c-6f36-41c3-9b9c-9cd31c8b6bd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:29 localhost nova_compute[274651]: 2026-02-01 09:58:29.984 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:29 localhost nova_compute[274651]: 2026-02-01 09:58:29.996 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:29 localhost kernel: device tapae5f52d1-77 left promiscuous mode
Feb 1 04:58:30 localhost nova_compute[274651]: 2026-02-01 09:58:30.011 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.042 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 04:58:30 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 04:58:30 localhost ovn_controller[152492]: 2026-02-01T09:58:30Z|00417|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:58:30 localhost podman[322826]: 2026-02-01 09:58:30.431312238 +0000 UTC m=+0.048232505 container kill 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:58:30 localhost dnsmasq[322749]: read /var/lib/neutron/dhcp/88572dd7-5fdb-4f6c-9282-3f1605b07176/addn_hosts - 0 addresses
Feb 1 04:58:30 localhost dnsmasq-dhcp[322749]: read /var/lib/neutron/dhcp/88572dd7-5fdb-4f6c-9282-3f1605b07176/host
Feb 1 04:58:30 localhost dnsmasq-dhcp[322749]: read /var/lib/neutron/dhcp/88572dd7-5fdb-4f6c-9282-3f1605b07176/opts
Feb 1 04:58:30 localhost nova_compute[274651]: 2026-02-01 09:58:30.445 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent [None req-12fe4a02-f7c1-4ecb-b7ee-f9d35a13034c - - - - - -] Unable to reload_allocations dhcp for 88572dd7-5fdb-4f6c-9282-3f1605b07176.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapae5f52d1-77 not found in namespace qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176.
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name,
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent return fut.result()
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent return self.__get_result()
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent raise self._exception
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs)
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs,
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2])
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapae5f52d1-77 not found in namespace qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176.
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.458 259320 ERROR neutron.agent.dhcp.agent #033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.464 259320 INFO neutron.agent.dhcp.agent [None req-47580ec0-0079-4dad-8479-09f34f96e55e - - - - - -] Synchronizing state#033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.653 259320 INFO neutron.agent.dhcp.agent [None req-4ebb1c37-0bc2-4546-8a27-66344f59e1bc - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.653 259320 INFO neutron.agent.dhcp.agent [-] Starting network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f dhcp configuration#033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.654 259320 INFO neutron.agent.dhcp.agent [-] Finished network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f dhcp configuration#033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.654 259320 INFO neutron.agent.dhcp.agent [-] Starting network 88572dd7-5fdb-4f6c-9282-3f1605b07176 dhcp configuration#033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.654 259320 INFO neutron.agent.dhcp.agent [-] Finished network 88572dd7-5fdb-4f6c-9282-3f1605b07176 dhcp configuration#033[00m Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.655 259320 INFO neutron.agent.dhcp.agent [None req-4ebb1c37-0bc2-4546-8a27-66344f59e1bc - - - - - -] Synchronizing state complete#033[00m Feb 1 04:58:30 localhost nova_compute[274651]: 2026-02-01 09:58:30.655 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:30 localhost dnsmasq[322749]: exiting on receipt of SIGTERM Feb 1 04:58:30 localhost podman[322856]: 2026-02-01 09:58:30.828015958 +0000 UTC m=+0.059931713 container kill 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:58:30 localhost systemd[1]: libpod-0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a.scope: Deactivated successfully. 
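Editor's note: the traceback above shows the characteristic oslo.privsep failure path: the privileged call fails inside the privsep daemon, tenacity retries it, and the client side re-raises the serialized exception via "raise exc_type(*result[2])" (oslo_privsep/daemon.py:215). The Python sketch below imitates that round trip; the (msgid, success, payload) reply layout and the stub exception class are illustrative assumptions, not the actual oslo.privsep wire format.

import tenacity

# Hypothetical stand-in for neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound.
class NetworkInterfaceNotFound(RuntimeError):
    pass

def remote_call(reply):
    """Re-raise a daemon-side exception on the client, mimicking the
    ``raise exc_type(*result[2])`` line from the traceback above.
    The (msgid, success, payload) layout here is illustrative only."""
    msgid, success, payload = reply
    if success:
        return payload
    exc_type, exc_args = payload
    raise exc_type(*exc_args)

@tenacity.retry(
    retry=tenacity.retry_if_exception_type(NetworkInterfaceNotFound),
    stop=tenacity.stop_after_attempt(3),
    wait=tenacity.wait_fixed(0.1),
    reraise=True,  # surface the final exception instead of tenacity.RetryError
)
def list_ip_routes_stub():
    # Each attempt replays a failing privsep reply; after the last attempt
    # tenacity re-raises it via fut.result(), exactly the frames seen above.
    return remote_call((1, False, (NetworkInterfaceNotFound,
                                   ("Network interface tapae5f52d1-77 not found",))))

if __name__ == "__main__":
    try:
        list_ip_routes_stub()
    except NetworkInterfaceNotFound as exc:
        print(f"gave up after retries: {exc}")

This is why the agent log shows the same NetworkInterfaceNotFound message twice: once as the summary line and once as the terminal frame of the re-raised, retried call.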
Feb 1 04:58:30 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:30.842 2 INFO neutron.agent.securitygroups_rpc [None req-7e2c05f6-614a-4970-b06a-02bbc809b5c5 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:30 localhost podman[322869]: 2026-02-01 09:58:30.903137209 +0000 UTC m=+0.094680803 container died 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:58:30 localhost systemd[1]: tmp-crun.G9nlrw.mount: Deactivated successfully.
Feb 1 04:58:30 localhost podman[322869]: 2026-02-01 09:58:30.940403506 +0000 UTC m=+0.094680803 container cleanup 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 04:58:30 localhost systemd[1]: libpod-conmon-0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a.scope: Deactivated successfully.
Feb 1 04:58:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:30.981 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:30 localhost podman[322870]: 2026-02-01 09:58:30.983316936 +0000 UTC m=+0.134462987 container remove 0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-88572dd7-5fdb-4f6c-9282-3f1605b07176, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:58:31 localhost openstack_network_exporter[239441]: ERROR 09:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:58:31 localhost openstack_network_exporter[239441]:
Feb 1 04:58:31 localhost openstack_network_exporter[239441]: ERROR 09:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:58:31 localhost openstack_network_exporter[239441]:
Feb 1 04:58:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:58:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e189 do_prune osdmap full prune enabled
Feb 1 04:58:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e190 e190: 6 total, 6 up, 6 in
Feb 1 04:58:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in
Feb 1 04:58:31 localhost systemd[1]: var-lib-containers-storage-overlay-cb629f11489785a2537ca369a0e62b4334e75e871708469c9430f8481c27c6ea-merged.mount: Deactivated successfully.
Feb 1 04:58:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0abbd6b2ace71427d3881ab34cbb6ab40256281ad60a5212be1dc262520aba2a-userdata-shm.mount: Deactivated successfully.
Feb 1 04:58:31 localhost systemd[1]: run-netns-qdhcp\x2d88572dd7\x2d5fdb\x2d4f6c\x2d9282\x2d3f1605b07176.mount: Deactivated successfully.
Feb 1 04:58:32 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:32.245 2 INFO neutron.agent.securitygroups_rpc [None req-29619353-dbda-49ae-a09a-12a5be7ce5b3 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['179c1cf2-2f2b-4c28-9577-447c415ef292']
Feb 1 04:58:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 04:58:33 localhost podman[322897]: 2026-02-01 09:58:33.729621884 +0000 UTC m=+0.085538452 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:58:33 localhost podman[322897]: 2026-02-01 09:58:33.743384887 +0000 UTC m=+0.099301425 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 04:58:33 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
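Editor's note: the "Started /usr/bin/podman healthcheck run <id>" / "container health_status ... health_status=healthy" / "exec_died" / "Deactivated successfully" sequences above are transient systemd units driving podman's built-in health check. "podman healthcheck run" is a real podman subcommand that exits 0 when the container is healthy and non-zero otherwise; the small wrapper below is an illustrative way to drive the same check from Python, not part of the deployed tooling.

import subprocess

def container_is_healthy(container_id):
    # Invokes the same CLI the systemd healthcheck units run above;
    # exit status 0 means the container's configured test passed.
    result = subprocess.run(
        ["podman", "healthcheck", "run", container_id],
        capture_output=True, text=True, check=False,
    )
    return result.returncode == 0

# e.g. container_is_healthy("06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d")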
Feb 1 04:58:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 04:58:33 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 04:58:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:33.922 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:33 localhost nova_compute[274651]: 2026-02-01 09:58:33.922 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:33.923 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 1 04:58:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:34.772 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff54b909-b3b9-4669-8851-459606a86b19) old=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:34.775 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff54b909-b3b9-4669-8851-459606a86b19 in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 updated
Feb 1 04:58:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:34.778 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:34 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:34.779 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f3577d5e-8a8c-4511-92fa-c2ae15f640d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:35 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:35.403 2 INFO neutron.agent.securitygroups_rpc [None req-dcb1a11b-934a-4ada-b696-4f65471e82a7 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:35 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:35.433 259320 INFO neutron.agent.linux.ip_lib [None req-44ccfd83-2a06-4b8f-9ee9-819108fcab3c - - - - - -] Device tap467a2044-12 cannot be used as it has no MAC address
Feb 1 04:58:35 localhost nova_compute[274651]: 2026-02-01 09:58:35.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:35 localhost kernel: device tap467a2044-12 entered promiscuous mode
Feb 1 04:58:35 localhost nova_compute[274651]: 2026-02-01 09:58:35.472 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:35 localhost NetworkManager[5964]: [1769939915.4748] manager: (tap467a2044-12): new Generic device (/org/freedesktop/NetworkManager/Devices/71)
Feb 1 04:58:35 localhost ovn_controller[152492]: 2026-02-01T09:58:35Z|00418|binding|INFO|Claiming lport 467a2044-1273-4e6f-a459-60513fa8b688 for this chassis.
Feb 1 04:58:35 localhost ovn_controller[152492]: 2026-02-01T09:58:35Z|00419|binding|INFO|467a2044-1273-4e6f-a459-60513fa8b688: Claiming unknown
Feb 1 04:58:35 localhost systemd-udevd[322929]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:58:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:35.488 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f0b901-aef5-455a-bf56-ef7db3227583, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=467a2044-1273-4e6f-a459-60513fa8b688) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:35.490 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 467a2044-1273-4e6f-a459-60513fa8b688 in datapath ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a bound to our chassis
Feb 1 04:58:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:35.495 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 206c88e3-e1a5-42ad-a58d-281082fa1fb3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 1 04:58:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:35.496 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:35 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:35.497 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[efbac2c7-b29c-47fe-88c6-e8f4d13ed0f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost ovn_controller[152492]: 2026-02-01T09:58:35Z|00420|binding|INFO|Setting lport 467a2044-1273-4e6f-a459-60513fa8b688 ovn-installed in OVS
Feb 1 04:58:35 localhost ovn_controller[152492]: 2026-02-01T09:58:35Z|00421|binding|INFO|Setting lport 467a2044-1273-4e6f-a459-60513fa8b688 up in Southbound
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost nova_compute[274651]: 2026-02-01 09:58:35.510 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost journal[217584]: ethtool ioctl error on tap467a2044-12: No such device
Feb 1 04:58:35 localhost nova_compute[274651]: 2026-02-01 09:58:35.611 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:35 localhost nova_compute[274651]: 2026-02-01 09:58:35.634 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:35 localhost nova_compute[274651]: 2026-02-01 09:58:35.660 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e190 do_prune osdmap full prune enabled
Feb 1 04:58:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e191 e191: 6 total, 6 up, 6 in
Feb 1 04:58:35 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in
Feb 1 04:58:35 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:35.861 2 INFO neutron.agent.securitygroups_rpc [None req-76985e45-05b5-4e3d-9a6f-ec3725c3f7a9 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['179c1cf2-2f2b-4c28-9577-447c415ef292', 'c3da8dd5-026e-4efc-b2ae-7f08a7679dbe']
Feb 1 04:58:35 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:35.867 2 INFO neutron.agent.securitygroups_rpc [None req-f5224dd1-77ce-42ca-881a-3d11d7bd93a9 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:36 localhost podman[323000]:
Feb 1 04:58:36 localhost podman[323000]: 2026-02-01 09:58:36.559667078 +0000 UTC m=+0.086365006 container create d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:58:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 04:58:36 localhost systemd[1]: Started libpod-conmon-d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59.scope.
Feb 1 04:58:36 localhost systemd[1]: Started libcrun container.
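Editor's note: the "Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', ...)" DEBUG lines above are emitted by ovsdbapp's row-event matcher (ovsdbapp/backend/ovs_idl/event.py:43). A minimal sketch of that RowEvent pattern follows; the match filter and the commented registration call are illustrative assumptions rather than neutron's exact code.

from ovsdbapp.backend.ovs_idl import event as row_event

class PortBindingUpdatedEvent(row_event.RowEvent):
    """Fires on Port_Binding updates, like the matches logged above."""

    def __init__(self):
        # (events, table, conditions) mirrors the repr in the DEBUG lines:
        # PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None)
        super().__init__((self.ROW_UPDATE,), 'Port_Binding', None)
        self.event_name = 'PortBindingUpdatedEvent'

    def match_fn(self, event, row, old):
        # Only react when the 'chassis' column changed; 'chassis' is a real
        # Port_Binding column, but this particular filter is an illustrative choice.
        return hasattr(old, 'chassis')

    def run(self, event, row, old):
        print(f"Port {row.logical_port} binding changed")

# Registration is deployment-specific; with an ovsdbapp connection it is
# typically along the lines of:
#   idl.notify_handler.watch_event(PortBindingUpdatedEvent())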
Feb 1 04:58:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/744ff171323d0a729cd92183d43c5c80ae24fc9b8f7475f722dd94deff9f45e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:58:36 localhost podman[323000]: 2026-02-01 09:58:36.519484702 +0000 UTC m=+0.046182650 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:58:36 localhost podman[323000]: 2026-02-01 09:58:36.621399818 +0000 UTC m=+0.148097756 container init d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 1 04:58:36 localhost podman[323000]: 2026-02-01 09:58:36.63057314 +0000 UTC m=+0.157271078 container start d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:58:36 localhost dnsmasq[323026]: started, version 2.85 cachesize 150
Feb 1 04:58:36 localhost dnsmasq[323026]: DNS service limited to local subnets
Feb 1 04:58:36 localhost dnsmasq[323026]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:58:36 localhost dnsmasq[323026]: warning: no upstream servers configured
Feb 1 04:58:36 localhost dnsmasq-dhcp[323026]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:58:36 localhost dnsmasq[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/addn_hosts - 0 addresses
Feb 1 04:58:36 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/host
Feb 1 04:58:36 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/opts
Feb 1 04:58:36 localhost podman[323014]: 2026-02-01 09:58:36.693100462 +0000 UTC m=+0.093499166 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:58:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:36.700 259320 INFO neutron.agent.dhcp.agent [None req-bedca48a-e3ee-4d15-9f8f-4ad49299a12a - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:35Z, description=, device_id=0d3b238e-083b-4f7c-8e29-650b41019987, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5739cb7f-6dd6-4fda-a4a9-64adbd74224d, ip_allocation=immediate, mac_address=fa:16:3e:d5:ee:11, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:30Z, description=, dns_domain=, id=ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1623608902, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34898, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2515, status=ACTIVE, subnets=['c64e0c95-d220-4c03-bb62-d5a0c42c47c0'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:33Z, vlan_transparent=None, network_id=ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2544, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:35Z on network ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a
Feb 1 04:58:36 localhost podman[323014]: 2026-02-01 09:58:36.73042782 +0000 UTC m=+0.130826494 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 1 04:58:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:58:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e191 do_prune osdmap full prune enabled
Feb 1 04:58:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e192 e192: 6 total, 6 up, 6 in
Feb 1 04:58:36 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 04:58:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in
Feb 1 04:58:36 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:36.831 259320 INFO neutron.agent.dhcp.agent [None req-1ec2c175-5380-49af-a084-92c10581047e - - - - - -] DHCP configuration for ports {'e1f52dc9-171d-416a-a945-3009dce94ba4'} is completed
Feb 1 04:58:36 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:36.934 2 INFO neutron.agent.securitygroups_rpc [None req-1e2fb4b2-ba87-4ef5-8fe2-a6ebe885c08a 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['c3da8dd5-026e-4efc-b2ae-7f08a7679dbe']
Feb 1 04:58:36 localhost dnsmasq[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/addn_hosts - 1 addresses
Feb 1 04:58:36 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/host
Feb 1 04:58:36 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/opts
Feb 1 04:58:36 localhost podman[323054]: 2026-02-01 09:58:36.938691846 +0000 UTC m=+0.068815968 container kill d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 1 04:58:37 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:37.199 259320 INFO neutron.agent.dhcp.agent [None req-91455335-b10b-4bf9-a980-6692f8c4f9d5 - - - - - -] DHCP configuration for ports {'5739cb7f-6dd6-4fda-a4a9-64adbd74224d'} is completed
Feb 1 04:58:37 localhost systemd[1]: tmp-crun.owln2V.mount: Deactivated successfully.
Feb 1 04:58:37 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:37.590 2 INFO neutron.agent.securitygroups_rpc [None req-9cffbc9d-d1cb-4c4b-934e-47effaf28296 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:37 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:37.925 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 1 04:58:38 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:38.211 2 INFO neutron.agent.securitygroups_rpc [None req-901cd980-7a05-470c-b831-e021c0b0d3ee d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:38.239 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:35Z, description=, device_id=0d3b238e-083b-4f7c-8e29-650b41019987, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5739cb7f-6dd6-4fda-a4a9-64adbd74224d, ip_allocation=immediate, mac_address=fa:16:3e:d5:ee:11, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:30Z, description=, dns_domain=, id=ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1623608902, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34898, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2515, status=ACTIVE, subnets=['c64e0c95-d220-4c03-bb62-d5a0c42c47c0'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:33Z, vlan_transparent=None, network_id=ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2544, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:35Z on network ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a
Feb 1 04:58:38 localhost dnsmasq[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/addn_hosts - 1 addresses
Feb 1 04:58:38 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/host
Feb 1 04:58:38 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/opts
Feb 1 04:58:38 localhost podman[323092]: 2026-02-01 09:58:38.461448442 +0000 UTC m=+0.067137666 container kill d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:58:38 localhost nova_compute[274651]: 2026-02-01 09:58:38.821 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e192 do_prune osdmap full prune enabled
Feb 1 04:58:38 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:38.879 259320 INFO neutron.agent.dhcp.agent [None req-e3b73edf-4f45-403f-bdc8-f0cf4d6a0dc2 - - - - - -] DHCP configuration for ports {'5739cb7f-6dd6-4fda-a4a9-64adbd74224d'} is completed
Feb 1 04:58:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e193 e193: 6 total, 6 up, 6 in
Feb 1 04:58:38 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e193: 6 total, 6 up, 6 in
Feb 1 04:58:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e193 do_prune osdmap full prune enabled
Feb 1 04:58:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e194 e194: 6 total, 6 up, 6 in
Feb 1 04:58:39 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in
Feb 1 04:58:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 04:58:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.016 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff54b909-b3b9-4669-8851-459606a86b19) old=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.019 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff54b909-b3b9-4669-8851-459606a86b19 in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 updated
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.022 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.024 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[8643bb1a-5db7-4381-bb51-49927dd0da41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:40 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:40.240 259320 INFO neutron.agent.linux.ip_lib [None req-c4bb9409-de04-4414-b6b9-6d68c2f62ab2 - - - - - -] Device tap648e1961-9f cannot be used as it has no MAC address
Feb 1 04:58:40 localhost nova_compute[274651]: 2026-02-01 09:58:40.267 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:40 localhost kernel: device tap648e1961-9f entered promiscuous mode
Feb 1 04:58:40 localhost NetworkManager[5964]: [1769939920.2794] manager: (tap648e1961-9f): new Generic device (/org/freedesktop/NetworkManager/Devices/72)
Feb 1 04:58:40 localhost nova_compute[274651]: 2026-02-01 09:58:40.282 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:40 localhost ovn_controller[152492]: 2026-02-01T09:58:40Z|00422|binding|INFO|Claiming lport 648e1961-9f78-4c28-86a3-0554c4b51949 for this chassis.
Feb 1 04:58:40 localhost ovn_controller[152492]: 2026-02-01T09:58:40Z|00423|binding|INFO|648e1961-9f78-4c28-86a3-0554c4b51949: Claiming unknown
Feb 1 04:58:40 localhost systemd-udevd[323208]: Network interface NamePolicy= disabled on kernel command line.
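Editor's note: the "Running txn n=1 command(idx=0): DbSetCommand(... table=Chassis_Private, ... {'neutron:ovn-metadata-sb-cfg': '15'} ...)" entry at 09:58:37.925 is the metadata agent acknowledging nb_cfg 15 back to the OVN southbound database. A sketch of the equivalent write using ovsdbapp's db_set follows; it assumes an already-connected southbound API object (connection setup and the if_exists handling seen in the log are omitted).

def ack_sb_cfg(api, chassis_private_uuid, nb_cfg):
    # Merge one key into the external_ids map column of Chassis_Private,
    # mirroring the DbSetCommand shown in the transaction DEBUG line above.
    with api.transaction(check_error=True) as txn:
        txn.add(api.db_set(
            'Chassis_Private',
            chassis_private_uuid,
            ('external_ids', {'neutron:ovn-metadata-sb-cfg': str(nb_cfg)}),
        ))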
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.304 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ff40dee-b44e-4301-85db-f735196cc670, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=648e1961-9f78-4c28-86a3-0554c4b51949) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.306 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 648e1961-9f78-4c28-86a3-0554c4b51949 in datapath cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f bound to our chassis
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.307 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:58:40 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:40.309 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ae07114f-4fab-48a7-9ae3-8949aa3e18b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost ovn_controller[152492]: 2026-02-01T09:58:40Z|00424|binding|INFO|Setting lport 648e1961-9f78-4c28-86a3-0554c4b51949 ovn-installed in OVS
Feb 1 04:58:40 localhost ovn_controller[152492]: 2026-02-01T09:58:40Z|00425|binding|INFO|Setting lport 648e1961-9f78-4c28-86a3-0554c4b51949 up in Southbound
Feb 1 04:58:40 localhost nova_compute[274651]: 2026-02-01 09:58:40.329 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost journal[217584]: ethtool ioctl error on tap648e1961-9f: No such device
Feb 1 04:58:40 localhost nova_compute[274651]: 2026-02-01 09:58:40.371 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:40 localhost nova_compute[274651]: 2026-02-01 09:58:40.403 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:40 localhost nova_compute[274651]: 2026-02-01 09:58:40.699 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:40 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 04:58:40 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:58:41 localhost podman[323279]:
Feb 1 04:58:41 localhost podman[323279]: 2026-02-01 09:58:41.267400386 +0000 UTC m=+0.091816456 container create 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 1 04:58:41 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:41.285 2 INFO neutron.agent.securitygroups_rpc [None req-74812456-6c53-4459-bad0-4aac2b18e077 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 04:58:41 localhost systemd[1]: Started libpod-conmon-6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f.scope.
Feb 1 04:58:41 localhost podman[323279]: 2026-02-01 09:58:41.223477544 +0000 UTC m=+0.047893634 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:58:41 localhost systemd[1]: tmp-crun.fKbz5R.mount: Deactivated successfully.
Feb 1 04:58:41 localhost systemd[1]: Started libcrun container.
Feb 1 04:58:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/349c1f452165026f1819f14509e16042c99113d0aa39997091890b70de8742c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:58:41 localhost podman[323279]: 2026-02-01 09:58:41.379548225 +0000 UTC m=+0.203964295 container init 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 1 04:58:41 localhost dnsmasq[323310]: started, version 2.85 cachesize 150
Feb 1 04:58:41 localhost dnsmasq[323310]: DNS service limited to local subnets
Feb 1 04:58:41 localhost dnsmasq[323310]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:58:41 localhost dnsmasq[323310]: warning: no upstream servers configured
Feb 1 04:58:41 localhost dnsmasq-dhcp[323310]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 04:58:41 localhost dnsmasq[323310]: read /var/lib/neutron/dhcp/cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f/addn_hosts - 0 addresses
Feb 1 04:58:41 localhost dnsmasq-dhcp[323310]: read /var/lib/neutron/dhcp/cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f/host
Feb 1 04:58:41 localhost dnsmasq-dhcp[323310]: read /var/lib/neutron/dhcp/cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f/opts
Feb 1 04:58:41 localhost podman[323293]: 2026-02-01 09:58:41.422746343 +0000 UTC m=+0.104384301 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 04:58:41 localhost podman[323279]: 2026-02-01 09:58:41.440489859 +0000 UTC m=+0.264905919 container start 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 1 04:58:41 localhost podman[323293]: 2026-02-01 09:58:41.457854603 +0000 UTC m=+0.139492571 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 04:58:41 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 04:58:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:41.675 259320 INFO neutron.agent.dhcp.agent [None req-0943bdb0-ce42-4f20-84de-16beffebd60d - - - - - -] DHCP configuration for ports {'e8b618d8-2cc9-442b-83df-44a4c526bb93'} is completed
Feb 1 04:58:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 04:58:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.720 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.721 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.722 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:58:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:58:41 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:41.744 259320 INFO neutron.agent.linux.ip_lib [None req-92646cdc-2116-49c2-a964-320c48ebeea3 - - - - - -] Device tapbf7d0d2e-18 cannot be used as it has no MAC address
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.801 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost kernel: device tapbf7d0d2e-18 entered promiscuous mode
Feb 1 04:58:41 localhost NetworkManager[5964]: [1769939921.8095] manager: (tapbf7d0d2e-18): new Generic device (/org/freedesktop/NetworkManager/Devices/73)
Feb 1 04:58:41 localhost systemd-udevd[323210]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.811 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost ovn_controller[152492]: 2026-02-01T09:58:41Z|00426|binding|INFO|Claiming lport bf7d0d2e-188f-4e02-9010-c6fdd32fa535 for this chassis.
Feb 1 04:58:41 localhost ovn_controller[152492]: 2026-02-01T09:58:41Z|00427|binding|INFO|bf7d0d2e-188f-4e02-9010-c6fdd32fa535: Claiming unknown
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.822 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-79a6b2f6-ab80-415a-900e-ecc135d5887e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79a6b2f6-ab80-415a-900e-ecc135d5887e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98dfd2fa-bc9d-4f49-9db1-ad3da76c279d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bf7d0d2e-188f-4e02-9010-c6fdd32fa535) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.827 158365 INFO neutron.agent.ovn.metadata.agent [-] Port bf7d0d2e-188f-4e02-9010-c6fdd32fa535 in datapath 79a6b2f6-ab80-415a-900e-ecc135d5887e bound to our chassis
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.829 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 79a6b2f6-ab80-415a-900e-ecc135d5887e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.830 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a831e5-e5ae-440e-b394-790f21d35949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost ovn_controller[152492]: 2026-02-01T09:58:41Z|00428|binding|INFO|Removing iface tap648e1961-9f ovn-installed in OVS
Feb 1 04:58:41 localhost ovn_controller[152492]: 2026-02-01T09:58:41Z|00429|binding|INFO|Removing lport 648e1961-9f78-4c28-86a3-0554c4b51949 ovn-installed in OVS
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.861 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.865 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.862 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 6735bdae-85b5-4f35-9c20-3315ad8ef015 with type ""
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.864 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ff40dee-b44e-4301-85db-f735196cc670, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=648e1961-9f78-4c28-86a3-0554c4b51949) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.867 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 648e1961-9f78-4c28-86a3-0554c4b51949 in datapath cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f unbound from our chassis
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.870 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:41.871 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[99a4c5c2-e7b2-483e-9077-81eeb986819e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost ovn_controller[152492]: 2026-02-01T09:58:41Z|00430|binding|INFO|Setting lport bf7d0d2e-188f-4e02-9010-c6fdd32fa535 ovn-installed in OVS
Feb 1 04:58:41 localhost ovn_controller[152492]: 2026-02-01T09:58:41Z|00431|binding|INFO|Setting lport bf7d0d2e-188f-4e02-9010-c6fdd32fa535 up in Southbound
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.873 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.877 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost kernel: device tap648e1961-9f left promiscuous mode
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost journal[217584]: ethtool ioctl error on tapbf7d0d2e-18: No such device
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.891 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.907 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost nova_compute[274651]: 2026-02-01 09:58:41.935 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e194 do_prune osdmap full prune enabled
Feb 1 04:58:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e195 e195: 6 total, 6 up, 6 in
Feb 1 04:58:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in
Feb 1 04:58:41 localhost podman[323335]: 2026-02-01 09:58:41.966381865 +0000 UTC m=+0.106983842 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:58:42 localhost podman[323335]: 2026-02-01 09:58:42.015603178 +0000 UTC m=+0.156205225 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller)
Feb 1 04:58:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:42.025 2 INFO neutron.agent.securitygroups_rpc [None req-c40bbf53-5b24-4f24-a7f5-e7d11bb1e253 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['2fbf2c39-17e9-4e72-bbbb-e5125197536a']
Feb 1 04:58:42 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 04:58:42 localhost dnsmasq[323310]: read /var/lib/neutron/dhcp/cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f/addn_hosts - 0 addresses
Feb 1 04:58:42 localhost podman[323408]: 2026-02-01 09:58:42.255851078 +0000 UTC m=+0.058469680 container kill 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:58:42 localhost dnsmasq-dhcp[323310]: read /var/lib/neutron/dhcp/cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f/host
Feb 1 04:58:42 localhost dnsmasq-dhcp[323310]: read /var/lib/neutron/dhcp/cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f/opts
Feb 1 04:58:42 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:42.263 2 INFO neutron.agent.securitygroups_rpc [None req-5822cc8d-7dc4-44de-b45f-4af6413189cc d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent [None req-3f14c530-64bf-4e53-9d28-ce9cfe3ef2ce - - - - - -] Unable to reload_allocations dhcp for cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap648e1961-9f not found in namespace qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f.
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name,
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent return fut.result()
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent return self.__get_result()
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent raise self._exception
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs)
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs,
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2])
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap648e1961-9f not found in namespace qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f.
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.285 259320 ERROR neutron.agent.dhcp.agent
Feb 1 04:58:42 localhost ovn_controller[152492]: 2026-02-01T09:58:42Z|00432|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:58:42 localhost nova_compute[274651]: 2026-02-01 09:58:42.549 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 04:58:42 localhost podman[323456]:
Feb 1 04:58:42 localhost podman[323456]: 2026-02-01 09:58:42.754227967 +0000 UTC m=+0.090050831 container create a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 1 04:58:42 localhost systemd[1]: Started libpod-conmon-a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67.scope.
Feb 1 04:58:42 localhost systemd[1]: Started libcrun container.
Feb 1 04:58:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc527cf19d6c83dc4f742fd9a16d8baf00815e9a16c01ab95e91f60ce69b3fb1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:58:42 localhost podman[323456]: 2026-02-01 09:58:42.710587464 +0000 UTC m=+0.046410358 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:58:42 localhost podman[323456]: 2026-02-01 09:58:42.819433582 +0000 UTC m=+0.155256446 container init a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 04:58:42 localhost podman[323456]: 2026-02-01 09:58:42.836867018 +0000 UTC m=+0.172689902 container start a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 1 04:58:42 localhost dnsmasq[323474]: started, version 2.85 cachesize 150
Feb 1 04:58:42 localhost dnsmasq[323474]: DNS service limited to local subnets
Feb 1 04:58:42 localhost dnsmasq[323474]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:58:42 localhost dnsmasq[323474]: warning: no upstream servers configured
Feb 1 04:58:42 localhost dnsmasq-dhcp[323474]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:58:42 localhost dnsmasq[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/addn_hosts - 0 addresses
Feb 1 04:58:42 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/host
Feb 1 04:58:42 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/opts
Feb 1 04:58:42 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:42.899 259320 INFO neutron.agent.dhcp.agent [None req-4ebb1c37-0bc2-4546-8a27-66344f59e1bc - - - - - -] Synchronizing state
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.113 259320 INFO neutron.agent.dhcp.agent [None req-d00fa8d1-c8eb-48cb-97c5-56d1e1e845f6 - - - - - -] DHCP configuration for ports {'d8981779-68a2-43fd-b25d-f086a62f6308'} is completed
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.117 259320 INFO neutron.agent.dhcp.agent [None req-87ebdbdc-2519-4f5a-9a86-0a9286f993fc - - - - - -] All active networks have been fetched through RPC.
Feb 1 04:58:43 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:43.297 2 INFO neutron.agent.securitygroups_rpc [None req-850b62dc-2adf-4cc9-b2ae-b1b62bc1910e d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:43 localhost podman[323490]: 2026-02-01 09:58:43.309103103 +0000 UTC m=+0.055041774 container kill 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:58:43 localhost dnsmasq[323310]: exiting on receipt of SIGTERM
Feb 1 04:58:43 localhost systemd[1]: libpod-6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f.scope: Deactivated successfully.
Feb 1 04:58:43 localhost podman[323504]: 2026-02-01 09:58:43.381719047 +0000 UTC m=+0.056269942 container died 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:58:43 localhost podman[323504]: 2026-02-01 09:58:43.415045932 +0000 UTC m=+0.089596777 container cleanup 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:58:43 localhost systemd[1]: libpod-conmon-6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f.scope: Deactivated successfully.
Feb 1 04:58:43 localhost podman[323505]: 2026-02-01 09:58:43.460250062 +0000 UTC m=+0.127193634 container remove 6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cb91f9e5-0b36-48ac-bba0-7c9b90cd9d7f, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.485 259320 INFO neutron.agent.dhcp.agent [-] Starting network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f dhcp configuration
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.486 259320 INFO neutron.agent.dhcp.agent [-] Finished network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f dhcp configuration
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.488 259320 INFO neutron.agent.dhcp.agent [None req-efd7b8dc-b4a5-49e8-855b-0e9a0906d325 - - - - - -] Synchronizing state complete
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.489 259320 INFO neutron.agent.dhcp.agent [None req-92646cdc-2116-49c2-a964-320c48ebeea3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:40Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8919832d-70d1-4d3e-9345-3cf5f2a2f971, ip_allocation=immediate, mac_address=fa:16:3e:cb:d2:e9, name=tempest-PortsIpV6TestJSON-405273406, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:39Z, description=, dns_domain=, id=79a6b2f6-ab80-415a-900e-ecc135d5887e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-977814798, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31988, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2570, status=ACTIVE, subnets=['a3fee14f-56d1-41e7-a16a-6b89919cd54e'], tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:40Z, vlan_transparent=None, network_id=79a6b2f6-ab80-415a-900e-ecc135d5887e, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['090b75a3-5cce-4012-8bbe-4b851ef442c2'], standard_attr_id=2587, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:41Z on network 79a6b2f6-ab80-415a-900e-ecc135d5887e
Feb 1 04:58:43 localhost dnsmasq[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/addn_hosts - 1 addresses
Feb 1 04:58:43 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/host
Feb 1 04:58:43 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/opts
Feb 1 04:58:43 localhost podman[323548]: 2026-02-01 09:58:43.662569434 +0000 UTC m=+0.060438350 container kill a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.834 259320 INFO neutron.agent.dhcp.agent [None req-92646cdc-2116-49c2-a964-320c48ebeea3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:41Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5505ecda-7e72-4688-b53e-968bf3b6f5ff, ip_allocation=immediate, mac_address=fa:16:3e:24:f4:38, name=tempest-PortsIpV6TestJSON-1240027068, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:39Z, description=, dns_domain=, id=79a6b2f6-ab80-415a-900e-ecc135d5887e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-977814798, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31988, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2570, status=ACTIVE, subnets=['a3fee14f-56d1-41e7-a16a-6b89919cd54e'], tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:40Z, vlan_transparent=None, network_id=79a6b2f6-ab80-415a-900e-ecc135d5887e, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['090b75a3-5cce-4012-8bbe-4b851ef442c2'], standard_attr_id=2590, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:41Z on network 79a6b2f6-ab80-415a-900e-ecc135d5887e
Feb 1 04:58:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:43.953 259320 INFO neutron.agent.dhcp.agent [None req-fb562ad4-2dd9-4a7a-a847-2b21444db2eb - - - - - -] DHCP configuration for ports {'8919832d-70d1-4d3e-9345-3cf5f2a2f971'} is completed
Feb 1 04:58:44 localhost dnsmasq[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/addn_hosts - 2 addresses
Feb 1 04:58:44 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/host
Feb 1 04:58:44 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/opts
Feb 1 04:58:44 localhost podman[323585]: 2026-02-01 09:58:44.036691932 +0000 UTC m=+0.059311546 container kill a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:58:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:44.178 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff54b909-b3b9-4669-8851-459606a86b19) old=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:44.182 158365 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff54b909-b3b9-4669-8851-459606a86b19 in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 updated
Feb 1 04:58:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:44.186 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 04:58:44 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:44.187 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[361fb802-0683-4524-ad10-1ef351961eb6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:44 localhost systemd[1]: var-lib-containers-storage-overlay-349c1f452165026f1819f14509e16042c99113d0aa39997091890b70de8742c8-merged.mount: Deactivated successfully.
Feb 1 04:58:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6bc90628666b0224c4c7356315c3e7e1fdd197eabdec3386ded01436d6e8324f-userdata-shm.mount: Deactivated successfully.
Feb 1 04:58:44 localhost systemd[1]: run-netns-qdhcp\x2dcb91f9e5\x2d0b36\x2d48ac\x2dbba0\x2d7c9b90cd9d7f.mount: Deactivated successfully.
Feb 1 04:58:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:44.288 259320 INFO neutron.agent.dhcp.agent [None req-67f6a895-747e-428a-91e4-983329dc0a1b - - - - - -] DHCP configuration for ports {'5505ecda-7e72-4688-b53e-968bf3b6f5ff'} is completed
Feb 1 04:58:44 localhost dnsmasq[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/addn_hosts - 1 addresses
Feb 1 04:58:44 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/host
Feb 1 04:58:44 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/opts
Feb 1 04:58:44 localhost systemd[1]: tmp-crun.KXsnnx.mount: Deactivated successfully.
Feb 1 04:58:44 localhost podman[323623]: 2026-02-01 09:58:44.411648384 +0000 UTC m=+0.071512590 container kill a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 04:58:44 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:44.422 2 INFO neutron.agent.securitygroups_rpc [None req-1da9737c-8fe2-425a-864f-b783c535a95b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 1 04:58:44 localhost dnsmasq[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/addn_hosts - 0 addresses
Feb 1 04:58:44 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/host
Feb 1 04:58:44 localhost dnsmasq-dhcp[323474]: read /var/lib/neutron/dhcp/79a6b2f6-ab80-415a-900e-ecc135d5887e/opts
Feb 1 04:58:44 localhost podman[323662]: 2026-02-01 09:58:44.729614504 +0000 UTC m=+0.060978297 container kill a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:58:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e195 do_prune osdmap full prune enabled
Feb 1 04:58:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e196 e196: 6 total, 6 up, 6 in
Feb 1 04:58:45 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in
Feb 1 04:58:45 localhost nova_compute[274651]: 2026-02-01 09:58:45.059 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:45 localhost systemd[1]: tmp-crun.DExDKx.mount: Deactivated successfully.
Feb 1 04:58:45 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:45.531 2 INFO neutron.agent.securitygroups_rpc [None req-16f44daa-a956-4040-8610-4ccc01164dad 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['57bcc40e-59d1-4274-96b5-b777fde58e85', '436843c7-c8bd-4657-a6cb-df8e0dddd33a', '2fbf2c39-17e9-4e72-bbbb-e5125197536a']
Feb 1 04:58:45 localhost dnsmasq[323474]: exiting on receipt of SIGTERM
Feb 1 04:58:45 localhost podman[323698]: 2026-02-01 09:58:45.546103746 +0000 UTC m=+0.061410970 container kill a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:58:45 localhost systemd[1]: libpod-a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67.scope: Deactivated successfully.
Feb 1 04:58:45 localhost podman[323711]: 2026-02-01 09:58:45.618705279 +0000 UTC m=+0.061668687 container died a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:58:45 localhost podman[323711]: 2026-02-01 09:58:45.65022728 +0000 UTC m=+0.093190658 container cleanup a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 04:58:45 localhost systemd[1]: libpod-conmon-a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67.scope: Deactivated successfully.
Feb 1 04:58:45 localhost ovn_controller[152492]: 2026-02-01T09:58:45Z|00433|binding|INFO|Removing iface tapbf7d0d2e-18 ovn-installed in OVS
Feb 1 04:58:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:45.697 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0e608f61-36f6-43b9-8bf6-18b483439d91 with type ""
Feb 1 04:58:45 localhost ovn_controller[152492]: 2026-02-01T09:58:45Z|00434|binding|INFO|Removing lport bf7d0d2e-188f-4e02-9010-c6fdd32fa535 ovn-installed in OVS
Feb 1 04:58:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:45.700 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-79a6b2f6-ab80-415a-900e-ecc135d5887e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79a6b2f6-ab80-415a-900e-ecc135d5887e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=98dfd2fa-bc9d-4f49-9db1-ad3da76c279d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bf7d0d2e-188f-4e02-9010-c6fdd32fa535) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:45.702 158365 INFO neutron.agent.ovn.metadata.agent [-] Port bf7d0d2e-188f-4e02-9010-c6fdd32fa535 in datapath 79a6b2f6-ab80-415a-900e-ecc135d5887e unbound from our chassis
Feb 1 04:58:45 localhost podman[323718]: 2026-02-01 09:58:45.70355026 +0000 UTC m=+0.134836529 container remove a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79a6b2f6-ab80-415a-900e-ecc135d5887e, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:58:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:45.704 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 79a6b2f6-ab80-415a-900e-ecc135d5887e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:58:45 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:45.705 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[3a9741f2-6d10-4d95-9f0e-7e381e37e212]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 04:58:45 localhost nova_compute[274651]: 2026-02-01 09:58:45.733 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:45 localhost nova_compute[274651]: 2026-02-01 09:58:45.738 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 04:58:45 localhost nova_compute[274651]: 2026-02-01 09:58:45.746 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:45 localhost kernel: device tapbf7d0d2e-18 left promiscuous mode
Feb 1 04:58:45 localhost nova_compute[274651]: 2026-02-01 09:58:45.761 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:45 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:45.795 259320 INFO neutron.agent.dhcp.agent [None req-f6014b37-f2f7-48d7-a7b3-60394f987d52 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:45 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:45.961 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 04:58:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e196 do_prune osdmap full prune enabled
Feb 1 04:58:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e197 e197: 6 total, 6 up, 6 in
Feb 1 04:58:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in
Feb 1 04:58:46 localhost ovn_controller[152492]: 2026-02-01T09:58:46Z|00435|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:58:46 localhost nova_compute[274651]: 2026-02-01 09:58:46.243 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:46 localhost systemd[1]: var-lib-containers-storage-overlay-dc527cf19d6c83dc4f742fd9a16d8baf00815e9a16c01ab95e91f60ce69b3fb1-merged.mount: Deactivated successfully.
Feb 1 04:58:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6a6855ecef165dd6a7eff3f58ba3ebb2f03a00b749099dd2228b4f31d808f67-userdata-shm.mount: Deactivated successfully.
Feb 1 04:58:46 localhost systemd[1]: run-netns-qdhcp\x2d79a6b2f6\x2dab80\x2d415a\x2d900e\x2decc135d5887e.mount: Deactivated successfully.
Feb 1 04:58:46 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:46.615 2 INFO neutron.agent.securitygroups_rpc [None req-5515d09c-f3a8-4bc8-bfd1-9ff7fafccb05 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['57bcc40e-59d1-4274-96b5-b777fde58e85', '436843c7-c8bd-4657-a6cb-df8e0dddd33a']
Feb 1 04:58:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:58:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e197 do_prune osdmap full prune enabled
Feb 1 04:58:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e198 e198: 6 total, 6 up, 6 in
Feb 1 04:58:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in
Feb 1 04:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:58:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e198 do_prune osdmap full prune enabled
Feb 1 04:58:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:48.053 259320 INFO neutron.agent.linux.ip_lib [None req-765933d8-ad2e-4aa4-afd3-132bf46d8b97 - - - - - -] Device tap8a3dc76a-2d cannot be used as it has no MAC address
Feb 1 04:58:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e199 e199: 6 total, 6 up, 6 in
Feb 1 04:58:48 localhost podman[323743]: 2026-02-01 09:58:48.063857205 +0000 UTC m=+0.077844515 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, version=9.7, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 1 04:58:48 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in
Feb 1 04:58:48 localhost podman[323743]: 2026-02-01 09:58:48.083226981 +0000 UTC m=+0.097214271 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, release=1769056855, io.buildah.version=1.33.7)
Feb 1 04:58:48 localhost nova_compute[274651]: 2026-02-01 09:58:48.083 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:48 localhost kernel: device tap8a3dc76a-2d entered promiscuous mode
Feb 1 04:58:48 localhost ovn_controller[152492]: 2026-02-01T09:58:48Z|00436|binding|INFO|Claiming lport 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 for this chassis.
Feb 1 04:58:48 localhost ovn_controller[152492]: 2026-02-01T09:58:48Z|00437|binding|INFO|8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38: Claiming unknown
Feb 1 04:58:48 localhost nova_compute[274651]: 2026-02-01 09:58:48.092 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:58:48 localhost NetworkManager[5964]: [1769939928.0954] manager: (tap8a3dc76a-2d): new Generic device (/org/freedesktop/NetworkManager/Devices/74)
Feb 1 04:58:48 localhost systemd-udevd[323770]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:58:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:48.109 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-f5e11d0c-83c6-4990-af75-cd61c6135ab2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5e11d0c-83c6-4990-af75-cd61c6135ab2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0f1beb9-5cdd-4a0f-b674-8b2b73310118, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 04:58:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:48.111 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 in datapath f5e11d0c-83c6-4990-af75-cd61c6135ab2 bound to our chassis
Feb 1 04:58:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:48.112 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f5e11d0c-83c6-4990-af75-cd61c6135ab2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 1 04:58:48 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:48.114
158526 DEBUG oslo.privsep.daemon [-] privsep: reply[53aae365-203f-4e9a-9d03-d9d6cf9fe784]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:48 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:58:48 localhost nova_compute[274651]: 2026-02-01 09:58:48.118 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:48 localhost ovn_controller[152492]: 2026-02-01T09:58:48Z|00438|binding|INFO|Setting lport 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 ovn-installed in OVS Feb 1 04:58:48 localhost ovn_controller[152492]: 2026-02-01T09:58:48Z|00439|binding|INFO|Setting lport 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 up in Southbound Feb 1 04:58:48 localhost nova_compute[274651]: 2026-02-01 09:58:48.133 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost journal[217584]: ethtool ioctl error on tap8a3dc76a-2d: No such device Feb 1 04:58:48 localhost nova_compute[274651]: 2026-02-01 09:58:48.176 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:48 localhost nova_compute[274651]: 2026-02-01 09:58:48.202 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:48 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:48.651 2 INFO neutron.agent.securitygroups_rpc [None req-ecc18215-6c26-46d4-bb1e-4f9b7e08ab87 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:49 localhost podman[323842]: Feb 1 04:58:49 localhost podman[323842]: 2026-02-01 09:58:49.046111897 +0000 UTC m=+0.089047160 container create d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:58:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e199 do_prune osdmap full prune enabled Feb 1 04:58:49 localhost systemd[1]: Started libpod-conmon-d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e.scope. 
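
For each network it serves, the neutron DHCP agent runs one dnsmasq instance; in this deployment each lives in a podman container named neutron-dnsmasq-qdhcp-<network UUID> and reads its addn_hosts/host/opts files from /var/lib/neutron/dhcp/<network UUID>/, as the dnsmasq lines above show. To list the networks whose dnsmasq appears in a capture like this, a small extraction sketch (the function name and the sample string are illustrative only):

    import re

    # Container names follow neutron's convention:
    # "neutron-dnsmasq-qdhcp-" + the network UUID.
    NAME = re.compile(
        r"neutron-dnsmasq-qdhcp-"
        r"([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})"
    )

    def networks_seen(log_text):
        """Return the network UUIDs whose dnsmasq appears in the text."""
        return set(NAME.findall(log_text))

    if __name__ == "__main__":
        print(networks_seen(
            "name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2"
        ))
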
Feb 1 04:58:49 localhost podman[323842]: 2026-02-01 09:58:49.004439855 +0000 UTC m=+0.047375188 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:58:49 localhost systemd[1]: tmp-crun.Sr0e3t.mount: Deactivated successfully. Feb 1 04:58:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e200 e200: 6 total, 6 up, 6 in Feb 1 04:58:49 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in Feb 1 04:58:49 localhost systemd[1]: Started libcrun container. Feb 1 04:58:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad10b4d707e87fc29e7d1add77c9f00f04a52fc3d073bdefa0f842c866dc6f14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:58:49 localhost podman[323842]: 2026-02-01 09:58:49.147476244 +0000 UTC m=+0.190411507 container init d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:58:49 localhost podman[323842]: 2026-02-01 09:58:49.156876263 +0000 UTC m=+0.199811536 container start d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 1 04:58:49 localhost dnsmasq[323860]: started, version 2.85 cachesize 150 Feb 1 04:58:49 localhost dnsmasq[323860]: DNS service limited to local subnets Feb 1 04:58:49 localhost dnsmasq[323860]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:58:49 localhost dnsmasq[323860]: warning: no upstream servers configured Feb 1 04:58:49 localhost dnsmasq-dhcp[323860]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:58:49 localhost dnsmasq[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/addn_hosts - 0 addresses Feb 1 04:58:49 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/host Feb 1 04:58:49 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/opts Feb 1 04:58:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:49.222 259320 INFO neutron.agent.dhcp.agent [None req-765933d8-ad2e-4aa4-afd3-132bf46d8b97 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:47Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=7c3d6d63-67a5-483c-a6a0-2fcbbb89e579, ip_allocation=immediate, mac_address=fa:16:3e:de:56:f2, name=tempest-PortsIpV6TestJSON-1937341941, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:46Z, description=, dns_domain=, id=f5e11d0c-83c6-4990-af75-cd61c6135ab2, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1381313817, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=700, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2620, status=ACTIVE, subnets=['8e62abbb-f504-4216-9a33-1cbe6b97eb24'], tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:47Z, vlan_transparent=None, network_id=f5e11d0c-83c6-4990-af75-cd61c6135ab2, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['090b75a3-5cce-4012-8bbe-4b851ef442c2'], standard_attr_id=2628, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:48Z on network f5e11d0c-83c6-4990-af75-cd61c6135ab2#033[00m Feb 1 04:58:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:49.301 259320 INFO neutron.agent.dhcp.agent [None req-c6575436-d610-44f0-941b-f8b552391758 - - - - - -] DHCP configuration for ports {'50e622c3-bb9c-4afa-b9b3-25c4d38be74c'} is completed#033[00m Feb 1 04:58:49 localhost podman[323880]: 2026-02-01 09:58:49.392749258 +0000 UTC m=+0.047514423 container kill d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127) Feb 1 04:58:49 localhost dnsmasq[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/addn_hosts - 1 addresses Feb 1 04:58:49 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/host Feb 1 04:58:49 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/opts Feb 1 04:58:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:49.663 259320 INFO neutron.agent.dhcp.agent [None req-6fcd679e-08db-4a55-aff2-819b670d986c - - - - - -] DHCP configuration for ports {'7c3d6d63-67a5-483c-a6a0-2fcbbb89e579'} is completed#033[00m Feb 1 04:58:50 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:50.146 2 INFO neutron.agent.securitygroups_rpc [None req-421b8c9c-f64a-4a6d-8e56-2aad4effb91c 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']#033[00m Feb 1 04:58:50 localhost nova_compute[274651]: 2026-02-01 09:58:50.769 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:50.877 259320 INFO neutron.agent.dhcp.agent [-] Trigger 
reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:47Z, description=, device_id=124ed585-f401-4f61-b204-c98c75837588, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7c3d6d63-67a5-483c-a6a0-2fcbbb89e579, ip_allocation=immediate, mac_address=fa:16:3e:de:56:f2, name=tempest-PortsIpV6TestJSON-1937341941, network_id=f5e11d0c-83c6-4990-af75-cd61c6135ab2, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['090b75a3-5cce-4012-8bbe-4b851ef442c2'], standard_attr_id=2628, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:49Z on network f5e11d0c-83c6-4990-af75-cd61c6135ab2#033[00m Feb 1 04:58:51 localhost podman[323920]: 2026-02-01 09:58:51.077592409 +0000 UTC m=+0.052789124 container kill d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:58:51 localhost dnsmasq[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/addn_hosts - 1 addresses Feb 1 04:58:51 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/host Feb 1 04:58:51 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/opts Feb 1 04:58:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:51.366 259320 INFO neutron.agent.dhcp.agent [None req-a5821e38-58c1-47b8-b5f3-7fb90e214718 - - - - - -] DHCP configuration for ports {'7c3d6d63-67a5-483c-a6a0-2fcbbb89e579'} is completed#033[00m Feb 1 04:58:51 localhost nova_compute[274651]: 2026-02-01 09:58:51.692 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:58:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e200 do_prune osdmap full prune enabled Feb 1 04:58:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e201 e201: 6 total, 6 up, 6 in Feb 1 04:58:51 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in Feb 1 04:58:52 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:52.228 2 INFO neutron.agent.securitygroups_rpc [None req-1cc83620-5d6e-42d2-912e-0b7270f21e87 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m Feb 1 04:58:52 localhost nova_compute[274651]: 2026-02-01 09:58:52.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:52 localhost dnsmasq[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/addn_hosts - 0 addresses Feb 1 04:58:52 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/host Feb 1 04:58:52 localhost dnsmasq-dhcp[323860]: read /var/lib/neutron/dhcp/f5e11d0c-83c6-4990-af75-cd61c6135ab2/opts Feb 1 04:58:52 localhost podman[323957]: 2026-02-01 09:58:52.453403575 +0000 UTC m=+0.059941124 container kill d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 1 04:58:52 localhost nova_compute[274651]: 2026-02-01 09:58:52.761 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:52 localhost ovn_controller[152492]: 2026-02-01T09:58:52Z|00440|binding|INFO|Releasing lport 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 from this chassis (sb_readonly=0) Feb 1 04:58:52 localhost ovn_controller[152492]: 2026-02-01T09:58:52Z|00441|binding|INFO|Setting lport 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 down in Southbound Feb 1 04:58:52 localhost kernel: device tap8a3dc76a-2d left promiscuous mode Feb 1 04:58:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:52.773 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-f5e11d0c-83c6-4990-af75-cd61c6135ab2', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5e11d0c-83c6-4990-af75-cd61c6135ab2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0f1beb9-5cdd-4a0f-b674-8b2b73310118, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:52.775 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 8a3dc76a-2d66-4ee7-ac1b-8e8b78abdd38 in datapath f5e11d0c-83c6-4990-af75-cd61c6135ab2 unbound from our chassis#033[00m Feb 1 04:58:52 localhost ovn_metadata_agent[158360]: 2026-02-01 
09:58:52.777 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f5e11d0c-83c6-4990-af75-cd61c6135ab2 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:58:52 localhost nova_compute[274651]: 2026-02-01 09:58:52.779 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:52 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:52.779 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[89b061e1-2f50-4eef-be3d-11d882bac0da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:53 localhost podman[236886]: time="2026-02-01T09:58:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:58:53 localhost podman[236886]: @ - - [01/Feb/2026:09:58:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160178 "" "Go-http-client/1.1" Feb 1 04:58:54 localhost podman[236886]: @ - - [01/Feb/2026:09:58:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19787 "" "Go-http-client/1.1" Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.509 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.510 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.510 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.511 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 04:58:54 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:54.546 259320 INFO neutron.agent.linux.ip_lib [None req-f11cf0d4-f09f-4cae-8a26-e198774f7e2d - - - - - -] Device tapf647677f-92 cannot be used as it has no MAC address#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.608 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:54 localhost kernel: device tapf647677f-92 entered promiscuous mode Feb 1 04:58:54 localhost NetworkManager[5964]: [1769939934.6169] manager: (tapf647677f-92): new Generic device (/org/freedesktop/NetworkManager/Devices/75) Feb 1 04:58:54 localhost ovn_controller[152492]: 2026-02-01T09:58:54Z|00442|binding|INFO|Claiming lport f647677f-9232-49ad-af03-658fa13da887 for this chassis. Feb 1 04:58:54 localhost ovn_controller[152492]: 2026-02-01T09:58:54Z|00443|binding|INFO|f647677f-9232-49ad-af03-658fa13da887: Claiming unknown Feb 1 04:58:54 localhost systemd-udevd[323989]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.622 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:54.629 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-c6425ceb-3ede-433f-8bed-334b148920ea', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6425ceb-3ede-433f-8bed-334b148920ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d835a02d-3e28-47a9-ba99-69bcb326f424, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f647677f-9232-49ad-af03-658fa13da887) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:54.632 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f647677f-9232-49ad-af03-658fa13da887 in datapath c6425ceb-3ede-433f-8bed-334b148920ea bound to our chassis#033[00m Feb 1 04:58:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:54.636 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c6425ceb-3ede-433f-8bed-334b148920ea or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:58:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:54.638 158526 DEBUG 
oslo.privsep.daemon [-] privsep: reply[18e2a152-4402-40ab-a39f-c4ffe4cc0d06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost ovn_controller[152492]: 2026-02-01T09:58:54Z|00444|binding|INFO|Setting lport f647677f-9232-49ad-af03-658fa13da887 ovn-installed in OVS Feb 1 04:58:54 localhost ovn_controller[152492]: 2026-02-01T09:58:54Z|00445|binding|INFO|Setting lport f647677f-9232-49ad-af03-658fa13da887 up in Southbound Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.655 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost journal[217584]: ethtool ioctl error on tapf647677f-92: No such device Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.696 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:54 localhost nova_compute[274651]: 2026-02-01 09:58:54.718 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:54 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:54.940 2 INFO neutron.agent.securitygroups_rpc [req-6a252779-d066-42d3-937d-d19d3be50ea7 req-0e4e2316-508a-44f2-8edc-582240eab0d7 afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group member updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']#033[00m Feb 1 04:58:55 localhost dnsmasq[323860]: exiting on receipt of SIGTERM Feb 1 04:58:55 localhost podman[324049]: 2026-02-01 09:58:55.144118844 +0000 UTC m=+0.047284835 container kill d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:58:55 localhost systemd[1]: libpod-d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e.scope: Deactivated successfully. 
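
The "ethtool ioctl error on tap…: No such device" bursts above appear immediately after each short-lived DHCP tap device is created or torn down, which suggests a stats poller racing the transient interface rather than a fault; they stop as soon as the device settles. To confirm the errors cluster on the transient taps, a quick tally over an exported journal (ethtool_error_counts is a hypothetical helper, fed one log entry per line):

    import re
    from collections import Counter

    ETHTOOL = re.compile(r"ethtool ioctl error on (tap\S+?): No such device")

    def ethtool_error_counts(lines):
        """Count transient ethtool ioctl errors per tap device."""
        counts = Counter()
        for line in lines:
            m = ETHTOOL.search(line)
            if m:
                counts[m.group(1)] += 1
        return counts
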
Feb 1 04:58:55 localhost podman[324067]: 2026-02-01 09:58:55.220727481 +0000 UTC m=+0.057820030 container died d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:58:55 localhost systemd[1]: tmp-crun.RZ2mpq.mount: Deactivated successfully. Feb 1 04:58:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e-userdata-shm.mount: Deactivated successfully. Feb 1 04:58:55 localhost podman[324067]: 2026-02-01 09:58:55.277804306 +0000 UTC m=+0.114896795 container remove d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5e11d0c-83c6-4990-af75-cd61c6135ab2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:58:55 localhost systemd[1]: libpod-conmon-d75aa7e3934b8712c187e3a8883675a88c3a049a5fa7041e9a1bd5609213b53e.scope: Deactivated successfully. Feb 1 04:58:55 localhost systemd[1]: var-lib-containers-storage-overlay-ad10b4d707e87fc29e7d1add77c9f00f04a52fc3d073bdefa0f842c866dc6f14-merged.mount: Deactivated successfully. 
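
The 04:58:55 entries above show a complete podman teardown for the dnsmasq container d75aa7e3934b: dnsmasq exits on SIGTERM, the libpod scope deactivates, the container reports died and then remove, and systemd unmounts the shm and overlay layers. Grouping podman events by the 64-hex container ID makes such lifecycles easy to audit (a sketch; container_timelines is an illustrative name):

    import re
    from collections import defaultdict

    # podman logs "container <event> <64-hex id> (...)" for each event.
    EVENT = re.compile(r"container (\w+) ([0-9a-f]{64})")

    def container_timelines(lines):
        """Map container ID -> ordered events (create, init, start, kill, died, remove)."""
        timelines = defaultdict(list)
        for line in lines:
            m = EVENT.search(line)
            if m:
                timelines[m.group(2)].append(m.group(1))
        return timelines
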
Feb 1 04:58:55 localhost nova_compute[274651]: 2026-02-01 09:58:55.459 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:58:55 localhost nova_compute[274651]: 2026-02-01 09:58:55.475 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:58:55 localhost nova_compute[274651]: 2026-02-01 09:58:55.476 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:58:55 localhost nova_compute[274651]: 2026-02-01 09:58:55.476 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:55.525 259320 INFO neutron.agent.dhcp.agent [None req-c2ce1a20-6fa1-4c53-98ac-03dd4d572224 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:55 localhost systemd[1]: run-netns-qdhcp\x2df5e11d0c\x2d83c6\x2d4990\x2daf75\x2dcd61c6135ab2.mount: Deactivated successfully. Feb 1 04:58:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
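
When nova's _heal_instance_info_cache periodic task refreshes an instance, it logs the rebuilt cache as a JSON list embedded in the message (the "Updating instance_info_cache with network_info: […]" entry above). That blob parses directly, so fixed and floating addresses can be pulled from a capture without touching the API; a sketch (vif_addresses is an illustrative name, and the regex is keyed to the surrounding text of this particular log format):

    import json
    import re

    BLOB = re.compile(r"network_info: (\[.*\]) update_instance_cache_with_nw_info")

    def vif_addresses(log_line):
        """Yield (vif_id, fixed_ip, floating_ips) from one heal entry."""
        m = BLOB.search(log_line)
        if not m:
            return
        for vif in json.loads(m.group(1)):
            for subnet in vif["network"]["subnets"]:
                for ip in subnet["ips"]:
                    floats = [f["address"] for f in ip.get("floating_ips", [])]
                    yield vif["id"], ip["address"], floats
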
Feb 1 04:58:55 localhost podman[324118]: Feb 1 04:58:55 localhost podman[324118]: 2026-02-01 09:58:55.554587689 +0000 UTC m=+0.071934004 container create b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 1 04:58:55 localhost systemd[1]: Started libpod-conmon-b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045.scope. Feb 1 04:58:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:55.594 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:55 localhost systemd[1]: Started libcrun container. Feb 1 04:58:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/897bc60d82534fc1bef13a6bb48ad6db324a274fdf3b55819f2feef382288ca2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:58:55 localhost podman[324118]: 2026-02-01 09:58:55.516528088 +0000 UTC m=+0.033874433 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:58:55 localhost podman[324129]: 2026-02-01 09:58:55.62906847 +0000 UTC m=+0.071386447 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 1 04:58:55 localhost 
podman[324129]: 2026-02-01 09:58:55.640250614 +0000 UTC m=+0.082568601 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true) Feb 1 04:58:55 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
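
Each podman healthcheck here runs as a transient systemd unit: "Started /usr/bin/podman healthcheck run <id>" is followed by a container health_status event (health_status=healthy for both openstack_network_exporter and ceilometer_agent_compute above), an exec_died event, and "<id>.service: Deactivated successfully." Extracting just name and status gives a quick health summary (health_events is an illustrative name; it relies on the first name=... attribute in these events being the container name, as in this capture):

    import re

    NAME = re.compile(r"\bname=([^,)]+)")
    STATUS = re.compile(r"\bhealth_status=(\w+)")

    def health_events(lines):
        """Yield (container_name, health_status) per healthcheck event."""
        for line in lines:
            if " container health_status " not in line:
                continue
            name, status = NAME.search(line), STATUS.search(line)
            if name and status:
                yield name.group(1), status.group(1)
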
Feb 1 04:58:55 localhost podman[324118]: 2026-02-01 09:58:55.664780548 +0000 UTC m=+0.182126863 container init b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:58:55 localhost podman[324118]: 2026-02-01 09:58:55.67163854 +0000 UTC m=+0.188984855 container start b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 04:58:55 localhost dnsmasq[324153]: started, version 2.85 cachesize 150 Feb 1 04:58:55 localhost dnsmasq[324153]: DNS service limited to local subnets Feb 1 04:58:55 localhost dnsmasq[324153]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:58:55 localhost dnsmasq[324153]: warning: no upstream servers configured Feb 1 04:58:55 localhost dnsmasq-dhcp[324153]: DHCP, static leases only on 10.103.0.0, lease time 1d Feb 1 04:58:55 localhost dnsmasq[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/addn_hosts - 0 addresses Feb 1 04:58:55 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/host Feb 1 04:58:55 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/opts Feb 1 04:58:55 localhost nova_compute[274651]: 2026-02-01 09:58:55.801 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:55.864 259320 INFO neutron.agent.dhcp.agent [None req-4bc49665-8db8-4dce-83bb-68a6e0224189 - - - - - -] DHCP configuration for ports {'1267d4f0-7ca8-475a-922f-3bd653cdba07'} is completed#033[00m Feb 1 04:58:55 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:55.911 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:58:56 localhost ovn_controller[152492]: 2026-02-01T09:58:56Z|00446|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:58:56 localhost nova_compute[274651]: 2026-02-01 09:58:56.257 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:56 localhost nova_compute[274651]: 2026-02-01 09:58:56.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:56 localhost nova_compute[274651]: 2026-02-01 09:58:56.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:58:56 localhost nova_compute[274651]: 2026-02-01 09:58:56.316 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:56 localhost systemd[1]: tmp-crun.hNi5QW.mount: Deactivated successfully. Feb 1 04:58:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:58:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e201 do_prune osdmap full prune enabled Feb 1 04:58:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e202 e202: 6 total, 6 up, 6 in Feb 1 04:58:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.265 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:58:57 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:57.726 259320 INFO neutron.agent.linux.ip_lib [None req-2102576f-7baf-448e-adc2-7d6567719a3d - - - - - -] Device tapc7c66182-49 cannot be used as it has no MAC address#033[00m Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.750 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:57 localhost kernel: device tapc7c66182-49 entered promiscuous mode Feb 1 04:58:57 localhost NetworkManager[5964]: [1769939937.7582] manager: (tapc7c66182-49): new Generic device (/org/freedesktop/NetworkManager/Devices/76) Feb 1 04:58:57 localhost ovn_controller[152492]: 2026-02-01T09:58:57Z|00447|binding|INFO|Claiming lport c7c66182-49e6-4bde-bbaf-7f80055fee26 for this chassis. Feb 1 04:58:57 localhost ovn_controller[152492]: 2026-02-01T09:58:57Z|00448|binding|INFO|c7c66182-49e6-4bde-bbaf-7f80055fee26: Claiming unknown Feb 1 04:58:57 localhost systemd-udevd[324164]: Network interface NamePolicy= disabled on kernel command line. 
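
The ovn-controller binding lines above trace a logical-port bring-up in three steps: "Claiming lport …", "Setting lport … ovn-installed in OVS", then "Setting lport … up in Southbound" (release and down mirror it on teardown, as for lport 8a3dc76a-… earlier). The ovn_metadata_agent's PortBindingUpdatedEvent entries are the same transitions observed from the southbound database. A small tracker reconstructs these timelines from a capture (binding_timeline is an illustrative name):

    import re

    LPORT = re.compile(r"\|binding\|INFO\|.*?([0-9a-f-]{36})")
    STEPS = {  # marker substring -> lifecycle step
        "Claiming lport": "claimed",
        "ovn-installed in OVS": "installed",
        "up in Southbound": "up",
        "Releasing lport": "released",
        "down in Southbound": "down",
    }

    def binding_timeline(lines):
        """Yield (lport_uuid, step) in log order from ovn-controller lines."""
        for line in lines:
            if "|binding|INFO|" not in line:
                continue
            m = LPORT.search(line)
            if not m:
                continue
            for marker, step in STEPS.items():
                if marker in line:
                    yield m.group(1), step
                    break
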
Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.762 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:57.774 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-8236cd55-f7de-4bbb-a1c6-63cb66897e5f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8236cd55-f7de-4bbb-a1c6-63cb66897e5f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51441a28-7c57-4753-9c09-9a33cf92d6c8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c7c66182-49e6-4bde-bbaf-7f80055fee26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:58:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:57.779 158365 INFO neutron.agent.ovn.metadata.agent [-] Port c7c66182-49e6-4bde-bbaf-7f80055fee26 in datapath 8236cd55-f7de-4bbb-a1c6-63cb66897e5f bound to our chassis#033[00m Feb 1 04:58:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:57.781 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:58:57 localhost ovn_metadata_agent[158360]: 2026-02-01 09:58:57.782 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a02cbe-b68a-4642-895e-85a23d785520]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost ovn_controller[152492]: 2026-02-01T09:58:57Z|00449|binding|INFO|Setting lport c7c66182-49e6-4bde-bbaf-7f80055fee26 ovn-installed in OVS Feb 1 04:58:57 localhost ovn_controller[152492]: 2026-02-01T09:58:57Z|00450|binding|INFO|Setting lport c7c66182-49e6-4bde-bbaf-7f80055fee26 up in Southbound Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.799 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.803 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost journal[217584]: ethtool ioctl error on tapc7c66182-49: No such device Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.838 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:57 localhost nova_compute[274651]: 2026-02-01 09:58:57.862 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:58:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:58.186 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:57Z, description=, device_id=0d3b238e-083b-4f7c-8e29-650b41019987, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5a8557aa-6c36-487e-a54b-b57f2ce82d07, ip_allocation=immediate, mac_address=fa:16:3e:fa:6c:cc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:51Z, description=, dns_domain=, id=c6425ceb-3ede-433f-8bed-334b148920ea, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2018817178, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11581, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2647, status=ACTIVE, subnets=['a9ba2e35-0fc0-4b81-aded-d26bc96fdccf'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:53Z, vlan_transparent=None, network_id=c6425ceb-3ede-433f-8bed-334b148920ea, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2678, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:57Z on network c6425ceb-3ede-433f-8bed-334b148920ea#033[00m Feb 1 04:58:58 localhost systemd[1]: tmp-crun.nGxs3c.mount: Deactivated successfully. 
Feb 1 04:58:58 localhost dnsmasq[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/addn_hosts - 1 addresses
Feb 1 04:58:58 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/host
Feb 1 04:58:58 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/opts
Feb 1 04:58:58 localhost podman[324228]: 2026-02-01 09:58:58.453827432 +0000 UTC m=+0.070807829 container kill b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:58:58 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:58.554 2 INFO neutron.agent.securitygroups_rpc [None req-fa4d9652-c836-4282-9ccf-0ce3333cfc7d 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['9475ea4c-43e5-4601-aa09-56b92b5b1098']
Feb 1 04:58:58 localhost podman[324270]:
Feb 1 04:58:58 localhost podman[324270]: 2026-02-01 09:58:58.689204531 +0000 UTC m=+0.075745790 container create bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 1 04:58:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:58.733 259320 INFO neutron.agent.dhcp.agent [None req-c45ba9a3-4c1b-4fc2-a2d4-24c4083f494f - - - - - -] DHCP configuration for ports {'5a8557aa-6c36-487e-a54b-b57f2ce82d07'} is completed
Feb 1 04:58:58 localhost systemd[1]: Started libpod-conmon-bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af.scope.
Feb 1 04:58:58 localhost podman[324270]: 2026-02-01 09:58:58.647441497 +0000 UTC m=+0.033982776 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:58:58 localhost systemd[1]: Started libcrun container.
Feb 1 04:58:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84a6d81d76058e0f2776d647be27fd501aa1f540f419cae1880fd294d9873b48/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:58:58 localhost podman[324270]: 2026-02-01 09:58:58.763815276 +0000 UTC m=+0.150356525 container init bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 1 04:58:58 localhost podman[324270]: 2026-02-01 09:58:58.777749285 +0000 UTC m=+0.164290534 container start bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 1 04:58:58 localhost dnsmasq[324289]: started, version 2.85 cachesize 150
Feb 1 04:58:58 localhost dnsmasq[324289]: DNS service limited to local subnets
Feb 1 04:58:58 localhost dnsmasq[324289]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:58:58 localhost dnsmasq[324289]: warning: no upstream servers configured
Feb 1 04:58:58 localhost dnsmasq-dhcp[324289]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:58:58 localhost dnsmasq[324289]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 0 addresses
Feb 1 04:58:58 localhost dnsmasq-dhcp[324289]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host
Feb 1 04:58:58 localhost dnsmasq-dhcp[324289]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts
Feb 1 04:58:58 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:58.889 2 INFO neutron.agent.securitygroups_rpc [None req-30e16878-3a93-4ab5-adf2-36d2809551ae 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['9475ea4c-43e5-4601-aa09-56b92b5b1098']
Feb 1 04:58:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:59.032 259320 INFO neutron.agent.dhcp.agent [None req-42bc0977-460a-43c0-bc0a-816fe86d1f5b - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed
Feb 1 04:58:59 localhost neutron_sriov_agent[252126]: 2026-02-01 09:58:59.212 2 INFO neutron.agent.securitygroups_rpc [None req-da96df46-f62f-42b1-9cd1-d7e6a5376fca d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['5471bfa5-0ba1-439c-b208-7f1eef47ebe2']
Feb 1 04:58:59 localhost nova_compute[274651]: 2026-02-01 09:58:59.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:58:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:59.302 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:58Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d9df4fe3-66ee-4f7b-b6d9-bee1e2cb02e0, ip_allocation=immediate, mac_address=fa:16:3e:06:73:78, name=tempest-PortsIpV6TestJSON-1832439101, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:10Z, description=, dns_domain=, id=8236cd55-f7de-4bbb-a1c6-63cb66897e5f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-407151453, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10540, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2390, status=ACTIVE, subnets=['8fcaa75d-2e3a-46ab-a47c-7e6d5a894e48'], tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:56Z, vlan_transparent=None, network_id=8236cd55-f7de-4bbb-a1c6-63cb66897e5f, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5471bfa5-0ba1-439c-b208-7f1eef47ebe2'], standard_attr_id=2689, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:58Z on network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f
Feb 1 04:58:59 localhost dnsmasq[324289]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 1 addresses
Feb 1 04:58:59 localhost dnsmasq-dhcp[324289]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host
Feb 1 04:58:59 localhost podman[324308]: 2026-02-01 09:58:59.501173025 +0000 UTC m=+0.060323237 container kill bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 1 04:58:59 localhost dnsmasq-dhcp[324289]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts
Feb 1 04:58:59 localhost systemd[1]: tmp-crun.aRrMoR.mount: Deactivated successfully.
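Each dnsmasq "read ..." record above corresponds to one of the three per-network files the DHCP agent maintains under /var/lib/neutron/dhcp/<network_id>/ and rewrites whenever a port changes. A rough sketch of how those files map onto dnsmasq options; the agent's real command line carries many more flags, and this direct launch is illustrative only (in this deployment dnsmasq runs inside the neutron-dnsmasq-qdhcp-<network_id> container):

    import subprocess

    NET = '8236cd55-f7de-4bbb-a1c6-63cb66897e5f'
    BASE = f'/var/lib/neutron/dhcp/{NET}'

    cmd = [
        'dnsmasq',
        '--no-resolv',                      # -> "warning: no upstream servers configured"
        f'--dhcp-hostsfile={BASE}/host',    # static MAC/IP lease entries
        f'--addn-hosts={BASE}/addn_hosts',  # extra DNS records ("... - 1 addresses")
        f'--dhcp-optsfile={BASE}/opts',     # per-port DHCP options
    ]
    subprocess.run(cmd, check=True)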
Feb 1 04:58:59 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:58:59.798 259320 INFO neutron.agent.dhcp.agent [None req-2b073062-f513-47da-9fe9-8c6bcef2f9ba - - - - - -] DHCP configuration for ports {'d9df4fe3-66ee-4f7b-b6d9-bee1e2cb02e0'} is completed
Feb 1 04:58:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e202 do_prune osdmap full prune enabled
Feb 1 04:58:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e203 e203: 6 total, 6 up, 6 in
Feb 1 04:58:59 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in
Feb 1 04:59:00 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:00.589 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:57Z, description=, device_id=0d3b238e-083b-4f7c-8e29-650b41019987, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5a8557aa-6c36-487e-a54b-b57f2ce82d07, ip_allocation=immediate, mac_address=fa:16:3e:fa:6c:cc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:51Z, description=, dns_domain=, id=c6425ceb-3ede-433f-8bed-334b148920ea, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-2018817178, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11581, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2647, status=ACTIVE, subnets=['a9ba2e35-0fc0-4b81-aded-d26bc96fdccf'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:53Z, vlan_transparent=None, network_id=c6425ceb-3ede-433f-8bed-334b148920ea, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2678, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:57Z on network c6425ceb-3ede-433f-8bed-334b148920ea
Feb 1 04:59:00 localhost nova_compute[274651]: 2026-02-01 09:59:00.836 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 04:59:00 localhost systemd[1]: tmp-crun.hlmrH0.mount: Deactivated successfully.
Feb 1 04:59:00 localhost podman[324346]: 2026-02-01 09:59:00.866914541 +0000 UTC m=+0.093099854 container kill b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:59:00 localhost dnsmasq[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/addn_hosts - 1 addresses
Feb 1 04:59:00 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/host
Feb 1 04:59:00 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/opts
Feb 1 04:59:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e203 do_prune osdmap full prune enabled
Feb 1 04:59:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e204 e204: 6 total, 6 up, 6 in
Feb 1 04:59:00 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in
Feb 1 04:59:01 localhost dnsmasq[324289]: exiting on receipt of SIGTERM
Feb 1 04:59:01 localhost podman[324383]: 2026-02-01 09:59:01.050960562 +0000 UTC m=+0.058852381 container kill bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 1 04:59:01 localhost systemd[1]: libpod-bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af.scope: Deactivated successfully.
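The pairing of a podman "container kill" record with dnsmasq immediately re-reading its files (and, at 04:59:01, with "exiting on receipt of SIGTERM") is the agent signalling the container rather than restarting it: dnsmasq re-reads its hosts/opts files on SIGHUP and exits on SIGTERM. A sketch of that signalling, with an assumed helper name:

    import subprocess

    def signal_dnsmasq(container: str, sig: str = 'HUP') -> None:
        # 'podman kill --signal' delivers the signal to PID 1 (dnsmasq) in
        # the container; HUP reloads host/opts files, TERM stops it.
        subprocess.run(['podman', 'kill', '--signal', sig, container], check=True)

    signal_dnsmasq('neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f')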
Feb 1 04:59:01 localhost podman[324398]: 2026-02-01 09:59:01.120836991 +0000 UTC m=+0.049812563 container died bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 1 04:59:01 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:01.140 259320 INFO neutron.agent.dhcp.agent [None req-9c23f638-25cb-4f21-8cab-e67614364dbc - - - - - -] DHCP configuration for ports {'5a8557aa-6c36-487e-a54b-b57f2ce82d07'} is completed
Feb 1 04:59:01 localhost podman[324398]: 2026-02-01 09:59:01.173728208 +0000 UTC m=+0.102703740 container remove bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:59:01 localhost systemd[1]: libpod-conmon-bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af.scope: Deactivated successfully.
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.290 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.291 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.291 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.292 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.292 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:59:01 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:01.343 2 INFO neutron.agent.securitygroups_rpc [None req-c763a1d5-029b-4398-9f3b-b445d5b844aa d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['5471bfa5-0ba1-439c-b208-7f1eef47ebe2', '4d2012b8-f333-4b7a-9cf4-a971a1fa768f']
Feb 1 04:59:01 localhost openstack_network_exporter[239441]: ERROR 09:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 04:59:01 localhost openstack_network_exporter[239441]:
Feb 1 04:59:01 localhost openstack_network_exporter[239441]: ERROR 09:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 04:59:01 localhost openstack_network_exporter[239441]:
Feb 1 04:59:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:59:01 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:01.777 2 INFO neutron.agent.securitygroups_rpc [None req-1a8757de-ce31-4291-bd6c-41d255499299 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 1 04:59:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:59:01 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3304763550' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.820 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:59:01 localhost systemd[1]: var-lib-containers-storage-overlay-84a6d81d76058e0f2776d647be27fd501aa1f540f419cae1880fd294d9873b48-merged.mount: Deactivated successfully.
Feb 1 04:59:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bbacfbc1fb23496375bbef275318241ba79dc3e6c7b6438482a4853e85ab95af-userdata-shm.mount: Deactivated successfully.
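The resource audit above shells out to the Ceph CLI for pool usage, via oslo_concurrency.processutils per the DEBUG lines. A standard-library equivalent of the logged command (output field names are the usual `ceph df` JSON keys and may differ across Ceph releases):

    import json
    import subprocess

    result = subprocess.run(
        ['ceph', 'df', '--format=json',
         '--id', 'openstack', '--conf', '/etc/ceph/ceph.conf'],
        capture_output=True, text=True, check=True,
    )
    stats = json.loads(result.stdout)
    # Cluster-wide totals; per-pool usage lives under stats['pools'].
    print(stats['stats']['total_bytes'], stats['stats']['total_avail_bytes'])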
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.914 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:59:01 localhost nova_compute[274651]: 2026-02-01 09:59:01.914 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 04:59:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e204 do_prune osdmap full prune enabled
Feb 1 04:59:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e205 e205: 6 total, 6 up, 6 in
Feb 1 04:59:02 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in
Feb 1 04:59:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 04:59:02 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.136 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.137 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11225MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.138 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.138 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.206 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.207 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.207 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.247 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 04:59:02 localhost podman[324515]:
Feb 1 04:59:02 localhost podman[324515]: 2026-02-01 09:59:02.42941816 +0000 UTC m=+0.064220437 container create 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 04:59:02 localhost systemd[1]: Started libpod-conmon-36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd.scope.
Feb 1 04:59:02 localhost systemd[1]: Started libcrun container.
Feb 1 04:59:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46bc393696e8876c1fbd2f62eccf8f6c37cdd956a3dbdf8cdad00c2b97b890a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:59:02 localhost podman[324515]: 2026-02-01 09:59:02.493359737 +0000 UTC m=+0.128162044 container init 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:59:02 localhost podman[324515]: 2026-02-01 09:59:02.398647553 +0000 UTC m=+0.033449840 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:59:02 localhost podman[324515]: 2026-02-01 09:59:02.502602601 +0000 UTC m=+0.137404918 container start 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 04:59:02 localhost dnsmasq[324533]: started, version 2.85 cachesize 150
Feb 1 04:59:02 localhost dnsmasq[324533]: DNS service limited to local subnets
Feb 1 04:59:02 localhost dnsmasq[324533]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:59:02 localhost dnsmasq[324533]: warning: no upstream servers configured
Feb 1 04:59:02 localhost dnsmasq-dhcp[324533]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 1 04:59:02 localhost dnsmasq-dhcp[324533]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 1 04:59:02 localhost dnsmasq[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 1 addresses
Feb 1 04:59:02 localhost dnsmasq-dhcp[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host
Feb 1 04:59:02 localhost dnsmasq-dhcp[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts
Feb 1 04:59:02 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:02.551 2 INFO neutron.agent.securitygroups_rpc [None req-c9b67f75-2c82-4fb2-8eff-26d6d15e8cf1 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['4d2012b8-f333-4b7a-9cf4-a971a1fa768f']
Feb 1 04:59:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:59:02 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3141389870' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:59:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:59:02 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3141389870' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:59:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:02.567 259320 INFO neutron.agent.dhcp.agent [None req-4400d53c-b4df-472a-9c48-f0082859bb8c - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:58Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d9df4fe3-66ee-4f7b-b6d9-bee1e2cb02e0, ip_allocation=immediate, mac_address=fa:16:3e:06:73:78, name=tempest-PortsIpV6TestJSON-864587177, network_id=8236cd55-f7de-4bbb-a1c6-63cb66897e5f, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['4d2012b8-f333-4b7a-9cf4-a971a1fa768f'], standard_attr_id=2689, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:59:01Z on network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f
Feb 1 04:59:02 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:02.624 2 INFO neutron.agent.securitygroups_rpc [None req-77c4ed81-7ddc-4083-9aca-2d68314f54e3 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 1 04:59:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 04:59:02 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/565340341' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.672 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.678 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.692 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.693 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 04:59:02 localhost nova_compute[274651]: 2026-02-01 09:59:02.694 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 04:59:02 localhost dnsmasq[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 1 addresses
Feb 1 04:59:02 localhost dnsmasq-dhcp[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host
Feb 1 04:59:02 localhost dnsmasq-dhcp[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts
Feb 1 04:59:02 localhost podman[324554]: 2026-02-01 09:59:02.752150916 +0000 UTC m=+0.056633083 container kill 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 04:59:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:02.766 259320 INFO neutron.agent.dhcp.agent [None req-78d7e788-a2e1-4775-98d8-0b417428d407 - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'c7c66182-49e6-4bde-bbaf-7f80055fee26', 'd9df4fe3-66ee-4f7b-b6d9-bee1e2cb02e0', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed
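The inventory dictionary reported above is what placement schedules against: for each resource class the schedulable capacity is (total - reserved) * allocation_ratio. A worked check with the logged values:

    def usable(total, reserved, allocation_ratio, **_):
        return (total - reserved) * allocation_ratio

    inventory = {
        'VCPU':      {'total': 8,     'reserved': 0,   'allocation_ratio': 16.0},
        'MEMORY_MB': {'total': 15738, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 41,    'reserved': 1,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, usable(**inv))
    # VCPU 128.0, MEMORY_MB 15226.0, DISK_GB 40.0 -- consistent with
    # "Total usable vcpus: 8, total allocated vcpus: 1" plus the 16x CPU
    # overcommit in the allocation_ratio.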
Feb 1 04:59:02 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:02.855 2 INFO neutron.agent.securitygroups_rpc [None req-648bce3d-1b69-4a9f-8926-09639fc82cde 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 1 04:59:03 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:03.013 259320 INFO neutron.agent.dhcp.agent [None req-baccd3bc-0b75-4686-9693-32b521fb2faa - - - - - -] DHCP configuration for ports {'d9df4fe3-66ee-4f7b-b6d9-bee1e2cb02e0'} is completed
Feb 1 04:59:03 localhost dnsmasq[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 0 addresses
Feb 1 04:59:03 localhost podman[324591]: 2026-02-01 09:59:03.083965472 +0000 UTC m=+0.067125426 container kill 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 04:59:03 localhost dnsmasq-dhcp[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host
Feb 1 04:59:03 localhost dnsmasq-dhcp[324533]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.530 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.531 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.550 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3c2e95f-bf58-44e2-ba82-441262b24d76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:59:03.532139', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a398975e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.769506407, 'message_signature': 'a90a7700515331864e0286283deeacbe1edfc68bea79e34dd4db3cd6b02865e8'}]}, 'timestamp': '2026-02-01 09:59:03.551106', '_unique_id': '9919c5ab8e6e45349ec8790bc8087804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.552 12 ERROR oslo_messaging.notify.messaging
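The chained traceback above is kombu's standard failure mode when RabbitMQ is unreachable: the raw ConnectionRefusedError from the socket layer is re-raised as kombu.exceptions.OperationalError inside ensure_connection(), which oslo.messaging calls while building a notifier connection. Minimal reproduction against a broker that is not listening (assumed URL):

    import kombu

    conn = kombu.Connection('amqp://guest:guest@127.0.0.1:5672//')
    try:
        # ensure_connection() retries, then re-raises the socket error as
        # the library-level OperationalError seen in the log.
        conn.ensure_connection(max_retries=1)
    except kombu.exceptions.OperationalError as exc:
        print('broker unreachable:', exc)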
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.559 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
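Every failure in this capture chains the same two exceptions: amqp's socket connect raises ConnectionRefusedError, and kombu re-raises it as OperationalError, which means nothing is accepting TCP connections on the AMQP endpoint oslo.messaging is configured to use. A minimal sketch to confirm reachability from this node, assuming the conventional RabbitMQ port 5672 and a placeholder hostname (both would need to be taken from the transport_url the agent actually uses):

    import socket

    # Placeholder endpoint; substitute the host/port from the agent's
    # configured transport_url (e.g. in ceilometer.conf).
    BROKER = ("rabbitmq.example.com", 5672)

    try:
        # The same operation that fails in the traceback above
        # (amqp/transport.py: self.sock.connect(sa)).
        with socket.create_connection(BROKER, timeout=5):
            print("TCP connect OK: a listener is up on %s:%d" % BROKER)
    except OSError as exc:
        # [Errno 111] here reproduces the ConnectionRefusedError in this log.
        print("TCP connect failed: %s" % exc)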
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.560 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.571 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.572 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2663d11e-c434-49fe-a6ba-e2c5241fe03c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.561035', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a39bddc4-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.780482875, 'message_signature': 'f746533781af05173af8749370d0b769ec8ce3dce54dc2655b8f80aa0eaf9211'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.561035', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a39bf3b8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.780482875, 'message_signature': 'd8cf0ec46f1421327deab315bd68d507a67f50b0f5d861c3a82c99ff3643ce97'}]}, 'timestamp': '2026-02-01 09:59:03.573039', '_unique_id': '7de1c24339ef4eb6889451f7021bcca9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.575 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.576 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.576 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2270a6f1-11c8-4ecb-abf9-0322c39bd155', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.576279', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a39c88a0-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': '2e21a6bef97301df23dc726ff1be8657a78ab6533b277e8939706815f58ac4c3'}]}, 'timestamp': '2026-02-01 09:59:03.576839', '_unique_id': '9694134f91e54b57b4891db738cb3f86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
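For context on the code path that keeps failing (oslo_messaging/notify/messaging.py, notify), this is roughly how a SAMPLE notification like the payloads in this log gets emitted; a minimal sketch assuming a placeholder transport_url, not the agent's actual wiring:

    from oslo_config import cfg
    import oslo_messaging

    # Placeholder URL; the real value comes from the service configuration.
    # While the broker refuses connections, sample() fails with the same
    # kombu.exceptions.OperationalError recorded throughout this log.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@rabbitmq.example.com:5672/")

    notifier = oslo_messaging.Notifier(
        transport,
        publisher_id="ceilometer.polling",
        driver="messagingv2",
        topics=["notifications"],
        retry=0)  # fail fast instead of retrying forever

    # Priority SAMPLE, matching 'priority': 'SAMPLE' in the payloads above.
    notifier.sample({}, event_type="telemetry.polling", payload={"samples": []})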
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f411412-b675-4d6d-b29f-7e76cb14838d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.580295', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a39d323c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': 'e4c02fc15c3c91694c6acc1b5467664ec30e26e978a7f5fa102a3c4d90cf05f5'}]}, 'timestamp': '2026-02-01 09:59:03.581295', '_unique_id': '95b8afd5a8ca49afb40e4a834f675990'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.615 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc08a7e1-4b02-49f7-81fd-f1d87aa35a54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.584161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a267d4-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': 'a79d1d8f8969312cf1cd3d413fb0055b0cdfc23cabe6b2026c60ffa05f672d21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.584161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a27bca-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': 'c318573103decda54b3db52542c8afb830d5dbc55540196e200d2be157df3535'}]}, 'timestamp': '2026-02-01 09:59:03.615806', '_unique_id': '4253b2ad0e084a06866df810949a42bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.616 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.618 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.618 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.618 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
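The two-part traceback above is ordinary Python exception chaining: kombu's _reraise_as_library_errors catches the socket-level ConnectionRefusedError and re-raises it via raise ConnectionError(str(exc)) from exc (inside kombu, that name is bound to kombu.exceptions.OperationalError, as the final log line shows), which is what produces the "The above exception was the direct cause of the following exception" divider. A minimal reproduction sketch outside the agent follows; the broker URL is a placeholder, since this log never records which RabbitMQ endpoint the transport is configured with:

```python
# Minimal sketch (assumption: a standard AMQP URL; the real transport_url
# used by ceilometer_agent_compute is not visible in this log).
import kombu
from kombu.exceptions import OperationalError

conn = kombu.Connection("amqp://guest:guest@controller:5672//",  # hypothetical URL
                        connect_timeout=5)
try:
    # ensure_connection() drives kombu's retry_over_time() loop -- the same
    # _ensure_connection -> retry_over_time -> _connection_factory frames
    # seen in the traceback above.
    conn.ensure_connection(max_retries=1)
except OperationalError as exc:
    # With the broker down, the socket-level ConnectionRefusedError is
    # re-raised as OperationalError("[Errno 111] Connection refused").
    print(f"broker unreachable: {exc}")
finally:
    conn.release()
```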
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.620 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0defdb5b-2481-4a33-ba9e-30121b15a168', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.618330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a2f064-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.780482875, 'message_signature': '9e1218f3cfe2f54a2bfeb01888316fbd21a0e05ff0bb7985fd6b490c460a519a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.618330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a3023e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.780482875, 'message_signature': '26453b63921d9a46716c51f9d6cc92e97f3b7d71eac0d3274b732b80fc3dc552'}]}, 'timestamp': '2026-02-01 09:59:03.619244', '_unique_id': '5bfb8687d9e5407c861ae757d28cddb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[the chained kombu/oslo.messaging traceback logged here at 09:59:03.620 is identical to the one at 09:59:03.616 above and is elided]
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.621 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.621 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.622 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2fcfde8-abbc-47ec-8f94-072689b94fc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.622078', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a382fe-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': '9d1451f4713e74f34432217a719c72d66b2137eeb40699e64dfcfaba2db3672c'}]}, 'timestamp': '2026-02-01 09:59:03.622567', '_unique_id': '9815d32eea22411f868a567e19d5eae5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[the chained kombu/oslo.messaging traceback logged here at 09:59:03.623 is identical to the one at 09:59:03.616 above and is elided]
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.624 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.626 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb1ad8dd-d99a-447f-b3e6-0d69bffcf348', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.624884', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a3f16c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': '14f6784f69028d1d416d58b1c2812d5cffc01f8d3a33f6eb60041ec4d320bbf4'}]}, 'timestamp': '2026-02-01 09:59:03.625393', '_unique_id': 'b7d93d1b11b147049fcd0e941fb07f93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[the chained kombu/oslo.messaging traceback logged here at 09:59:03.626 is identical to the one at 09:59:03.616 above and is elided]
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.627 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.627 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
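Since every notification fails identically with errno 111, the first check is plain TCP reachability of the broker, independent of kombu. A sketch follows; host and port are placeholders, and 5672 is only the conventional AMQP port, which this log does not confirm:

```python
# Quick TCP probe for the failure mode seen above. Errno 111 (ECONNREFUSED)
# typically means nothing is listening on the port (e.g. rabbitmq-server is
# down) or the connection is being actively rejected.
import socket

def broker_reachable(host="controller", port=5672, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:  # ConnectionRefusedError is a subclass of OSError
        print(f"{host}:{port} -> {exc}")
        return False
```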
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a983740a-a495-44f9-9ab3-f9fa3d2b26dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.627670', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a45cf6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': 'd2ee32d67c30b891fa70266413d2d5b165c75f491db22c517682b4aa41623de0'}]}, 'timestamp': '2026-02-01 09:59:03.628177', '_unique_id': '90964d7380ef4c178819cd5b012d7f1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[the chained kombu/oslo.messaging traceback logged here at 09:59:03.629 is identical to the one at 09:59:03.616 above and is elided]
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.630 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.630 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
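The pollsters above emit three counter_type values: gauge (point-in-time, e.g. disk.device.usage), cumulative (monotonic totals, e.g. network.incoming.bytes), and delta (change since the last poll, e.g. network.incoming.bytes.delta). A sketch of the cumulative-to-delta relationship, illustrative only and not ceilometer code:

```python
# Sketch: deriving a per-interval delta from two cumulative readings -- the
# relationship between network.incoming.bytes (cumulative) and
# network.incoming.bytes.delta (delta) polled above.
def cumulative_to_delta(prev_volume, curr_volume):
    if prev_volume is None or curr_volume < prev_volume:
        # first observation, or counter reset (e.g. instance reboot)
        return 0
    return curr_volume - prev_volume

assert cumulative_to_delta(6874, 6874) == 0    # matches the delta of 0 logged above
assert cumulative_to_delta(6874, 7000) == 126
```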
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dedcda25-71eb-4e2c-9374-286bb2a0b43c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.630306', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a4c3c6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': '037deebc87f4115584c73755d1810f43e0e5efac9d361543f758b37e7d1e63a6'}]}, 'timestamp': '2026-02-01 09:59:03.630778', '_unique_id': '91165c488db94ce490207a3006055313'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 04:59:03
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.631 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0537824a-198d-4e12-ae39-38684ea6d6af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.632925', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a52b18-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '5c38ba22f01298fba4073c6b3e12ac73efe88735993950fda0d3b3409adb2590'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.632925', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a53b9e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '8872cdd96d21f4107581e48f72ae2c9899e0b25f03241a43ca671d192b0cc903'}]}, 'timestamp': '2026-02-01 09:59:03.633812', '_unique_id': '3de58d8d391c49feb5c0e672ac469388'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.635 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '41dcc315-23cb-490c-a660-a8fe66fa3ca1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.636040', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a5a5c0-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.780482875, 'message_signature': '4af2fea8eb29e61f2dc5576a5b60bc51740d21116a76aad3ce2e30744a725542'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.636040', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a5b754-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.780482875, 'message_signature': '5819ab3051f1786155a5eede6e24424275963ab4272ce771d9a4ab1a766b1e2c'}]}, 'timestamp': '2026-02-01 09:59:03.636981', '_unique_id': '5550382f731e42cdbd6e9a3a7065cf5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.637 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.639 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b3342b32-6768-46ac-beb8-26d629df2ef1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.639182', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a61f3c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': 'f20179cae5fef27b6e39e383813cdd5b706b41d878bf74cd8e94c252e9f17cca'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.639182', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a62fd6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '426889fc0876b4c40593213c304e4751fc66da204a4072afcd0fd4e168969259'}]}, 'timestamp': '2026-02-01 09:59:03.640100', '_unique_id': '1f767eacf9de425a9a6a6fea0cbd2c51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.641 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '48a74c71-8948-4c3e-a064-b41d17f6af5d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.642265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a69714-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '49b3447dd48338620a4d9416f8580daa3dce174f0557e23471c9bcdebb74d153'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.642265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a6a7d6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': 'e7a919a6037224d04dc2c70d920891a4019e3ca5c2d85dd103508b271e415057'}]}, 'timestamp': '2026-02-01 09:59:03.643166', '_unique_id': '5b8f8e32552b4a6ca5096eb00d351c14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.644 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'aa055026-711f-49eb-85c9-e7e04c9b731f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.645572', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a71824-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': 'd78d01e4b4a56738313bdca0cf2aaf9ef2d378d9dbd56328534d09014a6039d8'}]}, 'timestamp': '2026-02-01 09:59:03.646076', '_unique_id': 'e3e36200a9bf46d1bfae82dfed0e23ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.646 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.648 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.648 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 16880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e4c2c478-3b44-42c1-8605-b756f3330d77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16880000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T09:59:03.648213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a3a77ef4-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.769506407, 'message_signature': 'f4e1d6da24e614ce2042f38315698c09d85d90c44236dcc008f2437829eb1102'}]}, 'timestamp': '2026-02-01 09:59:03.648658', '_unique_id': 'fe9cf3ce747841dbb1e58aafcfb02ab4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.649 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.650 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b25f9aed-26bb-4a14-9813-7aeadff96cf6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.650761', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a7e3f8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': '9bf2fca0f8220c38a33ce466377c8a661c85c894706283a9eaf9ea744f78b8a5'}]}, 'timestamp': '2026-02-01 09:59:03.651262', '_unique_id': '417e1d8113604e6abfa221a2a2565783'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging 
ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.652 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.653 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.653 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.654 12 DEBUG ceilometer.compute.pollsters [-] 
08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30b9dcfe-cdbf-43d9-aff4-1fa80ebd58a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.653531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a851f8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': 'a6363635919be5685884b0f274143b4914496c6857fd097eac4f7b33878a6b76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.653531', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a86594-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '68ba46f8e42fc041bfcf642c3a9f7fc942413e398279349d2001bf28d0c05063'}]}, 'timestamp': '2026-02-01 09:59:03.654628', '_unique_id': '5ff274bcc0ec4285b07117c0cf57433d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:59:03.655 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.655 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.656 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.657 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f1838f-9c1e-4fa8-911d-63c71fb85f2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T09:59:03.657254', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'a3a8e154-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.773730648, 'message_signature': 'c6a0abd6415a6859709233567629d96fb1d54df188aafe3ec90284c017d3e88c'}]}, 'timestamp': '2026-02-01 09:59:03.657748', '_unique_id': 'd774621c26024b389ace5ac176587308'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.658 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.659 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.660 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.660 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a29ad23-8cc0-490e-8142-e7bb3550fbb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T09:59:03.659956', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a3a94bc6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '6ace3c3d1ea90d51617c3eeca4cfe61ba3d882f47c64a9d8a0983029fdad64a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T09:59:03.659956', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a3a95d82-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12037.803640797, 'message_signature': '8b5cc464965f0ddb9d7b317684f5c46a4977374c34f48ee92c1f658b5970bf0b'}]}, 'timestamp': '2026-02-01 09:59:03.660947', '_unique_id': 'd6c34a49219d48c2a10966b44f00df89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging yield Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 04:59:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 09:59:03.662 12 ERROR oslo_messaging.notify.messaging Feb 1 04:59:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:04.063 2 INFO neutron.agent.securitygroups_rpc [None req-497c1bff-4722-4a35-9304-716d637751a5 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:04 localhost nova_compute[274651]: 2026-02-01 09:59:04.284 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:04.483 2 INFO neutron.agent.securitygroups_rpc [None req-91776191-3760-464d-b953-295169a6f779 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
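The repeated tracebacks above all fail at the same point: kombu's Connection.ensure_connection() cannot reach the AMQP broker, and the underlying ConnectionRefusedError ([Errno 111]) is re-raised as kombu.exceptions.OperationalError, so every ceilometer sample notification in this window is dropped. A minimal probe of that same code path, assuming kombu is installed; the broker URL below is a placeholder, since the agent's real transport_url lives in its configuration and does not appear in this log:

# Minimal connectivity probe for the failure path in the tracebacks above.
# "amqp://guest:guest@localhost:5672//" is a placeholder URL, not the
# deployment's actual transport_url.
from kombu import Connection

def probe(url="amqp://guest:guest@localhost:5672//"):
    try:
        with Connection(url, connect_timeout=5) as conn:
            conn.ensure_connection(max_retries=1)  # same call that raises above
        print("broker reachable")
    except Exception as exc:  # kombu re-raises as OperationalError
        print(f"broker unreachable: {exc}")  # e.g. [Errno 111] Connection refused

if __name__ == "__main__":
    probe()

[Errno 111] means nothing accepted the TCP connection on the broker host/port at all, which points at the RabbitMQ service being down or unreachable rather than an authentication or permissions problem.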
Feb 1 04:59:04 localhost podman[324613]: 2026-02-01 09:59:04.747235039 +0000 UTC m=+0.099181211 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 04:59:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:04.752 2 INFO neutron.agent.securitygroups_rpc [None req-95923de5-5aba-4d8b-a050-6896df644f34 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:04 localhost podman[324613]: 2026-02-01 09:59:04.780562104 +0000 UTC m=+0.132508266 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 04:59:04 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 04:59:04 localhost dnsmasq[324533]: exiting on receipt of SIGTERM Feb 1 04:59:04 localhost podman[324653]: 2026-02-01 09:59:04.929977691 +0000 UTC m=+0.055393205 container kill 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:59:04 localhost systemd[1]: libpod-36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd.scope: Deactivated successfully. 
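The "Started /usr/bin/podman healthcheck run <id>" units above are systemd-driven container healthchecks; the subsequent health_status=healthy event is podman recording the result. A small wrapper around the same CLI call, assuming rootful podman (these containers are managed as root by edpm_ansible) and reusing the podman_exporter container name from the log:

# Runs the same command systemd invokes above; exit code 0 means the
# container's configured healthcheck passed.
import subprocess

def container_healthy(name: str) -> bool:
    result = subprocess.run(
        ["podman", "healthcheck", "run", name],
        capture_output=True, text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    print(container_healthy("podman_exporter"))  # name taken from the log above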
Feb 1 04:59:04 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:04.986 2 INFO neutron.agent.securitygroups_rpc [None req-99fb79a2-24fb-465b-80f5-1d805114aafe 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:05 localhost podman[324667]: 2026-02-01 09:59:05.003802401 +0000 UTC m=+0.056390786 container died 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 04:59:05 localhost podman[324667]: 2026-02-01 09:59:05.032762051 +0000 UTC m=+0.085350406 container cleanup 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:59:05 localhost systemd[1]: libpod-conmon-36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd.scope: Deactivated successfully. Feb 1 04:59:05 localhost podman[324668]: 2026-02-01 09:59:05.082932264 +0000 UTC m=+0.129902476 container remove 36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 04:59:05 localhost nova_compute[274651]: 2026-02-01 09:59:05.695 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:05 localhost systemd[1]: var-lib-containers-storage-overlay-46bc393696e8876c1fbd2f62eccf8f6c37cdd956a3dbdf8cdad00c2b97b890a1-merged.mount: Deactivated successfully. Feb 1 04:59:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36eaced8631eb8fbaf31c03396c50d51c8ac0470a67d30444e7b0eb1997277fd-userdata-shm.mount: Deactivated successfully. 
Feb 1 04:59:05 localhost nova_compute[274651]: 2026-02-01 09:59:05.886 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:05 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:05.935 2 INFO neutron.agent.securitygroups_rpc [None req-47fc05ae-3df2-436b-85c1-a738a10459e4 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:05 localhost podman[324747]: Feb 1 04:59:06 localhost podman[324747]: 2026-02-01 09:59:06.006377347 +0000 UTC m=+0.095191329 container create c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Feb 1 04:59:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:06.008 259320 INFO neutron.agent.linux.ip_lib [None req-81a7b585-72ab-4ef3-934e-ebf46e1f8275 - - - - - -] Device tap40a6b4fe-ca cannot be used as it has no MAC address#033[00m Feb 1 04:59:06 localhost nova_compute[274651]: 2026-02-01 09:59:06.042 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:06 localhost systemd[1]: Started libpod-conmon-c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9.scope. Feb 1 04:59:06 localhost kernel: device tap40a6b4fe-ca entered promiscuous mode Feb 1 04:59:06 localhost NetworkManager[5964]: [1769939946.0526] manager: (tap40a6b4fe-ca): new Generic device (/org/freedesktop/NetworkManager/Devices/77) Feb 1 04:59:06 localhost systemd-udevd[324772]: Network interface NamePolicy= disabled on kernel command line. Feb 1 04:59:06 localhost nova_compute[274651]: 2026-02-01 09:59:06.053 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:06 localhost ovn_controller[152492]: 2026-02-01T09:59:06Z|00451|binding|INFO|Claiming lport 40a6b4fe-ca01-4182-9750-1e2bb438a1ea for this chassis. 
Feb 1 04:59:06 localhost ovn_controller[152492]: 2026-02-01T09:59:06Z|00452|binding|INFO|40a6b4fe-ca01-4182-9750-1e2bb438a1ea: Claiming unknown Feb 1 04:59:06 localhost podman[324747]: 2026-02-01 09:59:05.957944878 +0000 UTC m=+0.046758860 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:59:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:06.071 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-de7f0928-15ba-48be-888b-30e3dafdaa75', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de7f0928-15ba-48be-888b-30e3dafdaa75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bdc37ab-2923-430c-80f3-a25c674895bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=40a6b4fe-ca01-4182-9750-1e2bb438a1ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:59:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:06.073 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 40a6b4fe-ca01-4182-9750-1e2bb438a1ea in datapath de7f0928-15ba-48be-888b-30e3dafdaa75 bound to our chassis#033[00m Feb 1 04:59:06 localhost systemd[1]: Started libcrun container. 
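The "Matched UPDATE: PortBindingUpdatedEvent(...)" debug record below comes from ovsdbapp's row-event machinery: the metadata agent registers RowEvent subclasses that filter on (event type, table, conditions), and run() is invoked for each matching OVSDB IDL update. A bare-bones sketch of that pattern, assuming ovsdbapp is available; the class and printed fields here are illustrative, not the agent's actual implementation:

from ovsdbapp import event as ovsdb_event

class PortBindingWatcher(ovsdb_event.RowEvent):
    # Mirrors the matched event above: fire on updates to Port_Binding rows,
    # with no extra column conditions.
    def __init__(self):
        super().__init__((self.ROW_UPDATE,), 'Port_Binding', None)
        self.event_name = 'PortBindingWatcher'

    def run(self, event, row, old):
        # 'row' is the updated Port_Binding; logical_port and chassis are
        # columns visible in the agent's debug output below.
        print(f"{event}: lport {row.logical_port} chassis={row.chassis}")

# An instance would be registered with the IDL's notify handler, e.g.
# idl.notify_handler.watch_event(PortBindingWatcher()), which is how the
# agent's events end up matched in ovsdbapp/backend/ovs_idl/event.py.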
Feb 1 04:59:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:06.076 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network de7f0928-15ba-48be-888b-30e3dafdaa75 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 04:59:06 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:06.077 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[7196d548-6d05-4a40-a91b-67bd52f8d430]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:59:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5231ca82c9f803945904c43420657264a4ce33a76676017273a9758328c4e13d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:59:06 localhost podman[324747]: 2026-02-01 09:59:06.088892715 +0000 UTC m=+0.177706627 container init c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:59:06 localhost ovn_controller[152492]: 2026-02-01T09:59:06Z|00453|binding|INFO|Setting lport 40a6b4fe-ca01-4182-9750-1e2bb438a1ea ovn-installed in OVS Feb 1 04:59:06 localhost ovn_controller[152492]: 2026-02-01T09:59:06Z|00454|binding|INFO|Setting lport 40a6b4fe-ca01-4182-9750-1e2bb438a1ea up in Southbound Feb 1 04:59:06 localhost nova_compute[274651]: 2026-02-01 09:59:06.093 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:06 localhost podman[324747]: 2026-02-01 09:59:06.098106059 +0000 UTC m=+0.186919981 container start c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:59:06 localhost dnsmasq[324778]: started, version 2.85 cachesize 150 Feb 1 04:59:06 localhost dnsmasq[324778]: DNS service limited to local subnets Feb 1 04:59:06 localhost dnsmasq[324778]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:59:06 localhost dnsmasq[324778]: warning: no upstream servers configured Feb 1 04:59:06 localhost dnsmasq-dhcp[324778]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 1 04:59:06 localhost dnsmasq[324778]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 0 addresses Feb 1 04:59:06 localhost dnsmasq-dhcp[324778]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host Feb 1 04:59:06 localhost 
dnsmasq-dhcp[324778]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts Feb 1 04:59:06 localhost nova_compute[274651]: 2026-02-01 09:59:06.133 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:06 localhost nova_compute[274651]: 2026-02-01 09:59:06.159 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:06 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:06.340 259320 INFO neutron.agent.dhcp.agent [None req-ce1f153b-b576-42e2-9444-4207d10afe0c - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'c7c66182-49e6-4bde-bbaf-7f80055fee26', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed#033[00m Feb 1 04:59:06 localhost dnsmasq[324778]: exiting on receipt of SIGTERM Feb 1 04:59:06 localhost podman[324812]: 2026-02-01 09:59:06.438063505 +0000 UTC m=+0.055299322 container kill c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:59:06 localhost systemd[1]: libpod-c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9.scope: Deactivated successfully. Feb 1 04:59:06 localhost podman[324827]: 2026-02-01 09:59:06.511131042 +0000 UTC m=+0.060756980 container died c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:59:06 localhost podman[324827]: 2026-02-01 09:59:06.545374255 +0000 UTC m=+0.095000133 container cleanup c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Feb 1 04:59:06 localhost systemd[1]: libpod-conmon-c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9.scope: Deactivated successfully. 
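Each dnsmasq restart in this window re-reads the per-network addn_hosts/host/opts files under /var/lib/neutron/dhcp/<network_id>/, as the "read ... - 0 addresses" records show. A quick way to see what dnsmasq will pick up, assuming the usual one-entry-per-line layout of these generated files (an assumption about their contents; the UUID is the network from the log):

# Counts non-blank lines in the files dnsmasq reloads above; it does not
# parse the entries, just reports how many there are per file.
from pathlib import Path

NET = "8236cd55-f7de-4bbb-a1c6-63cb66897e5f"  # network UUID from the log
BASE = Path("/var/lib/neutron/dhcp") / NET

for fname in ("host", "addn_hosts", "opts"):
    path = BASE / fname
    entries = (
        [l for l in path.read_text().splitlines() if l.strip()]
        if path.exists() else []
    )
    print(f"{path}: {len(entries)} entries")

An empty addn_hosts file here corresponds to the "addn_hosts - 0 addresses" lines dnsmasq logs above.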
Feb 1 04:59:06 localhost podman[324830]: 2026-02-01 09:59:06.582188547 +0000 UTC m=+0.126049248 container remove c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:59:06 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:06.681 2 INFO neutron.agent.securitygroups_rpc [None req-a0d41d31-bc4d-4dff-9a1f-c5fa8673a648 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:06 localhost systemd[1]: var-lib-containers-storage-overlay-5231ca82c9f803945904c43420657264a4ce33a76676017273a9758328c4e13d-merged.mount: Deactivated successfully. Feb 1 04:59:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c294d2c6996dc974ebd4fd55b9220bf4fdf452f48c0fd956601af7d707660ae9-userdata-shm.mount: Deactivated successfully. Feb 1 04:59:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e205 do_prune osdmap full prune enabled Feb 1 04:59:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e206 e206: 6 total, 6 up, 6 in Feb 1 04:59:06 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 6 up, 6 in Feb 1 04:59:06 localhost dnsmasq[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/addn_hosts - 0 addresses Feb 1 04:59:06 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/host Feb 1 04:59:06 localhost podman[324883]: 2026-02-01 09:59:06.949284899 +0000 UTC m=+0.081676584 container kill b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:59:06 localhost dnsmasq-dhcp[324153]: read /var/lib/neutron/dhcp/c6425ceb-3ede-433f-8bed-334b148920ea/opts Feb 1 04:59:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 04:59:06 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:06.990 2 INFO neutron.agent.securitygroups_rpc [None req-1e2106aa-eb56-4244-87ab-21e381223ca0 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']#033[00m Feb 1 04:59:07 localhost podman[324901]: 2026-02-01 09:59:07.117563795 +0000 UTC m=+0.145182047 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:59:07 localhost podman[324901]: 2026-02-01 09:59:07.148127545 +0000 UTC m=+0.175745797 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127) Feb 1 04:59:07 localhost podman[324931]: Feb 1 04:59:07 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:59:07 localhost podman[324931]: 2026-02-01 09:59:07.161369181 +0000 UTC m=+0.095220579 container create ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127) Feb 1 04:59:07 localhost systemd[1]: Started libpod-conmon-ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde.scope. Feb 1 04:59:07 localhost systemd[1]: Started libcrun container. Feb 1 04:59:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cdb98e0fc199f7d7648d1f6afac75f517d11b4e3b5202c181c31b442ccd66cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:59:07 localhost podman[324931]: 2026-02-01 09:59:07.213225007 +0000 UTC m=+0.147076445 container init ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 1 04:59:07 localhost podman[324931]: 2026-02-01 09:59:07.225489544 +0000 UTC m=+0.159340972 container start ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:59:07 localhost podman[324931]: 2026-02-01 09:59:07.126047546 +0000 UTC m=+0.059898974 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:59:07 localhost dnsmasq[324968]: started, version 2.85 cachesize 
150 Feb 1 04:59:07 localhost dnsmasq[324968]: DNS service limited to local subnets Feb 1 04:59:07 localhost dnsmasq[324968]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:59:07 localhost dnsmasq[324968]: warning: no upstream servers configured Feb 1 04:59:07 localhost dnsmasq-dhcp[324968]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:59:07 localhost dnsmasq[324968]: read /var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75/addn_hosts - 0 addresses Feb 1 04:59:07 localhost dnsmasq-dhcp[324968]: read /var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75/host Feb 1 04:59:07 localhost dnsmasq-dhcp[324968]: read /var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75/opts Feb 1 04:59:07 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:07.385 259320 INFO neutron.agent.dhcp.agent [None req-596a3434-40f6-4982-8c43-b0fd94ee06e9 - - - - - -] DHCP configuration for ports {'421616f2-bfd1-4720-be22-270495136549'} is completed#033[00m Feb 1 04:59:07 localhost nova_compute[274651]: 2026-02-01 09:59:07.402 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:07 localhost kernel: device tapf647677f-92 left promiscuous mode Feb 1 04:59:07 localhost ovn_controller[152492]: 2026-02-01T09:59:07Z|00455|binding|INFO|Releasing lport f647677f-9232-49ad-af03-658fa13da887 from this chassis (sb_readonly=0) Feb 1 04:59:07 localhost ovn_controller[152492]: 2026-02-01T09:59:07Z|00456|binding|INFO|Setting lport f647677f-9232-49ad-af03-658fa13da887 down in Southbound Feb 1 04:59:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:07.413 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-c6425ceb-3ede-433f-8bed-334b148920ea', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c6425ceb-3ede-433f-8bed-334b148920ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d835a02d-3e28-47a9-ba99-69bcb326f424, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f647677f-9232-49ad-af03-658fa13da887) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:59:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:07.415 158365 INFO neutron.agent.ovn.metadata.agent [-] Port f647677f-9232-49ad-af03-658fa13da887 in datapath c6425ceb-3ede-433f-8bed-334b148920ea unbound from our chassis#033[00m Feb 1 04:59:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:07.417 158365 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c6425ceb-3ede-433f-8bed-334b148920ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:59:07 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:07.418 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[67c39210-b415-4cf7-b98e-42a5dfd6a727]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:59:07 localhost nova_compute[274651]: 2026-02-01 09:59:07.421 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:07 localhost ovn_controller[152492]: 2026-02-01T09:59:07Z|00457|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:59:07 localhost nova_compute[274651]: 2026-02-01 09:59:07.705 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:07 localhost systemd[1]: tmp-crun.CXa2Cl.mount: Deactivated successfully. Feb 1 04:59:07 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:07.889 2 INFO neutron.agent.securitygroups_rpc [None req-29be5c8b-b62f-485c-998f-043fe218176b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['4c513797-4919-4e80-9d08-c2b88dcc61a1']#033[00m Feb 1 04:59:07 localhost podman[325012]: Feb 1 04:59:07 localhost podman[325012]: 2026-02-01 09:59:07.99331778 +0000 UTC m=+0.067423405 container create a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS) Feb 1 04:59:08 localhost systemd[1]: Started libpod-conmon-a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00.scope. Feb 1 04:59:08 localhost systemd[1]: Started libcrun container. 
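The podman create/init/start triple above, followed by dnsmasq logging one "read ..." line per file, is the DHCP agent's normal reload cycle: Neutron rewrites the per-network files under /var/lib/neutron/dhcp/<network-uuid>/ and then signals the dnsmasq sidecar. A minimal sketch of that contract, assuming a writable hostsfile and a known dnsmasq pid (the real agent signals through the container wrapper instead; the "container kill" events further down that are followed by re-reads rather than an exit are consistent with a HUP rather than a TERM):

    import os
    import signal

    # Illustrative values taken from the log above; the agent derives the
    # directory from the network UUID and tracks the dnsmasq pid itself.
    NETWORK_DIR = "/var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75"
    DNSMASQ_PID = 324968

    def reload_allocations(entries):
        # Rewrite the dhcp-hostsfile dnsmasq was started with.
        with open(os.path.join(NETWORK_DIR, "host"), "w") as f:
            for mac, ip, name in entries:
                f.write(f"{mac},{ip},{name}\n")
        # SIGHUP makes dnsmasq re-read its hosts/opts files without
        # restarting, which is what the "read .../host" lines record.
        os.kill(DNSMASQ_PID, signal.SIGHUP)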
Feb 1 04:59:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c730c8264ec213f8c227e4a1c20781aa973a62adc1fbbba4b816da8b859c8f18/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:59:08 localhost podman[325012]: 2026-02-01 09:59:08.055794322 +0000 UTC m=+0.129899917 container init a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:08 localhost podman[325012]: 2026-02-01 09:59:08.063669214 +0000 UTC m=+0.137774809 container start a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:59:08 localhost podman[325012]: 2026-02-01 09:59:07.969315382 +0000 UTC m=+0.043420977 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:59:08 localhost dnsmasq[325030]: started, version 2.85 cachesize 150 Feb 1 04:59:08 localhost dnsmasq[325030]: DNS service limited to local subnets Feb 1 04:59:08 localhost dnsmasq[325030]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:59:08 localhost dnsmasq[325030]: warning: no upstream servers configured Feb 1 04:59:08 localhost dnsmasq-dhcp[325030]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 1 04:59:08 localhost dnsmasq-dhcp[325030]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:59:08 localhost dnsmasq[325030]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 0 addresses Feb 1 04:59:08 localhost dnsmasq-dhcp[325030]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host Feb 1 04:59:08 localhost dnsmasq-dhcp[325030]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts Feb 1 04:59:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:08.109 259320 INFO neutron.agent.dhcp.agent [None req-223400bd-25c3-43ef-9ec3-4dd13133f93f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:59:06Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=781121cb-4b87-40bb-a162-d1ea2c06622d, ip_allocation=immediate, mac_address=fa:16:3e:7f:11:c5, name=tempest-PortsIpV6TestJSON-2139989431, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:10Z, description=, dns_domain=, 
id=8236cd55-f7de-4bbb-a1c6-63cb66897e5f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-test-network-407151453, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10540, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=2390, status=ACTIVE, subnets=['d6e0633c-6ef9-497b-a209-f204e52773e1', 'ed076557-8372-4246-860d-d2a0ac16af93'], tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:59:04Z, vlan_transparent=None, network_id=8236cd55-f7de-4bbb-a1c6-63cb66897e5f, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['4c513797-4919-4e80-9d08-c2b88dcc61a1'], standard_attr_id=2727, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:59:07Z on network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f#033[00m Feb 1 04:59:08 localhost dnsmasq[325030]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 1 addresses Feb 1 04:59:08 localhost dnsmasq-dhcp[325030]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host Feb 1 04:59:08 localhost dnsmasq-dhcp[325030]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts Feb 1 04:59:08 localhost podman[325047]: 2026-02-01 09:59:08.308111162 +0000 UTC m=+0.060013497 container kill a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 04:59:08 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:08.389 2 INFO neutron.agent.securitygroups_rpc [None req-70d5b65b-a947-49dc-a1b2-2f95b637bb85 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['a0337e8b-7eb2-4444-b9bf-a19f28129233']#033[00m Feb 1 04:59:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:08.410 259320 INFO neutron.agent.dhcp.agent [None req-434a5090-0eab-42f7-bedf-c175dce14920 - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'c7c66182-49e6-4bde-bbaf-7f80055fee26', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed#033[00m Feb 1 04:59:08 localhost ovn_controller[152492]: 2026-02-01T09:59:08Z|00458|binding|INFO|Removing iface tap40a6b4fe-ca ovn-installed in OVS Feb 1 04:59:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:08.469 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c1bd0805-a9fe-488f-8bb1-72cad3ae452d with type ""#033[00m Feb 1 04:59:08 localhost ovn_controller[152492]: 2026-02-01T09:59:08Z|00459|binding|INFO|Removing lport 40a6b4fe-ca01-4182-9750-1e2bb438a1ea ovn-installed in OVS Feb 1 04:59:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:08.471 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-de7f0928-15ba-48be-888b-30e3dafdaa75', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de7f0928-15ba-48be-888b-30e3dafdaa75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bdc37ab-2923-430c-80f3-a25c674895bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=40a6b4fe-ca01-4182-9750-1e2bb438a1ea) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:59:08 localhost nova_compute[274651]: 2026-02-01 09:59:08.471 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:08.473 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 40a6b4fe-ca01-4182-9750-1e2bb438a1ea in datapath de7f0928-15ba-48be-888b-30e3dafdaa75 unbound from our chassis#033[00m Feb 1 04:59:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:08.475 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network de7f0928-15ba-48be-888b-30e3dafdaa75, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:59:08 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:08.476 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[7efeb6c3-a864-44ef-ad89-466e1011638a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:59:08 localhost nova_compute[274651]: 2026-02-01 09:59:08.478 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:08 localhost nova_compute[274651]: 2026-02-01 09:59:08.485 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:08 localhost kernel: device tap40a6b4fe-ca left promiscuous mode Feb 1 04:59:08 localhost nova_compute[274651]: 2026-02-01 09:59:08.503 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:08 localhost dnsmasq[324153]: exiting on receipt of SIGTERM Feb 1 04:59:08 localhost podman[325082]: 2026-02-01 09:59:08.523834087 +0000 UTC m=+0.049462922 container kill b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 04:59:08 localhost systemd[1]: libpod-b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045.scope: Deactivated successfully. Feb 1 04:59:08 localhost podman[325100]: 2026-02-01 09:59:08.584224734 +0000 UTC m=+0.051611988 container died b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 04:59:08 localhost podman[325100]: 2026-02-01 09:59:08.661577844 +0000 UTC m=+0.128965068 container cleanup b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:08 localhost systemd[1]: libpod-conmon-b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045.scope: Deactivated successfully. Feb 1 04:59:08 localhost podman[325107]: 2026-02-01 09:59:08.685387716 +0000 UTC m=+0.139727509 container remove b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c6425ceb-3ede-433f-8bed-334b148920ea, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 04:59:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:08.712 259320 INFO neutron.agent.dhcp.agent [None req-ecf63cf6-d1ae-4828-9700-7e392cdef7e9 - - - - - -] DHCP configuration for ports {'781121cb-4b87-40bb-a162-d1ea2c06622d'} is completed#033[00m Feb 1 04:59:08 localhost sshd[325126]: main: sshd: ssh-rsa algorithm is disabled Feb 1 04:59:08 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:08.723 259320 INFO neutron.agent.dhcp.agent [None req-488b8839-ae82-46bc-95b4-e0957be02692 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:59:08 localhost systemd[1]: tmp-crun.1TVY7Y.mount: Deactivated successfully. Feb 1 04:59:08 localhost systemd[1]: var-lib-containers-storage-overlay-897bc60d82534fc1bef13a6bb48ad6db324a274fdf3b55819f2feef382288ca2-merged.mount: Deactivated successfully. Feb 1 04:59:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6bec8162e52a4cc594cef0b8c112f61e4353dd8291649a3779e9cdab9fab045-userdata-shm.mount: Deactivated successfully. 
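The "Matched DELETE: PortBindingDeletedEvent(...)" and "Matched UPDATE: ..." lines above show the metadata agent's ovsdbapp event hooks firing as OVN Southbound rows change. A minimal sketch of such a hook, mirroring the events/table/conditions tuple printed in the logged repr (registration on the IDL connection is omitted, and the print stands in for the agent's teardown logic):

    from ovsdbapp.backend.ovs_idl import event as row_event

    class PortBindingDeletedEvent(row_event.RowEvent):
        """Fires when a Port_Binding row is removed from the Southbound DB."""

        def __init__(self):
            # Matches the repr logged above:
            # events=('delete',), table='Port_Binding', conditions=None
            super().__init__((self.ROW_DELETE,), 'Port_Binding', None)

        def run(self, event, row, old):
            # row.logical_port and row.datapath are the fields the agent
            # reports as "Port <x> in datapath <y> unbound from our chassis".
            print('lport %s deleted from datapath %s'
                  % (row.logical_port, row.datapath))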
Feb 1 04:59:08 localhost systemd[1]: run-netns-qdhcp\x2dc6425ceb\x2d3ede\x2d433f\x2d8bed\x2d334b148920ea.mount: Deactivated successfully. Feb 1 04:59:09 localhost dnsmasq[324968]: read /var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75/addn_hosts - 0 addresses Feb 1 04:59:09 localhost podman[325145]: 2026-02-01 09:59:09.16278355 +0000 UTC m=+0.058404988 container kill ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 04:59:09 localhost dnsmasq-dhcp[324968]: read /var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75/host Feb 1 04:59:09 localhost dnsmasq-dhcp[324968]: read /var/lib/neutron/dhcp/de7f0928-15ba-48be-888b-30e3dafdaa75/opts Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent [None req-1ee8809d-a085-4055-b823-a0eba0841b65 - - - - - -] Unable to reload_allocations dhcp for de7f0928-15ba-48be-888b-30e3dafdaa75.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap40a6b4fe-ca not found in namespace qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75. Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 
259320 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent return fut.result() Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent raise self._exception Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 1 04:59:09 localhost 
neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap40a6b4fe-ca not found in namespace qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75. Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.186 259320 ERROR neutron.agent.dhcp.agent #033[00m Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.190 259320 INFO neutron.agent.dhcp.agent [None req-efd7b8dc-b4a5-49e8-855b-0e9a0906d325 - - - - - -] Synchronizing state#033[00m Feb 1 04:59:09 localhost ovn_controller[152492]: 2026-02-01T09:59:09Z|00460|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:59:09 localhost nova_compute[274651]: 2026-02-01 09:59:09.324 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.428 259320 INFO neutron.agent.dhcp.agent [None req-f436111c-9e7f-4a54-8276-e154f0339086 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 1 04:59:09 localhost dnsmasq[324968]: exiting on receipt of SIGTERM Feb 1 04:59:09 localhost systemd[1]: libpod-ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde.scope: Deactivated successfully. 
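The traceback above has three layers worth separating: the privsep daemon serializes the exception and remote_call() re-raises it in the agent process (raise exc_type(*result[2])), tenacity decides the exception is not retryable, and the failure surfaces through a concurrent.futures Future (fut.result()), which is why frames from tenacity/__init__.py and concurrent/futures/_base.py appear between the neutron frames. A self-contained reproduction of that re-raise path, with a stand-in exception class (the class name is borrowed from the log; nothing else here is neutron code):

    from tenacity import retry, retry_if_exception_type, stop_after_attempt

    class NetworkInterfaceNotFound(RuntimeError):
        """Stand-in for neutron.privileged.agent.linux.ip_lib's exception."""

    @retry(retry=retry_if_exception_type(OSError),   # NotFound is not retried
           stop=stop_after_attempt(3))
    def list_ip_routes(namespace):
        # The real call crosses into the privsep daemon, which ships the
        # exception type and args back over the channel; remote_call()
        # re-raises it here, the last frame in the traceback above.
        raise NetworkInterfaceNotFound(
            'Network interface tap40a6b4fe-ca not found in namespace '
            + namespace)

    try:
        list_ip_routes('qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75')
    except NetworkInterfaceNotFound as exc:
        # tenacity surfaces the non-retryable failure via Future.result().
        print('resync needed:', exc)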
Feb 1 04:59:09 localhost podman[325177]: 2026-02-01 09:59:09.600782651 +0000 UTC m=+0.059148640 container kill ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:09 localhost podman[325191]: 2026-02-01 09:59:09.684794165 +0000 UTC m=+0.061199614 container died ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 04:59:09 localhost podman[325191]: 2026-02-01 09:59:09.723942569 +0000 UTC m=+0.100348018 container cleanup ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 1 04:59:09 localhost systemd[1]: libpod-conmon-ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde.scope: Deactivated successfully. Feb 1 04:59:09 localhost systemd[1]: tmp-crun.cnG474.mount: Deactivated successfully. Feb 1 04:59:09 localhost systemd[1]: var-lib-containers-storage-overlay-8cdb98e0fc199f7d7648d1f6afac75f517d11b4e3b5202c181c31b442ccd66cc-merged.mount: Deactivated successfully. Feb 1 04:59:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde-userdata-shm.mount: Deactivated successfully. Feb 1 04:59:09 localhost podman[325193]: 2026-02-01 09:59:09.763060942 +0000 UTC m=+0.137957174 container remove ec0d18dd2149f4119a0573dc52ee73be2f3aba2a34d7b5eec9afcda04fb1cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 04:59:09 localhost systemd[1]: run-netns-qdhcp\x2dde7f0928\x2d15ba\x2d48be\x2d888b\x2d30e3dafdaa75.mount: Deactivated successfully. 
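The error itself is a teardown race: OVN released the port and the kernel logged "device tap40a6b4fe-ca left promiscuous mode" before the agent's reload_allocations tried to read routes through that device. A defensive existence check is straightforward with pyroute2, which is also what neutron's privileged ip_lib uses underneath; flags=0 keeps NetNS from creating a namespace that systemd has already torn down, as in the run-netns mount lines above:

    from pyroute2 import NetNS

    def tap_exists(namespace, devname):
        # Opens /var/run/netns/<namespace>; raises FileNotFoundError if the
        # namespace itself is already gone.
        ns = NetNS(namespace, flags=0)
        try:
            return bool(ns.link_lookup(ifname=devname))
        finally:
            ns.close()

    # tap_exists('qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75',
    #            'tap40a6b4fe-ca') -> False once the port is unbound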
Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.882 259320 INFO neutron.agent.dhcp.agent [-] Starting network c6425ceb-3ede-433f-8bed-334b148920ea dhcp configuration#033[00m Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.884 259320 INFO neutron.agent.dhcp.agent [-] Finished network c6425ceb-3ede-433f-8bed-334b148920ea dhcp configuration#033[00m Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.885 259320 INFO neutron.agent.dhcp.agent [None req-cc350183-15a8-4b4c-8c29-99bc2e367d05 - - - - - -] Synchronizing state complete#033[00m Feb 1 04:59:09 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:09.886 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:59:10 localhost dnsmasq[325030]: exiting on receipt of SIGTERM Feb 1 04:59:10 localhost podman[325239]: 2026-02-01 09:59:10.423696472 +0000 UTC m=+0.060468152 container kill a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:10 localhost systemd[1]: libpod-a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00.scope: Deactivated successfully. Feb 1 04:59:10 localhost podman[325254]: 2026-02-01 09:59:10.508379946 +0000 UTC m=+0.062025939 container died a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:59:10 localhost podman[325254]: 2026-02-01 09:59:10.550091239 +0000 UTC m=+0.103737182 container remove a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 04:59:10 localhost systemd[1]: libpod-conmon-a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00.scope: Deactivated successfully. Feb 1 04:59:10 localhost systemd[1]: var-lib-containers-storage-overlay-c730c8264ec213f8c227e4a1c20781aa973a62adc1fbbba4b816da8b859c8f18-merged.mount: Deactivated successfully. Feb 1 04:59:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a47072d4c536fdf8e8a4b56374d9939e04699addd42568b4ed0e56582ccd9a00-userdata-shm.mount: Deactivated successfully. 
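Note how the agent recovers from the failure: the ERROR is followed by "Synchronizing state", "All active networks have been fetched through RPC", a start/finish pair per network, and "Synchronizing state complete". A minimal sketch of that queue-the-failure-then-resync sequencing (all names here are hypothetical; only the ordering mirrors the agent's log messages):

    class FakeRpc:
        def get_active_networks(self):
            return ['c6425ceb-3ede-433f-8bed-334b148920ea']

    class DhcpAgentSketch:
        def __init__(self, rpc):
            self.rpc = rpc
            self.needs_resync = set()

        def call_driver(self, network_id, action):
            try:
                raise RuntimeError('tap device vanished')  # simulated failure
            except RuntimeError:
                # "Unable to reload_allocations ..." is logged, the network
                # is queued for resync, and the agent keeps running.
                self.needs_resync.add(network_id)

        def sync_state(self):
            active = set(self.rpc.get_active_networks())  # "fetched through RPC"
            for net in self.needs_resync | active:
                print('Starting network %s dhcp configuration' % net)
                print('Finished network %s dhcp configuration' % net)
            self.needs_resync.clear()
            print('Synchronizing state complete')

    agent = DhcpAgentSketch(FakeRpc())
    agent.call_driver('de7f0928-15ba-48be-888b-30e3dafdaa75',
                      'reload_allocations')
    agent.sync_state()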
Feb 1 04:59:10 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:10.769 2 INFO neutron.agent.securitygroups_rpc [None req-db14a021-48cb-407b-8edd-e94cee0b2d02 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['882cde13-f256-402b-ab30-a0fc50e38425', 'd59fe500-82e7-40fc-885e-589a886bd9ec', '4c513797-4919-4e80-9d08-c2b88dcc61a1']#033[00m Feb 1 04:59:10 localhost nova_compute[274651]: 2026-02-01 09:59:10.928 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:11 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:11.385 2 INFO neutron.agent.securitygroups_rpc [None req-3849eb70-2e22-4d07-ad6e-472e2f67b4ca 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['3d036e8e-d2c8-4e3a-9dbf-e906123b5f25']#033[00m Feb 1 04:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:59:11 localhost systemd[1]: tmp-crun.EZvZn8.mount: Deactivated successfully. Feb 1 04:59:11 localhost podman[325296]: 2026-02-01 09:59:11.748013474 +0000 UTC m=+0.097064867 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:59:11 localhost podman[325296]: 2026-02-01 09:59:11.756005899 +0000 UTC m=+0.105057342 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 04:59:11 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:59:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:12.162 2 INFO neutron.agent.securitygroups_rpc [None req-5ba6c6ad-3063-4fac-a109-b3e261c4ab28 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['3d036e8e-d2c8-4e3a-9dbf-e906123b5f25']#033[00m Feb 1 04:59:12 localhost podman[325353]: Feb 1 04:59:12 localhost podman[325353]: 2026-02-01 09:59:12.18940892 +0000 UTC m=+0.069297993 container create be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:12 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:12.191 2 INFO neutron.agent.securitygroups_rpc [None req-b3757230-46f4-432e-b101-9a3e15d9fc63 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['882cde13-f256-402b-ab30-a0fc50e38425', 'd59fe500-82e7-40fc-885e-589a886bd9ec']#033[00m Feb 1 04:59:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:59:12 localhost systemd[1]: Started libpod-conmon-be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f.scope. Feb 1 04:59:12 localhost systemd[1]: tmp-crun.C3njrp.mount: Deactivated successfully. Feb 1 04:59:12 localhost systemd[1]: Started libcrun container. 
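The node_exporter lines above show how these healthchecks are driven: systemd starts a transient "<container-id>.service" that executes /usr/bin/podman healthcheck run, podman records health_status=healthy plus an exec_died event for the finished check process, and the unit deactivates until the next timer tick. The same check can be run by hand; exit status 0 means healthy:

    import subprocess

    def container_is_healthy(name_or_id):
        # Runs the container's configured healthcheck command (the
        # '/openstack/healthcheck ...' test in the config_data above).
        result = subprocess.run(['podman', 'healthcheck', 'run', name_or_id],
                                capture_output=True, text=True)
        return result.returncode == 0

    # container_is_healthy('node_exporter') -> True while the check passes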
Feb 1 04:59:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1626cf3dfb4ca072bfe668f486989dad695f43be1d1fe4ac985a8ec8e4d34965/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:59:12 localhost podman[325353]: 2026-02-01 09:59:12.157930312 +0000 UTC m=+0.037819355 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 04:59:12 localhost podman[325353]: 2026-02-01 09:59:12.264488669 +0000 UTC m=+0.144377782 container init be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:12 localhost dnsmasq[325383]: started, version 2.85 cachesize 150 Feb 1 04:59:12 localhost dnsmasq[325383]: DNS service limited to local subnets Feb 1 04:59:12 localhost dnsmasq[325383]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:59:12 localhost dnsmasq[325383]: warning: no upstream servers configured Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d Feb 1 04:59:12 localhost dnsmasq[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 1 addresses Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts Feb 1 04:59:12 localhost podman[325367]: 2026-02-01 09:59:12.30840312 +0000 UTC m=+0.083300973 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, 
container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 04:59:12 localhost podman[325353]: 2026-02-01 09:59:12.328021244 +0000 UTC m=+0.207910307 container start be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 04:59:12 localhost podman[325367]: 2026-02-01 09:59:12.342157978 +0000 UTC m=+0.117055861 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 04:59:12 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:59:12 localhost nova_compute[274651]: 2026-02-01 09:59:12.369 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:12.384 259320 INFO neutron.agent.dhcp.agent [None req-bfa6156b-726c-484a-85f4-f992756b2f7c - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:59:06Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=781121cb-4b87-40bb-a162-d1ea2c06622d, ip_allocation=immediate, mac_address=fa:16:3e:7f:11:c5, name=tempest-PortsIpV6TestJSON-666824369, network_id=8236cd55-f7de-4bbb-a1c6-63cb66897e5f, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['882cde13-f256-402b-ab30-a0fc50e38425', 'd59fe500-82e7-40fc-885e-589a886bd9ec'], standard_attr_id=2727, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:59:10Z on network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f#033[00m Feb 1 04:59:12 localhost dnsmasq[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 1 addresses Feb 1 04:59:12 localhost podman[325412]: 2026-02-01 09:59:12.570241733 +0000 UTC m=+0.056852769 container kill be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts Feb 1 04:59:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:12.591 259320 INFO neutron.agent.dhcp.agent [None req-e28da5b2-94c7-4e9c-9af0-a11bb70b7733 - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'c7c66182-49e6-4bde-bbaf-7f80055fee26', '781121cb-4b87-40bb-a162-d1ea2c06622d', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed#033[00m Feb 1 04:59:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:12.807 259320 INFO neutron.agent.dhcp.agent [None req-ec958593-d161-4c9f-aca5-07d6129180fe - - - - - -] DHCP configuration for ports {'781121cb-4b87-40bb-a162-d1ea2c06622d'} is completed#033[00m Feb 1 04:59:12 localhost podman[325449]: 2026-02-01 09:59:12.849040209 +0000 UTC m=+0.042623143 container kill be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2) Feb 1 04:59:12 localhost dnsmasq[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 0 addresses Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host Feb 1 04:59:12 localhost dnsmasq-dhcp[325383]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts Feb 1 04:59:14 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:14.494 2 INFO neutron.agent.securitygroups_rpc [None req-5e11ff1d-8674-48ea-81ee-b80b81afee00 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['7c2cef09-1439-45eb-af5a-e316fd7a5ca9']#033[00m Feb 1 04:59:14 localhost ovn_controller[152492]: 2026-02-01T09:59:14Z|00461|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:59:14 localhost nova_compute[274651]: 2026-02-01 09:59:14.700 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:14 localhost dnsmasq[325383]: exiting on receipt of SIGTERM Feb 1 04:59:14 localhost podman[325486]: 2026-02-01 09:59:14.767814144 +0000 UTC m=+0.049555854 container kill be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:59:14 localhost systemd[1]: libpod-be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f.scope: Deactivated successfully. Feb 1 04:59:14 localhost podman[325505]: 2026-02-01 09:59:14.828872802 +0000 UTC m=+0.040816116 container died be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 04:59:14 localhost systemd[1]: var-lib-containers-storage-overlay-1626cf3dfb4ca072bfe668f486989dad695f43be1d1fe4ac985a8ec8e4d34965-merged.mount: Deactivated successfully. Feb 1 04:59:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f-userdata-shm.mount: Deactivated successfully. 
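The backslash sequences in unit names such as run-netns-qdhcp\x2dde7f0928... and the long overlay ...-merged.mount and ...-userdata-shm.mount units are not corruption: when systemd derives a unit name from a path it maps '/' to '-' and escapes a literal '-' inside a path element as \x2d. systemd-escape reproduces these names exactly:

    import subprocess

    def mount_unit_for(path):
        # Same escaping systemd applies to the mount units it reports
        # deactivating above.
        return subprocess.run(
            ['systemd-escape', '--path', '--suffix=mount', path],
            capture_output=True, text=True, check=True).stdout.strip()

    # mount_unit_for('/run/netns/qdhcp-de7f0928-15ba-48be-888b-30e3dafdaa75')
    #   -> 'run-netns-qdhcp\x2dde7f0928\x2d15ba\x2d48be\x2d888b\x2d30e3dafdaa75.mount'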
Feb 1 04:59:14 localhost podman[325505]: 2026-02-01 09:59:14.909542844 +0000 UTC m=+0.121486128 container remove be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 04:59:14 localhost systemd[1]: libpod-conmon-be6cabb0f2772d4e4035bc7626a4b52e993798e17e3b33a6be668ff88137227f.scope: Deactivated successfully. Feb 1 04:59:15 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:15.115 2 INFO neutron.agent.securitygroups_rpc [None req-d9b77fb6-ebc7-41c7-a5b0-8d63738e2d77 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['7c2cef09-1439-45eb-af5a-e316fd7a5ca9']#033[00m Feb 1 04:59:15 localhost nova_compute[274651]: 2026-02-01 09:59:15.350 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:15 localhost podman[325574]: Feb 1 04:59:15 localhost podman[325574]: 2026-02-01 09:59:15.683608811 +0000 UTC m=+0.094031442 container create bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:59:15 localhost systemd[1]: Started libpod-conmon-bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124.scope. Feb 1 04:59:15 localhost systemd[1]: Started libcrun container. 
Feb 1 04:59:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2db2f3668ba5eee837832d38abffbdc47abfe0fe7b74e7c99b15b527b51ee916/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 04:59:15 localhost podman[325574]: 2026-02-01 09:59:15.738658285 +0000 UTC m=+0.149080916 container init bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 1 04:59:15 localhost podman[325574]: 2026-02-01 09:59:15.640179476 +0000 UTC m=+0.050602177 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:59:15 localhost podman[325574]: 2026-02-01 09:59:15.747710233 +0000 UTC m=+0.158132884 container start bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 1 04:59:15 localhost dnsmasq[325592]: started, version 2.85 cachesize 150
Feb 1 04:59:15 localhost dnsmasq[325592]: DNS service limited to local subnets
Feb 1 04:59:15 localhost dnsmasq[325592]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 04:59:15 localhost dnsmasq[325592]: warning: no upstream servers configured
Feb 1 04:59:15 localhost dnsmasq-dhcp[325592]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 1 04:59:15 localhost dnsmasq-dhcp[325592]: DHCPv6, static leases only on 2001:db8:0:2::, lease time 1d
Feb 1 04:59:15 localhost dnsmasq[325592]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/addn_hosts - 0 addresses
Feb 1 04:59:15 localhost dnsmasq-dhcp[325592]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/host
Feb 1 04:59:15 localhost dnsmasq-dhcp[325592]: read /var/lib/neutron/dhcp/8236cd55-f7de-4bbb-a1c6-63cb66897e5f/opts
Feb 1 04:59:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e206 do_prune osdmap full prune enabled
Feb 1 04:59:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e207 e207: 6 total, 6 up, 6 in
Feb 1 04:59:15 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in
Feb 1 04:59:15 localhost nova_compute[274651]: 2026-02-01 09:59:15.974 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:15 localhost nova_compute[274651]: 2026-02-01 09:59:15.978 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:16 localhost dnsmasq[325592]: exiting on receipt of SIGTERM
Feb 1 04:59:16 localhost podman[325610]: 2026-02-01 09:59:16.100614038 +0000 UTC m=+0.061123192 container kill bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 04:59:16 localhost systemd[1]: libpod-bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124.scope: Deactivated successfully.
Feb 1 04:59:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:16.110 259320 INFO neutron.agent.dhcp.agent [None req-90f351c8-9c96-481c-a03a-d0586a245013 - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'c7c66182-49e6-4bde-bbaf-7f80055fee26', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed#033[00m
Feb 1 04:59:16 localhost podman[325623]: 2026-02-01 09:59:16.175379467 +0000 UTC m=+0.061297906 container died bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 1 04:59:16 localhost podman[325623]: 2026-02-01 09:59:16.212926662 +0000 UTC m=+0.098845031 container cleanup bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 04:59:16 localhost systemd[1]: libpod-conmon-bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124.scope: Deactivated successfully.
Feb 1 04:59:16 localhost podman[325625]: 2026-02-01 09:59:16.260606228 +0000 UTC m=+0.137173230 container remove bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8236cd55-f7de-4bbb-a1c6-63cb66897e5f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 1 04:59:16 localhost nova_compute[274651]: 2026-02-01 09:59:16.657 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:16 localhost ovn_controller[152492]: 2026-02-01T09:59:16Z|00462|binding|INFO|Releasing lport c7c66182-49e6-4bde-bbaf-7f80055fee26 from this chassis (sb_readonly=0)
Feb 1 04:59:16 localhost ovn_controller[152492]: 2026-02-01T09:59:16Z|00463|binding|INFO|Setting lport c7c66182-49e6-4bde-bbaf-7f80055fee26 down in Southbound
Feb 1 04:59:16 localhost kernel: device tapc7c66182-49 left promiscuous mode
Feb 1 04:59:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:16.664 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8:0:2::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-8236cd55-f7de-4bbb-a1c6-63cb66897e5f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8236cd55-f7de-4bbb-a1c6-63cb66897e5f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51441a28-7c57-4753-9c09-9a33cf92d6c8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c7c66182-49e6-4bde-bbaf-7f80055fee26) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:59:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:16.666 158365 INFO neutron.agent.ovn.metadata.agent [-] Port c7c66182-49e6-4bde-bbaf-7f80055fee26 in datapath 8236cd55-f7de-4bbb-a1c6-63cb66897e5f unbound from our chassis#033[00m
Feb 1 04:59:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:16.668 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8236cd55-f7de-4bbb-a1c6-63cb66897e5f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 1 04:59:16 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:16.669 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[89a2ce72-beb9-4d7c-8c0e-299cf4ad74bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:59:16 localhost nova_compute[274651]: 2026-02-01 09:59:16.684 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:16 localhost systemd[1]: tmp-crun.UqHCY9.mount: Deactivated successfully.
Feb 1 04:59:16 localhost systemd[1]: var-lib-containers-storage-overlay-2db2f3668ba5eee837832d38abffbdc47abfe0fe7b74e7c99b15b527b51ee916-merged.mount: Deactivated successfully.
Feb 1 04:59:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bdcb8f4033b8a3a71daf1772bea5716680c09c9fce88f2cbb5ce58309a69c124-userdata-shm.mount: Deactivated successfully.
Feb 1 04:59:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:59:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:16.880 259320 INFO neutron.agent.dhcp.agent [None req-8fb38ce5-8ee2-40c1-ac41-6f4f73ff817b - - - - - -] DHCP configuration for ports {'e5c15d2f-6a4e-437f-9a05-5dbbb79b3cd9', 'c7c66182-49e6-4bde-bbaf-7f80055fee26', 'eae9c3d8-c5cc-4401-abe4-334f5c801488'} is completed#033[00m
Feb 1 04:59:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e207 do_prune osdmap full prune enabled
Feb 1 04:59:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e208 e208: 6 total, 6 up, 6 in
Feb 1 04:59:16 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in
Feb 1 04:59:16 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:16.967 259320 INFO neutron.agent.dhcp.agent [None req-57cadd87-1f72-4ace-9cc0-d3d222b163c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:59:16 localhost systemd[1]: run-netns-qdhcp\x2d8236cd55\x2df7de\x2d4bbb\x2da1c6\x2d63cb66897e5f.mount: Deactivated successfully.
Feb 1 04:59:16 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:16.995 2 INFO neutron.agent.securitygroups_rpc [None req-7f0248f7-eb48-44f1-afa4-6c21da99fef7 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']#033[00m
Feb 1 04:59:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:17.109 2 INFO neutron.agent.securitygroups_rpc [None req-8aeaa6d8-5ca7-4c3a-b8a2-e5b99f7af336 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']#033[00m
Feb 1 04:59:17 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:17.132 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:59:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:17.314 2 INFO neutron.agent.securitygroups_rpc [None req-bfce6195-4572-4858-b958-8ae0eaa1dc23 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']#033[00m
Feb 1 04:59:17 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:17.535 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:59:17 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:17.773 2 INFO neutron.agent.securitygroups_rpc [None req-a584f8be-9a76-404e-9854-afe457f9fc2e 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']#033[00m
Feb 1 04:59:17 localhost ovn_controller[152492]: 2026-02-01T09:59:17Z|00464|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:59:17 localhost nova_compute[274651]: 2026-02-01 09:59:17.955 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:18 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:18.195 2 INFO neutron.agent.securitygroups_rpc [None req-e84e7596-d223-4f47-8254-074f89f65fdc 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']#033[00m
Feb 1 04:59:18 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:18.527 2 INFO neutron.agent.securitygroups_rpc [None req-e742ba15-30cb-436e-a4c2-2391ce2e0670 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']#033[00m
Feb 1 04:59:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:59:18 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2510115252' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:59:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:59:18 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2510115252' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 04:59:18 localhost podman[325664]: 2026-02-01 09:59:18.727975918 +0000 UTC m=+0.088451621 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 1 04:59:18 localhost podman[325664]: 2026-02-01 09:59:18.740181913 +0000 UTC m=+0.100657636 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, architecture=x86_64, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1769056855, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 1 04:59:18 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 04:59:18 localhost dnsmasq[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/addn_hosts - 0 addresses
Feb 1 04:59:18 localhost podman[325683]: 2026-02-01 09:59:18.81485448 +0000 UTC m=+0.112812430 container kill d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 1 04:59:18 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/host
Feb 1 04:59:18 localhost dnsmasq-dhcp[323026]: read /var/lib/neutron/dhcp/ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a/opts
Feb 1 04:59:18 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:18.881 2 INFO neutron.agent.securitygroups_rpc [None req-46338e73-87bf-491f-8ed0-9b7015f713f3 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']#033[00m
Feb 1 04:59:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e208 do_prune osdmap full prune enabled
Feb 1 04:59:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e209 e209: 6 total, 6 up, 6 in
Feb 1 04:59:18 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in
Feb 1 04:59:19 localhost ovn_controller[152492]: 2026-02-01T09:59:19Z|00465|binding|INFO|Releasing lport 467a2044-1273-4e6f-a459-60513fa8b688 from this chassis (sb_readonly=0)
Feb 1 04:59:19 localhost kernel: device tap467a2044-12 left promiscuous mode
Feb 1 04:59:19 localhost ovn_controller[152492]: 2026-02-01T09:59:19Z|00466|binding|INFO|Setting lport 467a2044-1273-4e6f-a459-60513fa8b688 down in Southbound
Feb 1 04:59:19 localhost nova_compute[274651]: 2026-02-01 09:59:19.060 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:19.070 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=69f0b901-aef5-455a-bf56-ef7db3227583, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=467a2044-1273-4e6f-a459-60513fa8b688) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:59:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:19.072 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 467a2044-1273-4e6f-a459-60513fa8b688 in datapath ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a unbound from our chassis#033[00m
Feb 1 04:59:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:19.076 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:59:19 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:19.077 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[303f10a1-7a92-4d4d-ae1e-8b75182d00d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:59:19 localhost nova_compute[274651]: 2026-02-01 09:59:19.085 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:19 localhost systemd[1]: tmp-crun.4nyooW.mount: Deactivated successfully.
Feb 1 04:59:19 localhost dnsmasq[323026]: exiting on receipt of SIGTERM
Feb 1 04:59:19 localhost podman[325730]: 2026-02-01 09:59:19.415446873 +0000 UTC m=+0.070658635 container kill d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 04:59:19 localhost systemd[1]: libpod-d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59.scope: Deactivated successfully.
Feb 1 04:59:19 localhost podman[325743]: 2026-02-01 09:59:19.469576898 +0000 UTC m=+0.038976240 container died d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 1 04:59:19 localhost podman[325743]: 2026-02-01 09:59:19.547375381 +0000 UTC m=+0.116774723 container cleanup d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 1 04:59:19 localhost systemd[1]: libpod-conmon-d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59.scope: Deactivated successfully.
Feb 1 04:59:19 localhost podman[325744]: 2026-02-01 09:59:19.576206958 +0000 UTC m=+0.140620916 container remove d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ade0a3ff-dcbd-41c6-8bc1-44d6c1f21e9a, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 1 04:59:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:19.614 259320 INFO neutron.agent.dhcp.agent [None req-249a7e26-e9c4-4f39-b8d8-cd411c7356d9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:59:19 localhost systemd[1]: var-lib-containers-storage-overlay-744ff171323d0a729cd92183d43c5c80ae24fc9b8f7475f722dd94deff9f45e5-merged.mount: Deactivated successfully.
Feb 1 04:59:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7930234a245d5ef345f2b06060b5834162ed9ae248bde1bfe2178c4685f0e59-userdata-shm.mount: Deactivated successfully.
Feb 1 04:59:19 localhost systemd[1]: run-netns-qdhcp\x2dade0a3ff\x2ddcbd\x2d41c6\x2d8bc1\x2d44d6c1f21e9a.mount: Deactivated successfully.
Feb 1 04:59:19 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:19.736 2 INFO neutron.agent.securitygroups_rpc [None req-7124cdef-fc68-40e6-a617-ec7afce96cf9 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['05a59877-b29c-4804-965e-2274924179d2']#033[00m
Feb 1 04:59:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:19.789 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 04:59:19 localhost ovn_controller[152492]: 2026-02-01T09:59:19Z|00467|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 04:59:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e209 do_prune osdmap full prune enabled
Feb 1 04:59:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e210 e210: 6 total, 6 up, 6 in
Feb 1 04:59:20 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in
Feb 1 04:59:20 localhost nova_compute[274651]: 2026-02-01 09:59:20.026 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e210 do_prune osdmap full prune enabled
Feb 1 04:59:21 localhost nova_compute[274651]: 2026-02-01 09:59:21.024 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e211 e211: 6 total, 6 up, 6 in
Feb 1 04:59:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in
Feb 1 04:59:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e211 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:59:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e211 do_prune osdmap full prune enabled
Feb 1 04:59:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e212 e212: 6 total, 6 up, 6 in
Feb 1 04:59:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in
Feb 1 04:59:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 04:59:21 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 04:59:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:59:23 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2418254129' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:59:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:59:23 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2418254129' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:59:23 localhost podman[236886]: time="2026-02-01T09:59:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 04:59:23 localhost podman[236886]: @ - - [01/Feb/2026:09:59:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 04:59:24 localhost podman[236886]: @ - - [01/Feb/2026:09:59:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18838 "" "Go-http-client/1.1"
Feb 1 04:59:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} v 0)
Feb 1 04:59:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 04:59:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 1 04:59:26 localhost nova_compute[274651]: 2026-02-01 09:59:26.026 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:59:26 localhost nova_compute[274651]: 2026-02-01 09:59:26.028 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 04:59:26 localhost nova_compute[274651]: 2026-02-01 09:59:26.028 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 04:59:26 localhost nova_compute[274651]: 2026-02-01 09:59:26.029 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:59:26 localhost nova_compute[274651]: 2026-02-01 09:59:26.064 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:26 localhost nova_compute[274651]: 2026-02-01 09:59:26.065 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 04:59:26 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 1 04:59:26 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 04:59:26 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 04:59:26 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 1 04:59:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 04:59:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 04:59:26 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 04:59:26 localhost podman[325771]: 2026-02-01 09:59:26.730230616 +0000 UTC m=+0.084712837 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 1 04:59:26 localhost podman[325771]: 2026-02-01 09:59:26.765934003 +0000 UTC m=+0.120416224 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 04:59:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 04:59:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e212 do_prune osdmap full prune enabled
Feb 1 04:59:26 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 04:59:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e213 e213: 6 total, 6 up, 6 in
Feb 1 04:59:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in
Feb 1 04:59:27 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:27.929 2 INFO neutron.agent.securitygroups_rpc [None req-1f1f07c2-7f26-4f08-b340-54cc58b1fa0f ce67f2e1bfb142d8acccf95caf1fd7af ab1e856df66342919053583b6afafe11 - - default default] Security group member updated ['0924a62e-9fd4-48bb-ad08-68af324d32a1']#033[00m
Feb 1 04:59:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e213 do_prune osdmap full prune enabled
Feb 1 04:59:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e214 e214: 6 total, 6 up, 6 in
Feb 1 04:59:28 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in
Feb 1 04:59:28 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:28.469 259320 INFO neutron.agent.linux.ip_lib [None req-bd79300c-ab86-47f8-b44e-085c1f83cd24 - - - - - -] Device tap28213aa4-c0 cannot be used as it has no MAC address#033[00m
Feb 1 04:59:28 localhost nova_compute[274651]: 2026-02-01 09:59:28.519 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:28 localhost kernel: device tap28213aa4-c0 entered promiscuous mode
Feb 1 04:59:28 localhost NetworkManager[5964]: [1769939968.5260] manager: (tap28213aa4-c0): new Generic device (/org/freedesktop/NetworkManager/Devices/78)
Feb 1 04:59:28 localhost ovn_controller[152492]: 2026-02-01T09:59:28Z|00468|binding|INFO|Claiming lport 28213aa4-c008-44c2-bb23-4233d3d92094 for this chassis.
Feb 1 04:59:28 localhost ovn_controller[152492]: 2026-02-01T09:59:28Z|00469|binding|INFO|28213aa4-c008-44c2-bb23-4233d3d92094: Claiming unknown
Feb 1 04:59:28 localhost nova_compute[274651]: 2026-02-01 09:59:28.527 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:28 localhost systemd-udevd[325801]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 04:59:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:28.542 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-faf87d87-32a3-4158-8b80-fc59d3eb8178', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-faf87d87-32a3-4158-8b80-fc59d3eb8178', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab1e856df66342919053583b6afafe11', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eca083c8-89ef-4622-95e8-a4b5e9b71100, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=28213aa4-c008-44c2-bb23-4233d3d92094) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 04:59:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:28.543 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 28213aa4-c008-44c2-bb23-4233d3d92094 in datapath faf87d87-32a3-4158-8b80-fc59d3eb8178 bound to our chassis#033[00m
Feb 1 04:59:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:28.545 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9487faef-99c3-4494-bb26-fe37f40d3cc1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 1 04:59:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:28.545 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network faf87d87-32a3-4158-8b80-fc59d3eb8178, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 04:59:28 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:28.545 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[05e94092-287b-4a43-8adf-36bcf514bf69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 04:59:28 localhost nova_compute[274651]: 2026-02-01 09:59:28.565 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:28 localhost ovn_controller[152492]: 2026-02-01T09:59:28Z|00470|binding|INFO|Setting lport 28213aa4-c008-44c2-bb23-4233d3d92094 ovn-installed in OVS
Feb 1 04:59:28 localhost ovn_controller[152492]: 2026-02-01T09:59:28Z|00471|binding|INFO|Setting lport 28213aa4-c008-44c2-bb23-4233d3d92094 up in Southbound
Feb 1 04:59:28 localhost nova_compute[274651]: 2026-02-01 09:59:28.569 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:28 localhost nova_compute[274651]: 2026-02-01 09:59:28.598 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:28 localhost nova_compute[274651]: 2026-02-01 09:59:28.621 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 04:59:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} v 0)
Feb 1 04:59:28 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 04:59:28 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 1 04:59:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 04:59:29 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/104439940' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 04:59:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 04:59:29 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/104439940' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 04:59:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e214 do_prune osdmap full prune enabled
Feb 1 04:59:29 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 1 04:59:29 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 04:59:29 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 04:59:29 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 1 04:59:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e215 e215: 6 total, 6 up, 6 in
Feb 1 04:59:29 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in
Feb 1 04:59:29 localhost podman[325856]:
Feb 1 04:59:29 localhost podman[325856]: 2026-02-01 09:59:29.473551142 +0000 UTC m=+0.098194081 container create a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 04:59:29 localhost podman[325856]: 2026-02-01 09:59:29.430385514 +0000 UTC m=+0.055028483 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 04:59:29 localhost systemd[1]: Started libpod-conmon-a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0.scope.
Feb 1 04:59:29 localhost systemd[1]: Started libcrun container.
Feb 1 04:59:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a5ebe97c8de4f9a94e4c03ffd57d6532d63b1fdb0404ae2fd6eddf5db8184060/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 04:59:29 localhost podman[325856]: 2026-02-01 09:59:29.566430219 +0000 UTC m=+0.191073168 container init a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 1 04:59:29 localhost podman[325856]: 2026-02-01 09:59:29.578082617 +0000 UTC m=+0.202725566 container start a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 04:59:29 localhost dnsmasq[325874]: started, version 2.85 cachesize 150 Feb 1 04:59:29 localhost dnsmasq[325874]: DNS service limited to local subnets Feb 1 04:59:29 localhost dnsmasq[325874]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 04:59:29 localhost dnsmasq[325874]: warning: no upstream servers configured Feb 1 04:59:29 localhost dnsmasq-dhcp[325874]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 04:59:29 localhost dnsmasq[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/addn_hosts - 0 addresses Feb 1 04:59:29 localhost dnsmasq-dhcp[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/host Feb 1 04:59:29 localhost dnsmasq-dhcp[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/opts Feb 1 04:59:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:29.636 259320 INFO neutron.agent.dhcp.agent [None req-c3388403-5dd7-49a6-be4d-ec4de6bbe8d1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:59:27Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d6984cab-2ad6-4c2b-814b-f84bf25987ec, ip_allocation=immediate, mac_address=fa:16:3e:7b:62:67, name=tempest-TagsExtTest-63550693, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:59:26Z, description=, dns_domain=, id=faf87d87-32a3-4158-8b80-fc59d3eb8178, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-78461852, port_security_enabled=True, project_id=ab1e856df66342919053583b6afafe11, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=62347, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2807, status=ACTIVE, subnets=['488332f7-f9c3-498f-95d2-2bf7c8a58569'], tags=[], tenant_id=ab1e856df66342919053583b6afafe11, updated_at=2026-02-01T09:59:26Z, vlan_transparent=None, network_id=faf87d87-32a3-4158-8b80-fc59d3eb8178, port_security_enabled=True, project_id=ab1e856df66342919053583b6afafe11, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['0924a62e-9fd4-48bb-ad08-68af324d32a1'], standard_attr_id=2817, status=DOWN, tags=[], tenant_id=ab1e856df66342919053583b6afafe11, updated_at=2026-02-01T09:59:27Z on network faf87d87-32a3-4158-8b80-fc59d3eb8178#033[00m Feb 1 04:59:29 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:29.769 259320 INFO neutron.agent.dhcp.agent [None req-c35a41b9-3f68-4104-9796-8aa179cb4b00 - - - - - -] DHCP configuration for ports {'40e7f94d-90fa-467f-b76c-16ec3f1660a6'} is completed#033[00m Feb 1 04:59:29 localhost dnsmasq[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/addn_hosts - 1 addresses Feb 1 04:59:29 localhost dnsmasq-dhcp[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/host Feb 1 04:59:29 localhost podman[325892]: 2026-02-01 09:59:29.853070115 +0000 UTC m=+0.060827851 container kill a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:59:29 localhost dnsmasq-dhcp[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/opts Feb 1 04:59:30 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:30.126 259320 INFO neutron.agent.dhcp.agent [None req-c32ed6bb-920f-4dd6-8e6f-90919bf05f36 - - - - - -] DHCP configuration for ports {'d6984cab-2ad6-4c2b-814b-f84bf25987ec'} is completed#033[00m Feb 1 04:59:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e215 do_prune osdmap full prune enabled Feb 1 04:59:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e216 e216: 6 total, 6 up, 6 in Feb 1 04:59:30 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in Feb 1 04:59:31 localhost nova_compute[274651]: 2026-02-01 09:59:31.096 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e216 do_prune osdmap full prune enabled Feb 1 04:59:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e217 e217: 6 total, 6 up, 6 in Feb 1 04:59:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in Feb 1 04:59:31 localhost openstack_network_exporter[239441]: ERROR 09:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 04:59:31 localhost openstack_network_exporter[239441]: Feb 1 04:59:31 localhost openstack_network_exporter[239441]: ERROR 09:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 
1 04:59:31 localhost openstack_network_exporter[239441]: Feb 1 04:59:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e217 do_prune osdmap full prune enabled Feb 1 04:59:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e218 e218: 6 total, 6 up, 6 in Feb 1 04:59:32 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in Feb 1 04:59:32 localhost ovn_controller[152492]: 2026-02-01T09:59:32Z|00472|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:59:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0) Feb 1 04:59:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 1 04:59:32 localhost nova_compute[274651]: 2026-02-01 09:59:32.911 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Feb 1 04:59:33 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=eve48,client_metadata.root=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b],prefix=session evict} (starting...) Feb 1 04:59:33 localhost neutron_sriov_agent[252126]: 2026-02-01 09:59:33.144 2 INFO neutron.agent.securitygroups_rpc [None req-842f7c09-bc9b-4f83-bb5e-50b631488f24 ce67f2e1bfb142d8acccf95caf1fd7af ab1e856df66342919053583b6afafe11 - - default default] Security group member updated ['0924a62e-9fd4-48bb-ad08-68af324d32a1']#033[00m Feb 1 04:59:33 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 1 04:59:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Feb 1 04:59:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 1 04:59:33 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Feb 1 04:59:33 localhost dnsmasq[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/addn_hosts - 0 addresses Feb 1 04:59:33 localhost dnsmasq-dhcp[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/host Feb 1 04:59:33 localhost podman[325929]: 2026-02-01 09:59:33.381125868 +0000 UTC m=+0.066303060 container kill a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:59:33 localhost dnsmasq-dhcp[325874]: read /var/lib/neutron/dhcp/faf87d87-32a3-4158-8b80-fc59d3eb8178/opts Feb 1 04:59:33 localhost dnsmasq[325874]: exiting on receipt of SIGTERM Feb 1 04:59:33 localhost podman[325968]: 2026-02-01 09:59:33.831192241 +0000 UTC m=+0.059726318 container kill a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0) Feb 1 04:59:33 localhost systemd[1]: libpod-a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0.scope: Deactivated successfully. Feb 1 04:59:33 localhost podman[325983]: 2026-02-01 09:59:33.912719129 +0000 UTC m=+0.060027098 container died a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 04:59:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0-userdata-shm.mount: Deactivated successfully. Feb 1 04:59:33 localhost systemd[1]: var-lib-containers-storage-overlay-a5ebe97c8de4f9a94e4c03ffd57d6532d63b1fdb0404ae2fd6eddf5db8184060-merged.mount: Deactivated successfully. 
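The podman "container kill" events above, one per rewrite of the network's addn_hosts file, are consistent with the DHCP agent signalling the containerized dnsmasq rather than restarting it: SIGHUP makes dnsmasq re-read its hosts files, while the SIGTERM at 04:59:33 shuts it down ahead of the namespace teardown that follows. A minimal sketch of the same signalling done by hand, using the container name taken from the kill events (illustrative only):

    import subprocess

    # Container name copied from the podman kill events above.
    name = "neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178"

    # SIGHUP: dnsmasq re-reads addn_hosts/host/opts without restarting.
    subprocess.run(["podman", "kill", "--signal", "HUP", name], check=True)

    # SIGTERM: dnsmasq exits ("exiting on receipt of SIGTERM" above).
    subprocess.run(["podman", "kill", "--signal", "TERM", name], check=True)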
Feb 1 04:59:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:33.966 158365 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 9487faef-99c3-4494-bb26-fe37f40d3cc1 with type ""#033[00m Feb 1 04:59:33 localhost ovn_controller[152492]: 2026-02-01T09:59:33Z|00473|binding|INFO|Removing iface tap28213aa4-c0 ovn-installed in OVS Feb 1 04:59:33 localhost ovn_controller[152492]: 2026-02-01T09:59:33Z|00474|binding|INFO|Removing lport 28213aa4-c008-44c2-bb23-4233d3d92094 ovn-installed in OVS Feb 1 04:59:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:33.968 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-faf87d87-32a3-4158-8b80-fc59d3eb8178', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-faf87d87-32a3-4158-8b80-fc59d3eb8178', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ab1e856df66342919053583b6afafe11', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eca083c8-89ef-4622-95e8-a4b5e9b71100, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=28213aa4-c008-44c2-bb23-4233d3d92094) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:59:33 localhost nova_compute[274651]: 2026-02-01 09:59:33.969 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:33.972 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 28213aa4-c008-44c2-bb23-4233d3d92094 in datapath faf87d87-32a3-4158-8b80-fc59d3eb8178 unbound from our chassis#033[00m Feb 1 04:59:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:33.974 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network faf87d87-32a3-4158-8b80-fc59d3eb8178, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 04:59:33 localhost nova_compute[274651]: 2026-02-01 09:59:33.976 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:33 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:33.976 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fd4952-1840-47f5-9e8e-591a72f0828d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 04:59:34 localhost podman[325983]: 2026-02-01 09:59:34.011020332 +0000 UTC m=+0.158328231 container remove a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-faf87d87-32a3-4158-8b80-fc59d3eb8178, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 04:59:34 localhost systemd[1]: libpod-conmon-a33e41e5d2ee08e349f1a9d89a073f0fcae68eb3e82b3131d3d1b13607f16ac0.scope: Deactivated successfully. Feb 1 04:59:34 localhost kernel: device tap28213aa4-c0 left promiscuous mode Feb 1 04:59:34 localhost nova_compute[274651]: 2026-02-01 09:59:34.021 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:34 localhost nova_compute[274651]: 2026-02-01 09:59:34.032 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:34.064 259320 INFO neutron.agent.dhcp.agent [None req-8371950d-f54a-446b-88b2-33d0b64833a2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:59:34 localhost neutron_dhcp_agent[259316]: 2026-02-01 09:59:34.181 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 04:59:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e218 do_prune osdmap full prune enabled Feb 1 04:59:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e219 e219: 6 total, 6 up, 6 in Feb 1 04:59:34 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in Feb 1 04:59:34 localhost systemd[1]: run-netns-qdhcp\x2dfaf87d87\x2d32a3\x2d4158\x2d8b80\x2dfc59d3eb8178.mount: Deactivated successfully. Feb 1 04:59:34 localhost ovn_controller[152492]: 2026-02-01T09:59:34Z|00475|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:59:34 localhost nova_compute[274651]: 2026-02-01 09:59:34.446 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
Feb 1 04:59:35 localhost podman[326005]: 2026-02-01 09:59:35.733427808 +0000 UTC m=+0.091072002 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 04:59:35 localhost podman[326005]: 2026-02-01 09:59:35.743191519 +0000 UTC m=+0.100835693 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 04:59:35 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
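The Started/health_status/exec_died/Deactivated quartet above is podman's healthcheck timer firing: a transient systemd unit runs the configured check inside the container and the result is recorded in the container state. A rough hand-run equivalent, under the assumption that nothing beyond stock podman is involved (the JSON key for the recorded state has moved between podman releases, so both spellings are probed):

    import json
    import subprocess

    name = "podman_exporter"

    # Run the configured healthcheck once, as the transient unit does.
    subprocess.run(["podman", "healthcheck", "run", name], check=True)

    # Read back the recorded state; newer podman exposes State.Health,
    # older releases used State.Healthcheck.
    out = subprocess.run(["podman", "inspect", name],
                         capture_output=True, text=True, check=True).stdout
    state = json.loads(out)[0]["State"]
    health = state.get("Health") or state.get("Healthcheck") or {}
    print(health.get("Status"))  # "healthy", matching health_status above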
Feb 1 04:59:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} v 0) Feb 1 04:59:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch Feb 1 04:59:36 localhost nova_compute[274651]: 2026-02-01 09:59:36.141 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished Feb 1 04:59:36 localhost ovn_controller[152492]: 2026-02-01T09:59:36Z|00476|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 04:59:36 localhost nova_compute[274651]: 2026-02-01 09:59:36.348 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:36 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Feb 1 04:59:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch Feb 1 04:59:36 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch Feb 1 04:59:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished Feb 1 04:59:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", 
"format":"json"} v 0) Feb 1 04:59:36 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2908926970' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:59:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:59:36 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2908926970' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:59:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e219 do_prune osdmap full prune enabled Feb 1 04:59:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e220 e220: 6 total, 6 up, 6 in Feb 1 04:59:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in Feb 1 04:59:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 04:59:37 localhost podman[326028]: 2026-02-01 09:59:37.353833098 +0000 UTC m=+0.080724334 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 1 04:59:37 localhost podman[326028]: 2026-02-01 09:59:37.384648166 +0000 UTC m=+0.111539342 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent) Feb 1 04:59:37 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 04:59:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0) Feb 1 04:59:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 1 04:59:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Feb 1 04:59:39 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 1 04:59:39 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Feb 1 04:59:39 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 1 04:59:39 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Feb 1 04:59:39 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=eve47,client_metadata.root=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b],prefix=session evict} (starting...) Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0. 
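The audited mon_command above is the mgr, acting for the Manila/CephFS driver, minting a per-share cephx identity: client.eve47 gets rw on one subvolume path in the MDS, rw on the share's RADOS namespace in pool manila_data, and read-only mon access; the matching "auth rm" plus MDS session evict at 04:59:39 is the revocation half of the same cycle. A CLI-equivalent sketch, with the entity and caps copied verbatim from the log:

    import subprocess

    # Same request the mgr dispatched to the mon at 04:59:36.
    cmd = [
        "ceph", "auth", "get-or-create", "client.eve47",
        "mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b",
        "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e",
        "mon", "allow r",
        "--format", "json",
    ]
    print(subprocess.run(cmd, capture_output=True, text=True,
                         check=True).stdout)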
Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.929031) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58 Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939980929100, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2576, "num_deletes": 276, "total_data_size": 3658508, "memory_usage": 3867664, "flush_reason": "Manual Compaction"} Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939980947051, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3567095, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31449, "largest_seqno": 34023, "table_properties": {"data_size": 3556054, "index_size": 7098, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25594, "raw_average_key_size": 22, "raw_value_size": 3533067, "raw_average_value_size": 3074, "num_data_blocks": 301, "num_entries": 1149, "num_filter_entries": 1149, "num_deletions": 276, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939851, "oldest_key_time": 1769939851, "file_creation_time": 1769939980, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}} Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 18073 microseconds, and 9471 cpu microseconds. Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.947108) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3567095 bytes OK Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.947137) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.949505) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.949527) EVENT_LOG_v1 {"time_micros": 1769939980949521, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.949553) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3647244, prev total WAL file size 3647244, number of live WAL files 2. Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.950568) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end) Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3483KB)], [57(18MB)] Feb 1 04:59:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939980950611, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 23027822, "oldest_snapshot_seqno": -1} Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13242 keys, 21758658 bytes, temperature: kUnknown Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939981045158, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 21758658, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21681386, "index_size": 43059, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33157, "raw_key_size": 355072, "raw_average_key_size": 26, "raw_value_size": 21454128, "raw_average_value_size": 1620, "num_data_blocks": 1623, "num_entries": 13242, "num_filter_entries": 13242, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939980, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}} Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.045775) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 21758658 bytes Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.048360) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 242.7 rd, 229.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 18.6 +0.0 blob) out(20.8 +0.0 blob), read-write-amplify(12.6) write-amplify(6.1) OK, records in: 13801, records dropped: 559 output_compression: NoCompression Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.048388) EVENT_LOG_v1 {"time_micros": 1769939981048377, "job": 34, "event": "compaction_finished", "compaction_time_micros": 94901, "compaction_time_cpu_micros": 55314, "output_level": 6, "num_output_files": 1, "total_output_size": 21758658, "num_input_records": 13801, "num_output_records": 13242, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939981049380, "job": 34, "event": "table_file_deletion", "file_number": 59} Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939981051473, "job": 34, "event": "table_file_deletion", "file_number": 57} Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:40.950482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.051668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.051677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.051680) [db/db_impl/db_impl_compaction_flush.cc:1903] 
[default] Manual compaction starting Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.051683) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:41 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:41.051686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:41 localhost nova_compute[274651]: 2026-02-01 09:59:41.176 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:41 localhost podman[326153]: 2026-02-01 09:59:41.322662718 +0000 UTC m=+0.101240846 container exec f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, build-date=2025-12-08T17:28:53Z, distribution-scope=public, release=1764794109, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 1 04:59:41 localhost podman[326153]: 2026-02-01 09:59:41.438497 +0000 UTC m=+0.217075138 container exec_died f7271fac7c7f1f14adf6250a3993cbedb59f586e79aa7b10cf68dde97c2833b4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604212, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True) Feb 1 04:59:41 localhost ovn_metadata_agent[158360]: 2026-02-01 
09:59:41.721 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 04:59:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:41.722 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 04:59:41 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:41.723 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 04:59:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e220 do_prune osdmap full prune enabled Feb 1 04:59:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e221 e221: 6 total, 6 up, 6 in Feb 1 04:59:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in Feb 1 04:59:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 04:59:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 04:59:41 localhost podman[326238]: 2026-02-01 09:59:41.93592337 +0000 UTC m=+0.097847720 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 04:59:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 04:59:41 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:41 localhost podman[326238]: 2026-02-01 09:59:41.968416419 +0000 UTC m=+0.130340789 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 04:59:41 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 04:59:42 localhost systemd[1]: tmp-crun.kz1Nky.mount: Deactivated successfully. 
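The amplification and throughput figures in the JOB 34 compaction summary above (04:59:41) follow directly from the logged sizes and duration; a quick check of the arithmetic:

    # From the summary: "MB in(3.4, 18.6 +0.0 blob) out(20.8 +0.0 blob),
    # read-write-amplify(12.6) write-amplify(6.1)", 94901 us compaction time.
    l0_in, base_in, out = 3.4, 18.6, 20.8            # MB, from the log
    print(f"write-amplify      {out / l0_in:.1f}")   # 6.1 = written / L0 input
    print(f"read-write-amplify {(l0_in + base_in + out) / l0_in:.1f}")  # 12.6
    print(f"read  {23027822 / 94901:.1f} MB/s")      # 242.7 = input_data_size / time
    print(f"write {21758658 / 94901:.1f} MB/s")      # 229.3 = output size / time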
Feb 1 04:59:42 localhost podman[326345]: 2026-02-01 09:59:42.740085984 +0000 UTC m=+0.095817009 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 1 04:59:42 localhost podman[326345]: 2026-02-01 09:59:42.779752194 +0000 UTC m=+0.135483229 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 1 04:59:42 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 04:59:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 1 04:59:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 04:59:43 localhost ceph-mon[286721]: log_channel(audit) 
log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M Feb 1 04:59:43 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M Feb 1 04:59:43 localhost ceph-mon[286721]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M Feb 1 04:59:43 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:59:43 localhost ceph-mon[286721]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 04:59:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:43 localhost ceph-mon[286721]: 
mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:59:43 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3868590087' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:59:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:59:44 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3868590087' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:59:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0) Feb 1 04:59:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 1 04:59:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Feb 1 04:59:44 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=eve49,client_metadata.root=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b],prefix=session evict} (starting...) Feb 1 04:59:44 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 1 04:59:44 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Feb 1 04:59:44 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 1 04:59:44 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Feb 1 04:59:46 localhost nova_compute[274651]: 2026-02-01 09:59:46.179 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:59:46 localhost nova_compute[274651]: 2026-02-01 09:59:46.181 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:59:46 localhost nova_compute[274651]: 2026-02-01 09:59:46.181 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:59:46 localhost nova_compute[274651]: 2026-02-01 09:59:46.181 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:59:46 localhost nova_compute[274651]: 2026-02-01 09:59:46.215 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:46 localhost nova_compute[274651]: 2026-02-01 09:59:46.216 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:59:46 localhost 
ceph-mon[286721]: mon.np0005604212@0(leader).osd e221 do_prune osdmap full prune enabled Feb 1 04:59:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e222 e222: 6 total, 6 up, 6 in Feb 1 04:59:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in Feb 1 04:59:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 04:59:46 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e222 do_prune osdmap full prune enabled Feb 1 04:59:47 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 04:59:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e223 e223: 6 total, 6 up, 6 in Feb 1 04:59:47 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in Feb 1 04:59:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e223 do_prune osdmap full prune enabled Feb 1 04:59:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e224 e224: 6 total, 6 up, 6 in Feb 1 04:59:49 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in Feb 1 04:59:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 04:59:49 localhost podman[326404]: 2026-02-01 09:59:49.737043421 +0000 UTC m=+0.088519613 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, version=9.7, architecture=x86_64, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 1 04:59:49 localhost podman[326404]: 2026-02-01 09:59:49.74933516 +0000 UTC m=+0.100811332 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.) 
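The osd_memory_target churn logged at 04:59:43 above is worth decoding: the mgr autotuner computes a per-host value of roughly 836.6M (877,243,801 bytes on np0005604215, 877,246,668 on np0005604212 and np0005604213), and the config layer rejects it because the option has a hard floor of 939,524,096 bytes (0.875 GiB). A minimal sketch of that check follows; the floor is taken verbatim from the error text, while the idea that the candidate value comes from dividing autotuned host memory across the OSDs is an assumption, not something these lines state.

# Minimal sketch of the osd_memory_target floor check reported above.
# The 939524096-byte minimum (0.875 GiB) is copied from the error message;
# the rest is illustrative.
OSD_MEMORY_TARGET_MIN = 939_524_096

def try_set_osd_memory_target(host: str, value: int) -> bool:
    print(f"Adjusting osd_memory_target on {host} to {value / 2**20:.1f}M")
    if value < OSD_MEMORY_TARGET_MIN:
        # Matches: "error parsing value: Value '...' is below minimum 939524096"
        print(f"Unable to set osd_memory_target on {host} to {value}: "
              f"error parsing value: Value '{value}' is below minimum "
              f"{OSD_MEMORY_TARGET_MIN}")
        return False
    return True

# The two candidate values the mgr computed for these hosts:
for host, value in [("np0005604215.localdomain", 877_243_801),
                    ("np0005604212.localdomain", 877_246_668)]:
    try_set_osd_memory_target(host, value)

Because the set never succeeds, the autotuner keeps clearing the per-OSD overrides instead, which matches the burst of "config rm ... osd_memory_target" dispatches for osd.0 through osd.5 in the same second.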
Feb 1 04:59:49 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 04:59:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 04:59:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1766604859' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 04:59:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 04:59:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1766604859' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 04:59:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e224 do_prune osdmap full prune enabled Feb 1 04:59:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e225 e225: 6 total, 6 up, 6 in Feb 1 04:59:50 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in Feb 1 04:59:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 04:59:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 04:59:51 localhost nova_compute[274651]: 2026-02-01 09:59:51.217 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:59:51 localhost nova_compute[274651]: 2026-02-01 09:59:51.219 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 04:59:51 localhost nova_compute[274651]: 2026-02-01 09:59:51.219 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 04:59:51 localhost nova_compute[274651]: 2026-02-01 09:59:51.219 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:59:51 localhost nova_compute[274651]: 2026-02-01 09:59:51.264 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:51 localhost nova_compute[274651]: 2026-02-01 09:59:51.265 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 04:59:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0. 
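The ovsdbapp DEBUG lines from nova_compute trace one complete keepalive cycle against the local OVSDB server: about five seconds of silence on tcp:127.0.0.1:6640, an inactivity probe, a transition to IDLE, then back to ACTIVE once the server replies. Below is a toy model of that cycle, assuming a 5000 ms probe interval (consistent with the repeated "idle 5003 ms" readings); it sketches the behaviour and is not the python-ovs reconnect implementation.

import time

PROBE_INTERVAL_MS = 5000  # assumed default, consistent with "idle 5003 ms"

class Reconnect:
    """Toy version of the ACTIVE/IDLE keepalive cycle in the log."""
    def __init__(self):
        self.state = "ACTIVE"
        self.last_activity = time.monotonic()

    def run(self):
        idle_ms = (time.monotonic() - self.last_activity) * 1000
        if self.state == "ACTIVE" and idle_ms >= PROBE_INTERVAL_MS:
            print(f"idle {idle_ms:.0f} ms, sending inactivity probe")
            self.state = "IDLE"       # "entering IDLE"

    def received(self):
        # any server reply counts as activity
        self.last_activity = time.monotonic()
        if self.state == "IDLE":
            self.state = "ACTIVE"     # "entering ACTIVE"

Read this way, the IDLE/ACTIVE pairs recurring every five seconds throughout this log are a quiet but healthy connection, not reconnect trouble.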
Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.812606) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991812686, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 544, "num_deletes": 258, "total_data_size": 436561, "memory_usage": 447832, "flush_reason": "Manual Compaction"} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991818751, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 430704, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34024, "largest_seqno": 34567, "table_properties": {"data_size": 427432, "index_size": 1127, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8498, "raw_average_key_size": 20, "raw_value_size": 420453, "raw_average_value_size": 1015, "num_data_blocks": 44, "num_entries": 414, "num_filter_entries": 414, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939981, "oldest_key_time": 1769939981, "file_creation_time": 1769939991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 6181 microseconds, and 2143 cpu microseconds. Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
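JOB 35's flush can be cross-checked entirely from numbers printed in the EVENT_LOG entries above: the memtable held 544 entries, the resulting L0 table #62 kept 414 of them in 430,704 bytes, and the flush took 6,181 microseconds. A quick verification pass follows; reading the 130-entry difference as keys overwritten within the memtable and merged at flush time is an interpretation, not something the log states.

# Cross-checking JOB 35's flush from the EVENT_LOG numbers above.
memtable_entries = 544        # "num_entries" in flush_started
table_entries    = 414        # "num_entries" in table_file_creation
raw_key_size     = 8_498
raw_value_size   = 420_453
file_size        = 430_704
flush_micros     = 6_181      # "Flush lasted 6181 microseconds"

print("avg key bytes  :", raw_key_size // table_entries)     # 20, matches raw_average_key_size
print("avg value bytes:", raw_value_size // table_entries)   # 1015, matches raw_average_value_size
print("entries merged during flush:", memtable_entries - table_entries)  # 130
print(f"flush throughput: ~{file_size / flush_micros:.1f} MB/s")  # bytes/us == MB/s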
Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.818796) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 430704 bytes OK Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.818816) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.821181) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.821203) EVENT_LOG_v1 {"time_micros": 1769939991821196, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.821225) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 433275, prev total WAL file size 433599, number of live WAL files 2. Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.821927) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303133' seq:72057594037927935, type:22 .. '6C6F676D0034323635' seq:0, type:0; will stop at (end) Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(420KB)], [60(20MB)] Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991822026, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 22189362, "oldest_snapshot_seqno": -1} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 13111 keys, 21455591 bytes, temperature: kUnknown Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991931827, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 21455591, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21380153, "index_size": 41545, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32837, "raw_key_size": 353583, "raw_average_key_size": 26, "raw_value_size": 21156105, "raw_average_value_size": 1613, "num_data_blocks": 1548, "num_entries": 13111, "num_filter_entries": 13111, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769939991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.932202) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 21455591 bytes Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.934561) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 201.9 rd, 195.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 20.8 +0.0 blob) out(20.5 +0.0 blob), read-write-amplify(101.3) write-amplify(49.8) OK, records in: 13656, records dropped: 545 output_compression: NoCompression Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.934592) EVENT_LOG_v1 {"time_micros": 1769939991934579, "job": 36, "event": "compaction_finished", "compaction_time_micros": 109884, "compaction_time_cpu_micros": 61172, "output_level": 6, "num_output_files": 1, "total_output_size": 21455591, "num_input_records": 13656, "num_output_records": 13111, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991934811, "job": 36, "event": "table_file_deletion", "file_number": 62} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991937899, "job": 36, "event": "table_file_deletion", "file_number": 60} Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.821802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.938025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.938045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.938049) [db/db_impl/db_impl_compaction_flush.cc:1903] 
[default] Manual compaction starting Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.938052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:51 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-09:59:51.938055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 04:59:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e225 do_prune osdmap full prune enabled Feb 1 04:59:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e226 e226: 6 total, 6 up, 6 in Feb 1 04:59:53 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in Feb 1 04:59:53 localhost podman[236886]: time="2026-02-01T09:59:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 04:59:53 localhost podman[236886]: @ - - [01/Feb/2026:09:59:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 04:59:54 localhost podman[236886]: @ - - [01/Feb/2026:09:59:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18838 "" "Go-http-client/1.1" Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.375 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.375 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.376 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.376 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 
1 04:59:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:54.439 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.440 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:54 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:54.441 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.820 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.842 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.843 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 04:59:54 localhost nova_compute[274651]: 2026-02-01 09:59:54.843 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:55 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Feb 1 04:59:56 localhost nova_compute[274651]: 2026-02-01 09:59:56.312 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 04:59:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 04:59:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e226 do_prune osdmap full prune enabled Feb 1 04:59:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e227 e227: 6 total, 6 up, 6 in Feb 1 04:59:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e227: 6 total, 6 up, 6 in Feb 1 04:59:57 localhost nova_compute[274651]: 2026-02-01 09:59:57.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:57 localhost nova_compute[274651]: 2026-02-01 09:59:57.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 04:59:57 localhost podman[326424]: 2026-02-01 09:59:57.728147545 +0000 UTC m=+0.089441792 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 1 04:59:57 localhost podman[326424]: 2026-02-01 09:59:57.736711599 +0000 UTC m=+0.098005836 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute) Feb 1 04:59:57 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
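Back on the mon store: the JOB 36 compaction summary logged at 09:59:51 reports several derived figures (201.9 MB/s read, 195.3 MB/s write, write-amplify 49.8, read-write-amplify 101.3), and all of them can be reproduced from the raw EVENT_LOG values. The formulas below are inferred from the numbers lining up, not quoted from RocksDB source.

# Reproducing the derived stats in JOB 36's "compacted to:" summary
# from the raw EVENT_LOG values above.
l0_input  = 430_704      # file 62, the freshly flushed L0 table
total_in  = 22_189_362   # "input_data_size" (L0 file 62 + L6 file 60)
total_out = 21_455_591   # file 63, the new L6 table
micros    = 109_884      # "compaction_time_micros"

print(f"write-amplify     : {total_out / l0_input:.1f}")               # 49.8
print(f"read-write-amplify: {(total_in + total_out) / l0_input:.1f}")  # 101.3
print(f"MB/sec rd         : {total_in / micros:.1f}")                  # 201.9
print(f"MB/sec wr         : {total_out / micros:.1f}")                 # 195.3

A ~50x write amplification on a 420 KB flush is unsurprising for a store shaped like this: the manual compaction rewrites the entire ~20 MB L6 file however small the L0 delta is.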
Feb 1 04:59:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e227 do_prune osdmap full prune enabled Feb 1 04:59:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e228 e228: 6 total, 6 up, 6 in Feb 1 04:59:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in Feb 1 04:59:58 localhost nova_compute[274651]: 2026-02-01 09:59:58.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:58 localhost nova_compute[274651]: 2026-02-01 09:59:58.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 04:59:58 localhost nova_compute[274651]: 2026-02-01 09:59:58.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 04:59:58 localhost ovn_metadata_agent[158360]: 2026-02-01 09:59:58.443 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 04:59:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e228 do_prune osdmap full prune enabled Feb 1 04:59:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e229 e229: 6 total, 6 up, 6 in Feb 1 04:59:59 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in Feb 1 05:00:00 localhost ceph-mon[286721]: log_channel(cluster) log [WRN] : overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 05:00:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:00:00 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:00:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e229 do_prune osdmap full prune enabled Feb 1 05:00:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e230 e230: 6 total, 6 up, 6 in Feb 1 05:00:00 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in Feb 1 05:00:00 localhost ceph-mon[286721]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.313 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.315 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.316 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.316 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.353 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:01 localhost nova_compute[274651]: 2026-02-01 10:00:01.354 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:00:01 localhost openstack_network_exporter[239441]: ERROR 10:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:00:01 localhost openstack_network_exporter[239441]: Feb 1 05:00:01 localhost openstack_network_exporter[239441]: ERROR 10:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:00:01 localhost openstack_network_exporter[239441]: Feb 1 05:00:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.293 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.294 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.294 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) 
update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.295 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:00:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:00:02 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1636548431' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.819 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.892 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:00:02 localhost nova_compute[274651]: 2026-02-01 10:00:02.893 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.134 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.136 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11198MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.136 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.136 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.199 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.199 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.200 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.245 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:00:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:00:03 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/246736743' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.753 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.760 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.912 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.915 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 05:00:03 localhost nova_compute[274651]: 2026-02-01 10:00:03.915 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:00:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:00:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.355 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.358 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.358 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.358 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.359 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.362 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
Feb 1 05:00:06 localhost podman[326489]: 2026-02-01 10:00:06.710689893 +0000 UTC m=+0.067119765 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:00:06 localhost podman[326489]: 2026-02-01 10:00:06.74569344 +0000 UTC m=+0.102123292 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 05:00:06 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
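Each "Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" on the nova side pairs with a "handle_command mon_command({"prefix": "df" ...})" dispatch on the mon side, which is how the hypervisor's disk figures above come from Ceph pool stats rather than local disk. A minimal reproduction of that poll using the exact command from the log; the "stats" / "total_bytes" / "total_avail_bytes" keys are the usual top-level fields of ceph df JSON output, but treat the parsing as a sketch.

import json
import subprocess

# The capacity poll nova logs above, issued verbatim.
cmd = ["ceph", "df", "--format=json", "--id", "openstack",
       "--conf", "/etc/ceph/ceph.conf"]
out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
stats = json.loads(out)["stats"]
print("total GiB:", stats["total_bytes"] / 2**30)
print("avail GiB:", stats["total_avail_bytes"] / 2**30)

The periodic task runs this twice per pass (once for the instance disk audit, once while refreshing inventory), which is why the audit channel records df dispatches from client.openstack in quick pairs at 10:00:02 and 10:00:03.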
Feb 1 05:00:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e230 do_prune osdmap full prune enabled
Feb 1 05:00:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e231 e231: 6 total, 6 up, 6 in
Feb 1 05:00:06 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in
Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.912 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:00:06 localhost nova_compute[274651]: 2026-02-01 10:00:06.939 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:00:07 localhost ovn_controller[152492]: 2026-02-01T10:00:07Z|00477|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 1 05:00:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 05:00:07 localhost podman[326512]: 2026-02-01 10:00:07.737288419 +0000 UTC m=+0.093411344 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:00:07 localhost podman[326512]: 2026-02-01 10:00:07.770574773 +0000 UTC m=+0.126697678 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 1 05:00:07 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 05:00:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:09 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/765974474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:09 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/765974474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2901407308' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2901407308' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:11 localhost nova_compute[274651]: 2026-02-01 10:00:11.363 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:11 localhost nova_compute[274651]: 2026-02-01 10:00:11.364 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:11 localhost nova_compute[274651]: 2026-02-01 10:00:11.365 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:11 localhost nova_compute[274651]: 2026-02-01 10:00:11.365 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:11 localhost nova_compute[274651]: 2026-02-01 10:00:11.393 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:11 localhost nova_compute[274651]: 2026-02-01 10:00:11.394 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4179659550' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4179659550' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 05:00:12 localhost podman[326530]: 2026-02-01 10:00:12.726035869 +0000 UTC m=+0.086469611 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 05:00:12 localhost podman[326530]: 2026-02-01 10:00:12.741585237 +0000 UTC m=+0.102018969 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 1 05:00:12 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 05:00:14 localhost podman[326553]: 2026-02-01 10:00:14.968777249 +0000 UTC m=+1.331397511 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller)
Feb 1 05:00:14 localhost podman[326553]: 2026-02-01 10:00:14.998537215 +0000 UTC m=+1.361157497 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 05:00:15 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:00:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e231 do_prune osdmap full prune enabled
Feb 1 05:00:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e232 e232: 6 total, 6 up, 6 in
Feb 1 05:00:16 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in
Feb 1 05:00:16 localhost nova_compute[274651]: 2026-02-01 10:00:16.394 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:16 localhost nova_compute[274651]: 2026-02-01 10:00:16.395 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:16 localhost nova_compute[274651]: 2026-02-01 10:00:16.396 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:16 localhost nova_compute[274651]: 2026-02-01 10:00:16.396 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:16 localhost nova_compute[274651]: 2026-02-01 10:00:16.397 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:16 localhost nova_compute[274651]: 2026-02-01 10:00:16.400 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e232 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e232 do_prune osdmap full prune enabled
Feb 1 05:00:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e233 e233: 6 total, 6 up, 6 in
Feb 1 05:00:19 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in
Feb 1 05:00:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e233 do_prune osdmap full prune enabled
Feb 1 05:00:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e234 e234: 6 total, 6 up, 6 in
Feb 1 05:00:20 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in
Feb 1 05:00:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:00:20 localhost podman[326578]: 2026-02-01 10:00:20.720033193 +0000 UTC m=+0.083702976 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 1 05:00:20 localhost podman[326578]: 2026-02-01 10:00:20.735425776 +0000 UTC m=+0.099095539 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., distribution-scope=public, release=1769056855, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7)
Feb 1 05:00:20 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 05:00:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:21 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:21 localhost nova_compute[274651]: 2026-02-01 10:00:21.400 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:21 localhost nova_compute[274651]: 2026-02-01 10:00:21.402 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:21 localhost nova_compute[274651]: 2026-02-01 10:00:21.403 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:21 localhost nova_compute[274651]: 2026-02-01 10:00:21.403 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:21 localhost nova_compute[274651]: 2026-02-01 10:00:21.419 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:21 localhost nova_compute[274651]: 2026-02-01 10:00:21.420 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:23 localhost podman[236886]: time="2026-02-01T10:00:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:00:23 localhost podman[236886]: @ - - [01/Feb/2026:10:00:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 05:00:23 localhost podman[236886]: @ - - [01/Feb/2026:10:00:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18848 "" "Go-http-client/1.1"
Feb 1 05:00:26 localhost nova_compute[274651]: 2026-02-01 10:00:26.421 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:26 localhost nova_compute[274651]: 2026-02-01 10:00:26.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:26 localhost nova_compute[274651]: 2026-02-01 10:00:26.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:26 localhost nova_compute[274651]: 2026-02-01 10:00:26.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:26 localhost nova_compute[274651]: 2026-02-01 10:00:26.450 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:26 localhost nova_compute[274651]: 2026-02-01 10:00:26.450 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e234 do_prune osdmap full prune enabled
Feb 1 05:00:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e235 e235: 6 total, 6 up, 6 in
Feb 1 05:00:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in
Feb 1 05:00:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:27 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e235 do_prune osdmap full prune enabled
Feb 1 05:00:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e236 e236: 6 total, 6 up, 6 in
Feb 1 05:00:27 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in
Feb 1 05:00:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:28 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 05:00:28 localhost podman[326598]: 2026-02-01 10:00:28.719593588 +0000 UTC m=+0.078307200 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 05:00:28 localhost podman[326598]: 2026-02-01 10:00:28.734386202 +0000 UTC m=+0.093099864 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 05:00:28 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 05:00:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:29 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/184422499' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:29 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/184422499' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:30 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:30 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3625315047' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:30 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3625315047' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:31 localhost nova_compute[274651]: 2026-02-01 10:00:31.454 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:31 localhost nova_compute[274651]: 2026-02-01 10:00:31.456 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:31 localhost nova_compute[274651]: 2026-02-01 10:00:31.456 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:31 localhost nova_compute[274651]: 2026-02-01 10:00:31.456 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:31 localhost nova_compute[274651]: 2026-02-01 10:00:31.489 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:31 localhost nova_compute[274651]: 2026-02-01 10:00:31.490 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:31 localhost openstack_network_exporter[239441]: ERROR 10:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:00:31 localhost openstack_network_exporter[239441]:
Feb 1 05:00:31 localhost openstack_network_exporter[239441]: ERROR 10:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:00:31 localhost openstack_network_exporter[239441]:
Feb 1 05:00:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:31 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e236 do_prune osdmap full prune enabled
Feb 1 05:00:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e237 e237: 6 total, 6 up, 6 in
Feb 1 05:00:33 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in
Feb 1 05:00:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:33 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1571698682' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1571698682' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:35 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:36 localhost nova_compute[274651]: 2026-02-01 10:00:36.491 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:36 localhost nova_compute[274651]: 2026-02-01 10:00:36.493 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:36 localhost nova_compute[274651]: 2026-02-01 10:00:36.493 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:36 localhost nova_compute[274651]: 2026-02-01 10:00:36.493 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:36 localhost nova_compute[274651]: 2026-02-01 10:00:36.522 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:36 localhost nova_compute[274651]: 2026-02-01 10:00:36.522 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e237 do_prune osdmap full prune enabled
Feb 1 05:00:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e238 e238: 6 total, 6 up, 6 in
Feb 1 05:00:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in
Feb 1 05:00:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 05:00:37 localhost podman[326618]: 2026-02-01 10:00:37.358200697 +0000 UTC m=+0.084759778 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 05:00:37 localhost podman[326618]: 2026-02-01 10:00:37.368548416 +0000 UTC m=+0.095107547 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 05:00:37 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 05:00:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e238 do_prune osdmap full prune enabled
Feb 1 05:00:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e239 e239: 6 total, 6 up, 6 in
Feb 1 05:00:37 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in
Feb 1 05:00:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:38 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:00:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:00:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:00:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 05:00:38 localhost podman[326641]: 2026-02-01 10:00:38.733547029 +0000 UTC m=+0.088821782 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:00:38 localhost podman[326641]: 2026-02-01 10:00:38.769491055 +0000 UTC m=+0.124765798 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 05:00:38 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 05:00:38 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 1 05:00:38 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:00:38 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:00:38 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:00:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e239 do_prune osdmap full prune enabled
Feb 1 05:00:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e240 e240: 6 total, 6 up, 6 in
Feb 1 05:00:39 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in
Feb 1 05:00:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:00:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3458269461' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:00:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:00:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3458269461' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:00:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:00:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:00:41 localhost nova_compute[274651]: 2026-02-01 10:00:41.523 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:41 localhost nova_compute[274651]: 2026-02-01 10:00:41.524 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:00:41 localhost nova_compute[274651]: 2026-02-01 10:00:41.524 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:00:41 localhost nova_compute[274651]: 2026-02-01 10:00:41.524 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:41 localhost nova_compute[274651]: 2026-02-01 10:00:41.572 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:00:41 localhost nova_compute[274651]: 2026-02-01 10:00:41.572 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:00:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:41.722 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:00:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:41.722 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:00:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:41.723 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:00:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 05:00:43 localhost systemd[1]: tmp-crun.qwqW1f.mount: Deactivated successfully.
Feb 1 05:00:43 localhost podman[326678]: 2026-02-01 10:00:43.466323997 +0000 UTC m=+0.105280160 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 05:00:43 localhost podman[326678]: 2026-02-01 10:00:43.477319885 +0000 UTC m=+0.116276058 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 05:00:43 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:00:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:00:43 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:00:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 05:00:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:00:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:00:44 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:00:44 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 05:00:44 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:00:44 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 1 05:00:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 05:00:45 localhost podman[326767]: 2026-02-01 10:00:45.741287968 +0000 UTC m=+0.093050253 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127) Feb 1 05:00:45 localhost podman[326767]: 2026-02-01 10:00:45.784578309 +0000 UTC m=+0.136340574 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 05:00:45 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 05:00:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 05:00:45 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 18K writes, 67K keys, 18K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s#012Cumulative WAL: 18K writes, 6173 syncs, 2.96 writes per sync, written: 0.05 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 12K writes, 41K keys, 12K commit groups, 1.0 writes per commit group, ingest: 28.67 MB, 0.05 MB/s#012Interval WAL: 12K writes, 5375 syncs, 2.32 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.573 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.609 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.610 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.610 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.611 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.611 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:00:46 localhost nova_compute[274651]: 2026-02-01 10:00:46.612 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 05:00:46 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:00:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:00:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e240 do_prune osdmap full prune enabled Feb 1 05:00:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e241 e241: 6 total, 6 up, 6 in Feb 1 05:00:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in Feb 1 05:00:47 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:00:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e241 do_prune osdmap full prune enabled Feb 1 05:00:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e242 e242: 6 total, 6 up, 6 in Feb 1 05:00:47 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in Feb 1 05:00:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:00:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:00:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.049 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.051 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 
seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.085 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:00:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3652661870' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:00:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:00:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3652661870' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:00:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} : dispatch Feb 1 05:00:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:00:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:00:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:00:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:00:49.411 259320 INFO neutron.agent.linux.ip_lib [None req-88ec7c36-8565-48fb-903d-895478f601f7 - - - - - -] Device tapaf29aadf-e3 cannot be used as it has no MAC address#033[00m Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.437 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost kernel: device tapaf29aadf-e3 entered promiscuous mode Feb 1 05:00:49 localhost NetworkManager[5964]: [1769940049.4479] manager: (tapaf29aadf-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/79) Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.450 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost 
ovn_controller[152492]: 2026-02-01T10:00:49Z|00478|binding|INFO|Claiming lport af29aadf-e35a-4249-9f4f-9187da8af7c4 for this chassis. Feb 1 05:00:49 localhost systemd-udevd[326800]: Network interface NamePolicy= disabled on kernel command line. Feb 1 05:00:49 localhost ovn_controller[152492]: 2026-02-01T10:00:49Z|00479|binding|INFO|af29aadf-e35a-4249-9f4f-9187da8af7c4: Claiming unknown Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.456 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.461 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-e3cfe08b-56b5-4442-afd3-57110a8d1cf9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3cfe08b-56b5-4442-afd3-57110a8d1cf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f80f1b0657846c89a7808fa81feb44c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=375fcfe7-f7ed-4152-bd4b-b3cfbaefb8a0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=af29aadf-e35a-4249-9f4f-9187da8af7c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.463 158365 INFO neutron.agent.ovn.metadata.agent [-] Port af29aadf-e35a-4249-9f4f-9187da8af7c4 in datapath e3cfe08b-56b5-4442-afd3-57110a8d1cf9 bound to our chassis#033[00m Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.469 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3b39cfd5-53d5-406e-9a1c-1e572e66c5a9 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.469 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3cfe08b-56b5-4442-afd3-57110a8d1cf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 05:00:49 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:49.470 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a5c0ea-67b0-4ffd-9568-849e497ef48c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.480 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost ovn_controller[152492]: 2026-02-01T10:00:49Z|00480|binding|INFO|Setting lport af29aadf-e35a-4249-9f4f-9187da8af7c4 ovn-installed in OVS Feb 1 05:00:49 localhost ovn_controller[152492]: 2026-02-01T10:00:49Z|00481|binding|INFO|Setting lport af29aadf-e35a-4249-9f4f-9187da8af7c4 up in Southbound Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost journal[217584]: ethtool ioctl error on tapaf29aadf-e3: No such device Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.521 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost nova_compute[274651]: 2026-02-01 10:00:49.552 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 05:00:49 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 15K writes, 59K keys, 15K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s#012Cumulative WAL: 15K writes, 5199 syncs, 2.97 writes per sync, written: 0.05 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 36K keys, 10K commit groups, 1.0 writes per commit group, ingest: 27.82 MB, 0.05 MB/s#012Interval WAL: 10K writes, 4329 syncs, 2.35 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 1 05:00:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:00:50 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:00:50 localhost podman[326871]: Feb 1 05:00:50 localhost podman[326871]: 2026-02-01 10:00:50.474173718 +0000 UTC m=+0.090384400 container create 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 05:00:50 localhost nova_compute[274651]: 2026-02-01 10:00:50.514 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:50 localhost systemd[1]: Started 
libpod-conmon-0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081.scope. Feb 1 05:00:50 localhost podman[326871]: 2026-02-01 10:00:50.432436085 +0000 UTC m=+0.048646817 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 05:00:50 localhost systemd[1]: Started libcrun container. Feb 1 05:00:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bde68ce32cde4edeb797b9689e3d471ba373e9a71eff705c4ee2ea88066b9419/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 05:00:50 localhost podman[326871]: 2026-02-01 10:00:50.558151711 +0000 UTC m=+0.174362393 container init 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3) Feb 1 05:00:50 localhost podman[326871]: 2026-02-01 10:00:50.567160858 +0000 UTC m=+0.183371530 container start 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 1 05:00:50 localhost dnsmasq[326889]: started, version 2.85 cachesize 150 Feb 1 05:00:50 localhost dnsmasq[326889]: DNS service limited to local subnets Feb 1 05:00:50 localhost dnsmasq[326889]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 05:00:50 localhost dnsmasq[326889]: warning: no upstream servers configured Feb 1 05:00:50 localhost dnsmasq-dhcp[326889]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 05:00:50 localhost dnsmasq[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/addn_hosts - 0 addresses Feb 1 05:00:50 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/host Feb 1 05:00:50 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/opts Feb 1 05:00:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:00:50.726 259320 INFO neutron.agent.dhcp.agent [None req-898055c9-2cf2-4e36-a527-5f27d1b4aa23 - - - - - -] DHCP configuration for ports {'e3a1ae3f-96cf-4620-8508-a69fe064867d'} is completed#033[00m Feb 1 05:00:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:00:51.044 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:00:50Z, description=, device_id=cc429be4-9929-4d24-b8d5-e9a71d01dbb9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, 
extra_dhcp_opts=[], fixed_ips=[], id=08595140-bac5-4398-8521-64695d510d33, ip_allocation=immediate, mac_address=fa:16:3e:b0:63:f3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:00:46Z, description=, dns_domain=, id=e3cfe08b-56b5-4442-afd3-57110a8d1cf9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1119220909-network, port_security_enabled=True, project_id=3f80f1b0657846c89a7808fa81feb44c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28826, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3201, status=ACTIVE, subnets=['313d1a7a-6440-4248-9703-639a850a0c4b'], tags=[], tenant_id=3f80f1b0657846c89a7808fa81feb44c, updated_at=2026-02-01T10:00:47Z, vlan_transparent=None, network_id=e3cfe08b-56b5-4442-afd3-57110a8d1cf9, port_security_enabled=False, project_id=3f80f1b0657846c89a7808fa81feb44c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3209, status=DOWN, tags=[], tenant_id=3f80f1b0657846c89a7808fa81feb44c, updated_at=2026-02-01T10:00:50Z on network e3cfe08b-56b5-4442-afd3-57110a8d1cf9#033[00m Feb 1 05:00:51 localhost dnsmasq[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/addn_hosts - 1 addresses Feb 1 05:00:51 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/host Feb 1 05:00:51 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/opts Feb 1 05:00:51 localhost podman[326905]: 2026-02-01 10:00:51.251812497 +0000 UTC m=+0.062268796 container kill 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 05:00:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e242 do_prune osdmap full prune enabled Feb 1 05:00:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e243 e243: 6 total, 6 up, 6 in Feb 1 05:00:51 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in Feb 1 05:00:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
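The ceph-osd rocksdb "DUMPING STATS" records a few entries above arrive as single syslog lines with control characters octal-escaped: `#012` is a newline and `#033` is the ESC that starts the `#033[00m` ANSI reset trailing the oslo log lines. The `#DDD` convention is standard rsyslog/journal escaping; the extraction regex below is otherwise an assumption. A short decoder plus an example of recovering the cumulative write counters:

```python
import re

OCTAL_ESC = re.compile(r"#(\d{3})")

def unescape_syslog(record: str) -> str:
    """Undo rsyslog-style #DDD octal escaping (#012 = newline, #033 = ESC)."""
    return OCTAL_ESC.sub(lambda m: chr(int(m.group(1), 8)), record)

# After unescaping, the rocksdb block is ordinary multi-line text.
WRITES_RE = re.compile(r"Cumulative writes: (\S+) writes, (\S+) keys")

def cumulative_writes(record):
    m = WRITES_RE.search(unescape_syslog(record))
    return m.groups() if m else None

print(cumulative_writes(
    "** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval"
    "#012Cumulative writes: 15K writes, 59K keys, 15K commit groups"
))  # -> ('15K', '59K')
```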
Feb 1 05:00:51 localhost podman[326924]: 2026-02-01 10:00:51.488159656 +0000 UTC m=+0.091970470 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, distribution-scope=public, release=1769056855) Feb 1 05:00:51 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:00:51.504 259320 INFO neutron.agent.dhcp.agent [None req-718e459f-0949-4fcb-9a72-ec922b73903f - - - - - -] DHCP configuration for ports {'08595140-bac5-4398-8521-64695d510d33'} is completed#033[00m Feb 1 05:00:51 localhost podman[326924]: 2026-02-01 10:00:51.529415505 +0000 UTC m=+0.133226289 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, 
name=openstack_network_exporter, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 1 05:00:51 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 05:00:51 localhost nova_compute[274651]: 2026-02-01 10:00:51.650 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:51 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464],prefix=session evict} (starting...) 
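The `asok_command: session evict` entry just above shows the MDS evicting a CephFS client by cephx name and subvolume root, the same filter pair used for the tempest cleanups later in this window. A sketch that recovers those filters from such a line; the filter keys `auth_name` and `client_metadata.root` are verbatim from the log, while the helper name is hypothetical.

```python
import re

EVICT_RE = re.compile(
    r"session evict \{filters=\[auth_name=(?P<auth>[^,]+),"
    r"client_metadata\.root=(?P<root>[^\]]+)\]"
)

def parse_evict(line):
    """Hypothetical helper: returns (auth_name, subvolume_root) or None."""
    m = EVICT_RE.search(line)
    return (m["auth"], m["root"]) if m else None

line = ("mds.mds.np0005604212.tkdkxt asok_command: session evict "
        "{filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/"
        "ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464],"
        "prefix=session evict} (starting...)")
print(parse_evict(line))  # ('Joe', '/volumes/_nogroup/ebeb9c1e-.../8b5ea12c-...')
```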
Feb 1 05:00:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:00:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:00:52.081 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:00:50Z, description=, device_id=cc429be4-9929-4d24-b8d5-e9a71d01dbb9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=08595140-bac5-4398-8521-64695d510d33, ip_allocation=immediate, mac_address=fa:16:3e:b0:63:f3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:00:46Z, description=, dns_domain=, id=e3cfe08b-56b5-4442-afd3-57110a8d1cf9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1119220909-network, port_security_enabled=True, project_id=3f80f1b0657846c89a7808fa81feb44c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28826, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3201, status=ACTIVE, subnets=['313d1a7a-6440-4248-9703-639a850a0c4b'], tags=[], tenant_id=3f80f1b0657846c89a7808fa81feb44c, updated_at=2026-02-01T10:00:47Z, vlan_transparent=None, network_id=e3cfe08b-56b5-4442-afd3-57110a8d1cf9, port_security_enabled=False, project_id=3f80f1b0657846c89a7808fa81feb44c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3209, status=DOWN, tags=[], tenant_id=3f80f1b0657846c89a7808fa81feb44c, updated_at=2026-02-01T10:00:50Z on network e3cfe08b-56b5-4442-afd3-57110a8d1cf9#033[00m Feb 1 05:00:52 localhost dnsmasq[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/addn_hosts - 1 addresses Feb 1 05:00:52 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/host Feb 1 05:00:52 localhost podman[326964]: 2026-02-01 10:00:52.31511165 +0000 UTC m=+0.054955650 container kill 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Feb 1 05:00:52 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/opts Feb 1 05:00:52 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:00:52.547 259320 INFO neutron.agent.dhcp.agent [None req-5587e523-9316-40cd-9d2b-0fb32673cc5d - - - - - -] DHCP configuration for ports {'08595140-bac5-4398-8521-64695d510d33'} is completed#033[00m Feb 1 05:00:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e243 do_prune osdmap full prune enabled Feb 1 05:00:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:00:53 
localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:00:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e244 e244: 6 total, 6 up, 6 in Feb 1 05:00:53 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in Feb 1 05:00:53 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:00:53 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:00:53 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:00:53 localhost podman[236886]: time="2026-02-01T10:00:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:00:53 localhost podman[236886]: @ - - [01/Feb/2026:10:00:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158361 "" "Go-http-client/1.1" Feb 1 05:00:54 localhost podman[236886]: @ - - [01/Feb/2026:10:00:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19316 "" "Go-http-client/1.1" Feb 1 05:00:54 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} : dispatch Feb 1 05:00:54 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:00:54 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:00:54 localhost ceph-mon[286721]: 
from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:00:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} v 0) Feb 1 05:00:54 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} : dispatch Feb 1 05:00:54 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"}]': finished Feb 1 05:00:54 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-397577304,client_metadata.root=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12],prefix=session evict} (starting...) Feb 1 05:00:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} v 0) Feb 1 05:00:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"}]': finished Feb 1 05:00:55 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2026360705,client_metadata.root=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464],prefix=session evict} (starting...) 
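The audit trail from 05:00:53 to 05:00:55 is the mgr provisioning and then removing a throwaway cephx user for a manila share: `auth get-or-create` with mds/osd/mon caps, an `auth get`, then `auth rm`. A minimal sketch of issuing the same monitor command through the `rados` Python binding (python3-rados); the conffile path and client identity are assumptions, and the command JSON is copied from the log entries above.

```python
import json
import rados  # python3-rados binding

# Assumed conffile and identity; the deployment's values will differ.
cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
cluster.connect()
try:
    # Command JSON copied from the audit entries above.
    cmd = {
        "prefix": "auth get-or-create",
        "entity": "client.tempest-cephx-id-397577304",
        "caps": [
            "mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12",
            "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0",
            "mon", "allow r",
        ],
        "format": "json",
    }
    ret, out, errs = cluster.mon_command(json.dumps(cmd), b"")
    print(ret, out.decode() if out else errs)
finally:
    cluster.shutdown()
```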
Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.333 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.333 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.334 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.334 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"}]': finished Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' 
entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} : dispatch Feb 1 05:00:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"}]': finished Feb 1 05:00:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e244 do_prune osdmap full prune enabled Feb 1 05:00:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e245 e245: 6 total, 6 up, 6 in Feb 1 05:00:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.830 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.854 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.855 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.855 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:55 localhost nova_compute[274651]: 2026-02-01 10:00:55.856 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e245 do_prune osdmap full prune enabled Feb 1 05:00:56 
localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e246 e246: 6 total, 6 up, 6 in Feb 1 05:00:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in Feb 1 05:00:56 localhost nova_compute[274651]: 2026-02-01 10:00:56.565 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:56 localhost nova_compute[274651]: 2026-02-01 10:00:56.658 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:00:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e246 do_prune osdmap full prune enabled Feb 1 05:00:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e247 e247: 6 total, 6 up, 6 in Feb 1 05:00:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in Feb 1 05:00:57 localhost nova_compute[274651]: 2026-02-01 10:00:57.864 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:58 localhost ovn_metadata_agent[158360]: 2026-02-01 10:00:58.053 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 05:00:58 localhost nova_compute[274651]: 2026-02-01 10:00:58.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0) Feb 1 05:00:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 1 05:00:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Feb 1 05:00:58 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e],prefix=session evict} (starting...) 
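The req-fc03cb38 entries above are one pass of nova-compute's periodic task loop: _heal_instance_info_cache takes the refresh_cache lock, forcefully refreshes the network info cache for instance 08d9c4c9, and the remaining tasks either run or are skipped on config. A sketch that tallies those outcomes from a capture; the "Running periodic task"/"skipping" phrasing is taken from the lines above, the summary shape is an assumption.

```python
import re
from collections import Counter

RUN_RE = re.compile(r"Running periodic task (ComputeManager\.\w+)")
SKIP_RE = re.compile(r"([\w.]+) <= 0, skipping")

def summarize_periodic(lines):
    """Count tasks that ran and config-gated skips, per the log phrasing."""
    ran, skipped = Counter(), Counter()
    for line in lines:
        if (m := RUN_RE.search(line)):
            ran[m.group(1)] += 1
        if (m := SKIP_RE.search(line)):
            skipped[m.group(1)] += 1  # e.g. CONF.reclaim_instance_interval
    return ran, skipped
```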
Feb 1 05:00:59 localhost nova_compute[274651]: 2026-02-01 10:00:59.169 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:00:59 localhost nova_compute[274651]: 2026-02-01 10:00:59.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:00:59 localhost nova_compute[274651]: 2026-02-01 10:00:59.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 05:00:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 1 05:00:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 1 05:00:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 1 05:00:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Feb 1 05:00:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e247 do_prune osdmap full prune enabled Feb 1 05:00:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e248 e248: 6 total, 6 up, 6 in Feb 1 05:00:59 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in Feb 1 05:00:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
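Throughout this window the monitor advances the osdmap one epoch at a time (e240 through e250 between 05:00:41 and 05:01:01), each `do_prune` followed by an epoch that still reports "6 total, 6 up, 6 in". A sketch that watches those lines and flags any epoch where up or in fall below total; the regex mirrors the `osdmap eNNN: X total, Y up, Z in` form above.

```python
import re

OSDMAP_RE = re.compile(r"osdmap e(?P<epoch>\d+): (?P<total>\d+) total, "
                       r"(?P<up>\d+) up, (?P<inn>\d+) in")

def degraded_epochs(lines):
    """Yield (epoch, total, up, in) whenever up or in drop below total."""
    for line in lines:
        m = OSDMAP_RE.search(line)
        if not m:
            continue
        epoch, total, up, inn = (int(m[g]) for g in ("epoch", "total", "up", "inn"))
        if up < total or inn < total:
            yield epoch, total, up, inn

# Every epoch in this capture reports 6/6/6, so this yields nothing here.
```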
Feb 1 05:00:59 localhost podman[326985]: 2026-02-01 10:00:59.738629537 +0000 UTC m=+0.098673765 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:00:59 localhost podman[326985]: 2026-02-01 10:00:59.748600535 +0000 UTC m=+0.108644773 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible) Feb 1 05:00:59 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 05:01:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:00 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:00 localhost nova_compute[274651]: 2026-02-01 10:01:00.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:01:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:00 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:01 localhost openstack_network_exporter[239441]: ERROR 10:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:01:01 localhost openstack_network_exporter[239441]: Feb 1 05:01:01 localhost openstack_network_exporter[239441]: ERROR 10:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:01:01 localhost openstack_network_exporter[239441]: Feb 1 05:01:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e248 do_prune osdmap full prune enabled Feb 1 05:01:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e249 e249: 6 total, 6 up, 6 in Feb 1 05:01:01 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in Feb 1 05:01:01 localhost nova_compute[274651]: 2026-02-01 10:01:01.657 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:01 localhost nova_compute[274651]: 2026-02-01 10:01:01.664 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e249 do_prune osdmap full prune enabled Feb 1 05:01:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e250 e250: 6 total, 6 up, 6 in Feb 1 05:01:01 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in Feb 1 05:01:02 localhost systemd[1]: tmp-crun.RC9LmN.mount: Deactivated successfully. 
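[Editor's sketch] The openstack_network_exporter errors above (call(dpif-netdev/pmd-perf-show) / call(dpif-netdev/pmd-rxq-show): "please specify an existing datapath") recur on this host because those appctl commands only apply to the userspace (netdev) datapath with PMD threads; a host running the kernel datapath has no such datapath to report on. A hedged diagnostic in Python is sketched below: it lists datapaths with ovs-appctl dpif/show before attempting PMD stats. This is a plausible check under that reading of the error, not a description of the exporter's internals, and it assumes ovs-appctl is on PATH with access to the ovs-vswitchd control socket on the same host.

# Hedged sketch: probe for a userspace datapath before asking for PMD stats.
import subprocess

def appctl(cmd: str) -> str:
    # Thin wrapper over ovs-appctl, raising on the kind of error seen in the log.
    out = subprocess.run(["ovs-appctl", cmd], capture_output=True, text=True)
    if out.returncode != 0:
        raise RuntimeError(out.stderr.strip())
    return out.stdout

datapaths = appctl("dpif/show")
if "netdev@" in datapaths:   # a userspace (netdev) datapath exists
    print(appctl("dpif-netdev/pmd-perf-show"))
else:
    print("no netdev datapath on this host; skipping PMD stats")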
Feb 1 05:01:02 localhost dnsmasq[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/addn_hosts - 0 addresses Feb 1 05:01:02 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/host Feb 1 05:01:02 localhost dnsmasq-dhcp[326889]: read /var/lib/neutron/dhcp/e3cfe08b-56b5-4442-afd3-57110a8d1cf9/opts Feb 1 05:01:02 localhost podman[327031]: 2026-02-01 10:01:02.100469761 +0000 UTC m=+0.068485597 container kill 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:01:02 localhost nova_compute[274651]: 2026-02-01 10:01:02.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:01:02 localhost nova_compute[274651]: 2026-02-01 10:01:02.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:01:02 localhost nova_compute[274651]: 2026-02-01 10:01:02.272 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 1 05:01:02 localhost nova_compute[274651]: 2026-02-01 10:01:02.290 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 1 05:01:02 localhost nova_compute[274651]: 2026-02-01 10:01:02.340 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:02 localhost ovn_controller[152492]: 2026-02-01T10:01:02Z|00482|binding|INFO|Releasing lport af29aadf-e35a-4249-9f4f-9187da8af7c4 from this chassis (sb_readonly=0) Feb 1 05:01:02 localhost kernel: device tapaf29aadf-e3 left promiscuous mode Feb 1 05:01:02 localhost ovn_controller[152492]: 2026-02-01T10:01:02Z|00483|binding|INFO|Setting lport af29aadf-e35a-4249-9f4f-9187da8af7c4 down in Southbound Feb 1 05:01:02 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:02.353 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-e3cfe08b-56b5-4442-afd3-57110a8d1cf9', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3cfe08b-56b5-4442-afd3-57110a8d1cf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3f80f1b0657846c89a7808fa81feb44c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=375fcfe7-f7ed-4152-bd4b-b3cfbaefb8a0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=af29aadf-e35a-4249-9f4f-9187da8af7c4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:01:02 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:02.356 158365 INFO neutron.agent.ovn.metadata.agent [-] Port af29aadf-e35a-4249-9f4f-9187da8af7c4 in datapath e3cfe08b-56b5-4442-afd3-57110a8d1cf9 unbound from our chassis#033[00m Feb 1 05:01:02 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:02.358 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3cfe08b-56b5-4442-afd3-57110a8d1cf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 05:01:02 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:02.360 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[54f9b132-1201-4e11-880d-023125136170]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 05:01:02 localhost nova_compute[274651]: 2026-02-01 10:01:02.369 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:02 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.289 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.315 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.315 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.316 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:01:03 
localhost nova_compute[274651]: 2026-02-01 10:01:03.316 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.317 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.530 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.562 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '789b5641-2643-4120-bc60-0b4a0f866d76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.531854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb211fec-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '837f5f8637c473e37dbcce893f355c5be6eca845867bb1c50265b550947ce17b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.531854', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb213b30-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '885eff13a6c9884cf79b2a8c4b1c6a2de071a4f09b07fd0568f9dd547d20d79b'}]}, 'timestamp': '2026-02-01 10:01:03.564726', '_unique_id': 'faf81c6a780c4f69ac262cac80a4f57f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.566 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.567 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.567 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.568 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '204125d1-f5b4-418c-bd86-7a18c75faeb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.567876', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb21ce4c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '39a449709be40fc4b838a5ae3a0625c6cb6bcd89299dd72e54e5b03fe571336b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.567876', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb21e260-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '2cf5dc31099b05a5aa4c078a578c93ff8b219181234ee4f0eec26e2af26163ba'}]}, 'timestamp': '2026-02-01 10:01:03.568979', '_unique_id': '44795a89747f44298b40ed906d63d39e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.570 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.571 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.577 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '30d4eec3-7fa1-478f-b3d5-f1255cf2d7d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.571399', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb235802-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': 'c7eefbba8c856095a76ebd81590957d6ce6ed3bf33dcb0b9ae050781979d0e70'}]}, 'timestamp': '2026-02-01 10:01:03.578588', '_unique_id': 'e89ee562879849f489d7e6a9bd768cbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:01:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.579 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '70f42868-b3e0-48aa-9b18-4dfc155c14f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.581078', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb23d098-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': 'f4bb6c909f726a05c78d414f1dea1c9747a4b7a5515a1061fb54fef9fdc94f26'}]}, 'timestamp': '2026-02-01 10:01:03.581696', '_unique_id': 'ea5d6c4c2f524b00885dd2b084249835'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '4a2c64ac-1cfa-4b6f-9d54-b281fc05464c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.584048', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb24449c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': '49b1a08604a78aa32c93c410f9eb4eda2a74182ec62a4542ccad56dbd41addb7'}]}, 'timestamp': '2026-02-01 10:01:03.584657', '_unique_id': '8caedab970664f4e8d35d960c11cdc10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.586 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.587 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '41628ea6-7d79-4989-af8e-8d88ef0d3334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.587219', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb269990-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.80670108, 'message_signature': '2482fe1cd52be6182c46b0fe460863f34a004d7006d82034d9c370a75bd86995'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.587219', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb26af2a-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.80670108, 'message_signature': '58e8e0e84aac50ed56b0e9da2cbf3f1f98936a39b1bb0f1e1dbc58a4d98034ec'}]}, 'timestamp': '2026-02-01 10:01:03.600444', '_unique_id': '815f979c228d4872ae31bdbb1c5ff11f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.602 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '4a4adcfd-7d77-4139-8682-e5dff400317f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.603101', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb272cc0-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': '18a0fa8b55323ef3d4371486b511eafd16bd4cb12c64b59e3c68a0fe28373a4f'}]}, 'timestamp': '2026-02-01 10:01:03.603727', '_unique_id': '771098e30dc74778a3aded85106389a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.606 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '356a12bc-8334-4d81-b690-fe5d0c317413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.606411', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb27b3e8-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': '9fbe8cc0a1063146f72980a86ff4c4d454f810d1c5c31666b967c18279dc389c'}]}, 'timestamp': '2026-02-01 10:01:03.607234', '_unique_id': '4bcb2b06d25249ebb84c8116d71102d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
    yield
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
    return fun(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
    self._connection = self._establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
    conn.connect()
  File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
    self.transport.connect()
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
    self._connect(self.host, self.port, self.connect_timeout)
  File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
    self.sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
    self.transport._send_notification(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
    self._driver.send_notification(target, ctxt, message, version,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
    return self._send(target, ctxt, message,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
    with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
    return rpc_common.ConnectionContext(self._connection_pool,
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
    self.connection = connection_pool.get(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
    return self.create(retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
    return self.connection_cls(self.conf, self.url, purpose, retry=retry)
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
    self.ensure_connection()
  File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
    self.connection.ensure_connection(
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
    self._ensure_connection(*args, **kwargs)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
    return retry_over_time(
  File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 17510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.609 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 17510000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e522e2a-c706-4a52-bc29-6e4cd1aab907', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17510000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:01:03.609861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'eb2c1faa-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.854747648, 'message_signature': 'a0698350a538aa4387dce11f9e220ef717fbe378627291136b7852a44ec7f17b'}]}, 'timestamp': '2026-02-01 10:01:03.636265', '_unique_id': '399e83e08f6a4604b2c9dedb95f16400'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.637 12 ERROR oslo_messaging.notify.messaging
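[note] Each failed publish logs the complete notification it was carrying, so the envelope structure is visible in the Payload dump above: an oslo.messaging notification (message_id, publisher_id 'ceilometer.polling', event_type 'telemetry.polling', priority 'SAMPLE') whose inner payload is a list of ceilometer samples, each carrying counter_name/counter_type/counter_unit/counter_volume plus resource and flavor metadata and a message_signature. A short sketch of reading those fields back out; the dict literal is abbreviated from the cpu record above, not a full reproduction:

    # Abbreviated from the logged cpu notification; only the fields used below.
    notification = {
        "event_type": "telemetry.polling",
        "payload": {"samples": [{
            "counter_name": "cpu",
            "counter_type": "cumulative",
            "counter_unit": "ns",
            "counter_volume": 17510000000,
            "resource_id": "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02",
        }]},
    }
    for s in notification["payload"]["samples"]:
        print(f"{s['counter_name']} ({s['counter_type']}, {s['counter_unit']}):"
              f" {s['counter_volume']} on {s['resource_id']}")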
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba04acd8-5a43-4119-a124-f6d5238b15e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.639115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb2cab64-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '1f96c2df633108bf3f6da5e030494ad986b2796d0b1fbd35ef1970b312fd36c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.639115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb2cbf32-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': 'ea162cb5a6181dc2c3fcc84b74e0794d44e6102f0d7be58bdb385aac33d911ec'}]}, 'timestamp': '2026-02-01 10:01:03.640213', '_unique_id': 'e3d36f8aff884292bdedca65255dcb5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[ConnectionRefusedError / kombu.exceptions.OperationalError traceback omitted: it repeats the 10:01:03.637 traceback above frame for frame, at timestamp 10:01:03.641]
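[note] The innermost frame in each of these tracebacks is a plain TCP connect (amqp/transport.py, self.sock.connect(sa)), so the broker endpoint can be checked independently of the telemetry stack. Errno 111 specifically means the connect was actively refused (host reachable, port closed), as opposed to a timeout, which would suggest a filtered or unreachable network. A standalone probe; host and port are assumptions here, since the configured endpoint does not appear in this excerpt:

    import socket

    def probe(host: str = "127.0.0.1", port: int = 5672, timeout: float = 3.0) -> str:
        """Classify the broker endpoint the same way the traceback does."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return "open"
        except ConnectionRefusedError:  # errno 111, what this log shows
            return "refused (nothing listening on the port)"
        except socket.timeout:
            return "timeout (filtered or unreachable)"

    print(probe())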
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2cb779c-0b76-48ba-9a53-064b9a5d93d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.642559', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb2d31a6-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': '1f4a772cf0e0b740bf04cba1b70dbf1e9050f33996710cf22d8a0dfe7f810c96'}]}, 'timestamp': '2026-02-01 10:01:03.643203', '_unique_id': '20397b5b44aa4a62a3b04cefe1577ef3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[ConnectionRefusedError / kombu.exceptions.OperationalError traceback omitted: it repeats the 10:01:03.637 traceback above frame for frame, at timestamp 10:01:03.644]
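[note] The retry_over_time frame (kombu/utils/functional.py, line 312) is where kombu briefly retries the connect before giving up and re-raising through _reraise_as_library_errors; once it raises, the polling manager simply carries on with the next pollster, which is why the log shows one self-contained traceback per meter rather than the agent crashing. A simplified stand-in for that retry loop, with assumed default arguments -- this is not kombu's actual implementation:

    import time

    def retry_over_time(fun, catch, max_retries=3, interval_start=2, interval_step=2):
        """Call fun(), sleeping between attempts; re-raise once the budget is spent."""
        interval = interval_start
        for attempt in range(max_retries + 1):
            try:
                return fun()
            except catch:
                if attempt == max_retries:
                    raise  # surfaces as kombu.exceptions.OperationalError in the log
                time.sleep(interval)
                interval += interval_step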
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.645 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.645 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.646 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3683c88c-4a0e-49a8-9410-845edc87527c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.645459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb2da29e-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.80670108, 'message_signature': 'aac9d0950e42d6c2163921c0799074f469c67b24026edb34b0837e59dea32348'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.645459', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb2db98c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.80670108, 'message_signature': '54641fcfb8bfef0b9c4c4684744811b0603c13a914f7a8734009c7d45409fc57'}]}, 'timestamp': '2026-02-01 10:01:03.646581', '_unique_id': '1de2803a9a53427f9442cf33c8a41442'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
[ConnectionRefusedError / kombu.exceptions.OperationalError traceback omitted: it repeats the 10:01:03.637 traceback above frame for frame, at timestamp 10:01:03.647]
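[note] Because notify() raises, every Payload dumped above is a batch of samples that was dropped for that polling cycle; during the outage the data survives only in this journal. A sketch for tallying which meters lost samples by scanning the log -- the file path and the one-record-per-line assumption are mine, not taken from this host:

    import re
    from collections import Counter

    # First counter_name mentioned after each failed-publish record.
    pat = re.compile(r"Could not send notification.*?'counter_name': '([^']+)'")
    tally = Counter()
    with open("/var/log/messages", errors="replace") as fh:  # assumed path
        for line in fh:
            tally.update(pat.findall(line))
    for meter, dropped in tally.most_common():
        print(f"{meter}: {dropped} failed publish(es)")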
Payload={'message_id': 'b8816869-edce-4036-8d25-24992f96e273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.648863', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb2e2980-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': 'ccefe142bc08145b13c781dd58923dcb4104201bf9a6bfb1416a972ba116ef0a'}]}, 'timestamp': '2026-02-01 10:01:03.649499', '_unique_id': '20ad9412cfb1454d996219256b1b60ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:01:03 
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.650 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.651 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.651 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ab8ef31-73db-49a4-b64a-c411d0d2fa99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.651859', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb2e9e6a-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': 'fa8df12a7be4b13083b7d98985eb181a08471f3a64b3746b3cca49940f74cf36'}]}, 'timestamp': '2026-02-01 10:01:03.652497', '_unique_id': '28242266c5944c568fc27e8278be72e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.653 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.654 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.655 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1af37f45-e381-4e54-b082-4966e4ecc9ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.654755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb2f0eea-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.80670108, 'message_signature': '55d38cec068fc1fff7a10be931ce023981a6d92d032f1d3127187c44c2c0d307'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.654755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb2f2358-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.80670108, 'message_signature': '215203fa642f274594a281a66fa565df459d86dd98df7bdb6f56c215d2391a02'}]}, 'timestamp': '2026-02-01 10:01:03.655835', '_unique_id': '2b751ea201784726b687ea57d52578fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.656 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.657 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.658 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd95a571d-0330-487c-b76b-7e4dbdadba54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.658152', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb2f93e2-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': '0092b0ae6530ab86365b089d70630b99453ddfa197a61607beb9ea0e19999750'}]}, 'timestamp': '2026-02-01 10:01:03.658789', '_unique_id': '9b35ebef0ef4411fbb65bc022f68aa33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.659 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.660 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.661 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.661 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbb6b60b-4ae0-46b4-b965-95df8c0c6c2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.661063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb300444-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': 'ea4b960d4cb4e46ce26d0b007518391680b2f0385f353f8bf2c3a014287efb58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.661063', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb3018b2-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '2cafeb6e8df1f26d5f54427627a8f13518617fa2cb1695ecafa6445eba8f0a89'}]}, 'timestamp': '2026-02-01 10:01:03.662176', '_unique_id': '2e6ceae79c6c4f08af942c1fceaf568d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.663 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.664 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.664 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39de6308-0101-487b-826a-85f6875be261', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:01:03.664492', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'eb308a36-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.790884914, 'message_signature': 'd06b9da1a671fa7ee410dacfb2a34a6eb29b1b20eca51c979e6aecd296e77389'}]}, 'timestamp': '2026-02-01 10:01:03.665143', '_unique_id': 'eb006926db494f8f87a19467ebccc3cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.666 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:01:03 localhost 
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.667 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.667 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.668 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1553a3d1-34ce-4265-a961-86ff27b8d024', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:01:03.667407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'eb30fb9c-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.854747648, 'message_signature': '76b3c61e0902e95d939861107734f842545e1eb001b83c5fef0979c959349842'}]}, 'timestamp': '2026-02-01 10:01:03.668058', '_unique_id': '11b08030e48745fbb0472157077df0e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.670 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.670 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.672 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '791d0da9-8508-43e2-a1ce-022c7aaf3f74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.670290', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb316c76-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '835203de7aef02a4c601ed24d5243963f65fbdc5a35365e8985e3e700bf0211e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.670290', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb318404-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '01697734ca19557d3dcd1a39252093bf2cb489a3d87c3952bafaa74eb7387ae3'}]}, 'timestamp': '2026-02-01 10:01:03.671420', '_unique_id': 'ed64e1f6d68047298c16596f372a769c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.673 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.673 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.673 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.674 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.675 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9df793d6-964c-4a1f-b4a0-aa816ce8ce0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:01:03.673909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'eb31fc86-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': 'e0fddde0b68cc45dd66eebc42718b0e6f590b5501cee419c635aa84b8ec2fd76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:01:03.673909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'eb320dc0-ff54-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12157.751316836, 'message_signature': '5c44c7ea84e4a0cc70f8e24e48374004a37814cc179f93c631c6819b6dcc3def'}]}, 'timestamp': '2026-02-01 10:01:03.674870', '_unique_id': 'b451ee6d089947e8a7a454c129b2cb39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:01:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:01:03.676 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:01:03 localhost ovn_controller[152492]: 2026-02-01T10:01:03Z|00484|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.761 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:01:03 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:01:03 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2319460251' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 05:01:03 localhost nova_compute[274651]: 2026-02-01 10:01:03.854 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.044 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.044 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.252 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.254 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11187MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.254 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.255 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.665 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.666 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 05:01:04 localhost nova_compute[274651]: 2026-02-01 10:01:04.666 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 05:01:04 localhost dnsmasq[326889]: exiting on receipt of SIGTERM
Feb 1 05:01:04 localhost podman[327094]: 2026-02-01 10:01:04.807327477 +0000 UTC m=+0.053393123 container kill 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 05:01:04 localhost systemd[1]: tmp-crun.vgVlCe.mount: Deactivated successfully.
Feb 1 05:01:04 localhost systemd[1]: libpod-0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081.scope: Deactivated successfully.
Feb 1 05:01:04 localhost podman[327107]: 2026-02-01 10:01:04.856908352 +0000 UTC m=+0.038111823 container died 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 05:01:04 localhost podman[327107]: 2026-02-01 10:01:04.923124478 +0000 UTC m=+0.104327969 container cleanup 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true) Feb 1 05:01:04 localhost systemd[1]: libpod-conmon-0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081.scope: Deactivated successfully. Feb 1 05:01:05 localhost podman[327110]: 2026-02-01 10:01:05.003616574 +0000 UTC m=+0.175334044 container remove 0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e3cfe08b-56b5-4442-afd3-57110a8d1cf9, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 05:01:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:05.059 259320 INFO neutron.agent.dhcp.agent [None req-15d6c0f1-8280-4ed9-adab-444f4154d7fa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 05:01:05 localhost nova_compute[274651]: 2026-02-01 10:01:05.217 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 1 05:01:05 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:05.250 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 05:01:05 localhost nova_compute[274651]: 2026-02-01 10:01:05.304 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 
'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 1 05:01:05 localhost nova_compute[274651]: 2026-02-01 10:01:05.305 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 1 05:01:05 localhost nova_compute[274651]: 2026-02-01 10:01:05.320 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 1 05:01:05 localhost nova_compute[274651]: 2026-02-01 10:01:05.355 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 1 05:01:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:01:05 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' 
entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:01:05 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:01:05 localhost nova_compute[274651]: 2026-02-01 10:01:05.535 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:01:05 localhost systemd[1]: var-lib-containers-storage-overlay-bde68ce32cde4edeb797b9689e3d471ba373e9a71eff705c4ee2ea88066b9419-merged.mount: Deactivated successfully. Feb 1 05:01:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a955357bc36ffce98ab7f772a14900ba8166262c6b681b7dd23449d6894e081-userdata-shm.mount: Deactivated successfully. Feb 1 05:01:05 localhost systemd[1]: run-netns-qdhcp\x2de3cfe08b\x2d56b5\x2d4442\x2dafd3\x2d57110a8d1cf9.mount: Deactivated successfully. Feb 1 05:01:06 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 1 05:01:06 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:01:06 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:01:06 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:01:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:01:06 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/4281456669' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.103 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.108 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.131 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.132 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.133 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.133 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.133 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 1 05:01:06 localhost nova_compute[274651]: 2026-02-01 10:01:06.708 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:07 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : 
from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 05:01:07 localhost systemd[1]: tmp-crun.g2ZLYM.mount: Deactivated successfully. Feb 1 05:01:07 localhost podman[327157]: 2026-02-01 10:01:07.744346411 +0000 UTC m=+0.099078218 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:01:07 localhost podman[327157]: 2026-02-01 10:01:07.781325128 +0000 UTC m=+0.136056925 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:01:07 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 05:01:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:08 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:08 localhost sshd[327179]: main: sshd: ssh-rsa algorithm is disabled Feb 1 05:01:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e250 do_prune osdmap full prune enabled Feb 1 05:01:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e251 e251: 6 total, 6 up, 6 in Feb 1 05:01:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in Feb 1 05:01:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
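The inventory nova keeps reporting unchanged for provider a04bda90-8ccd-4104-8518-038544ff1327 just above carries three resource classes: VCPU (total 8, allocation_ratio 16.0), MEMORY_MB (total 15738, reserved 512) and DISK_GB (total 41, reserved 1). Placement turns each into schedulable capacity as (total - reserved) * allocation_ratio; a plain-Python check with the logged values:

    # Schedulable capacity per resource class, from the inventory above:
    # capacity = (total - reserved) * allocation_ratio
    inventory = {
        "VCPU":      {"total": 8,     "reserved": 0,   "allocation_ratio": 16.0},
        "MEMORY_MB": {"total": 15738, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 41,    "reserved": 1,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        capacity = int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
        print(rc, capacity)
    # VCPU 128, MEMORY_MB 15226, DISK_GB 40: which is why the single
    # 1-vCPU / 512 MB / 2 GB instance in the 'Final resource view' leaves
    # this host far from saturated.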
Feb 1 05:01:09 localhost podman[327181]: 2026-02-01 10:01:09.737093512 +0000 UTC m=+0.092490846 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 05:01:09 localhost podman[327181]: 2026-02-01 10:01:09.772420819 +0000 UTC m=+0.127818143 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 05:01:09 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 05:01:10 localhost nova_compute[274651]: 2026-02-01 10:01:10.146 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:01:11 localhost nova_compute[274651]: 2026-02-01 10:01:11.710 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:11 localhost nova_compute[274651]: 2026-02-01 10:01:11.712 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:11 localhost nova_compute[274651]: 2026-02-01 10:01:11.713 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:01:11 localhost nova_compute[274651]: 2026-02-01 10:01:11.713 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:11 localhost nova_compute[274651]: 2026-02-01 10:01:11.750 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:11 localhost nova_compute[274651]: 2026-02-01 10:01:11.751 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:01:12 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4022670018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:01:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:01:12 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4022670018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:01:12 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 1 05:01:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
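The ovsdbapp lines above trace the OVS reconnect state machine on tcp:127.0.0.1:6640: after roughly 5000 ms with no traffic it sends an inactivity probe and enters IDLE, and the POLLIN wakeup for the reply puts it back in ACTIVE. The following is a simplified pure-Python illustration of that probe cycle, not the ovs library's actual implementation:

    import time

    PROBE_INTERVAL = 5.0  # seconds; the log shows ~5003 ms of idle time

    class Connection:
        """Toy model of the ACTIVE -> IDLE -> ACTIVE cycle logged above."""

        def __init__(self):
            self.state = "ACTIVE"
            self.last_activity = time.monotonic()

        def on_pollin(self):
            # Any received data (the [POLLIN] wakeups above) is activity.
            self.last_activity = time.monotonic()
            self.state = "ACTIVE"

        def tick(self):
            idle = time.monotonic() - self.last_activity
            if self.state == "ACTIVE" and idle >= PROBE_INTERVAL:
                self.state = "IDLE"       # send an echo probe here
            elif self.state == "IDLE" and idle >= 2 * PROBE_INTERVAL:
                self.state = "RECONNECT"  # probe unanswered: drop and redial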
Feb 1 05:01:13 localhost podman[327199]: 2026-02-01 10:01:13.721301246 +0000 UTC m=+0.080512078 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 05:01:13 localhost podman[327199]: 2026-02-01 10:01:13.728343082 +0000 UTC m=+0.087553934 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 05:01:13 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
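The node_exporter container above is started with '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service', so its systemd collector only exports the deployment's own units. A quick check of what that pattern admits (regex copied from the log; the sample unit names are illustrative):

    import re

    # Pattern copied from the node_exporter config_data above; node_exporter
    # anchors unit-include patterns, which fullmatch() approximates here.
    pattern = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")

    for unit in ("ovsdb-server.service", "virtqemud.service",
                 "rsyslog.service", "sshd.service"):
        print(unit, bool(pattern.fullmatch(unit)))
    # sshd.service is excluded; the first three are collected.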
Feb 1 05:01:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e251 do_prune osdmap full prune enabled Feb 1 05:01:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e252 e252: 6 total, 6 up, 6 in Feb 1 05:01:14 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in Feb 1 05:01:15 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=david,client_metadata.root=/volumes/_nogroup/c1ec6001-c4f0-42e8-a3ae-66c185a36061/f886650d-b51c-4718-ba32-df2300a26036],prefix=session evict} (starting...) Feb 1 05:01:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 05:01:16 localhost podman[327222]: 2026-02-01 10:01:16.726116865 +0000 UTC m=+0.089446672 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 05:01:16 localhost nova_compute[274651]: 2026-02-01 10:01:16.749 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:16 localhost nova_compute[274651]: 2026-02-01 10:01:16.752 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:16 localhost podman[327222]: 2026-02-01 10:01:16.771547073 +0000 UTC m=+0.134876880 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 
'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3) Feb 1 05:01:16 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 05:01:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e252 do_prune osdmap full prune enabled Feb 1 05:01:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e253 e253: 6 total, 6 up, 6 in Feb 1 05:01:16 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in Feb 1 05:01:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0) Feb 1 05:01:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 1 05:01:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Feb 1 05:01:18 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=david,client_metadata.root=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0],prefix=session evict} (starting...) 
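The 'auth get-or-create' / 'auth rm' mon_commands for client.david above, followed by the MDS 'session evict' asok commands, are the Ceph side of a manila CephFS share access rule being granted and then revoked: the key is deleted and any live client session under the share path is kicked. A sketch of issuing the same revoke through the librados Python binding, assuming python3-rados is installed and the caller's keyring may manage auth entities:

    import json
    import rados

    # Connects as client.admin by default; the conffile matches the path
    # used by the services in this log.
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
    cluster.connect()
    try:
        cmd = json.dumps({"prefix": "auth rm", "entity": "client.david"})
        ret, outbuf, outs = cluster.mon_command(cmd, b"")
        print(ret, outs)  # ret == 0 matches the "finished" audit line above
    finally:
        cluster.shutdown()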
Feb 1 05:01:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:18 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e253 do_prune osdmap full prune enabled Feb 1 05:01:19 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 1 05:01:19 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 1 05:01:19 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 1 05:01:19 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Feb 1 05:01:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e254 e254: 6 total, 6 up, 6 in Feb 1 05:01:19 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in Feb 1 05:01:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e254 do_prune osdmap full prune enabled Feb 1 05:01:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e255 e255: 6 total, 6 up, 6 in Feb 1 05:01:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in Feb 1 05:01:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 05:01:21 localhost systemd[1]: tmp-crun.aLydvb.mount: Deactivated successfully. Feb 1 05:01:21 localhost podman[327247]: 2026-02-01 10:01:21.74104642 +0000 UTC m=+0.097402246 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, release=1769056855, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal) Feb 1 05:01:21 localhost podman[327247]: 2026-02-01 10:01:21.754464863 +0000 UTC m=+0.110820689 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1769056855, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible) Feb 1 05:01:21 localhost nova_compute[274651]: 2026-02-01 10:01:21.754 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:21 localhost nova_compute[274651]: 2026-02-01 10:01:21.756 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:21 localhost nova_compute[274651]: 2026-02-01 10:01:21.757 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:01:21 localhost nova_compute[274651]: 2026-02-01 10:01:21.757 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:21 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
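The recurring 'df' and 'osd pool get-quota' dispatches from client.openstack through this window are capacity polls against the volumes pool; nova runs the same kind of poll as the 'ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf' subprocess logged earlier. A minimal sketch of that poll, assuming the ceph CLI and the client.openstack keyring are available on the host:

    import json
    import subprocess

    def ceph(*args):
        # Mirrors the command line nova logs via oslo_concurrency above.
        out = subprocess.check_output(
            ["ceph", *args, "--format=json",
             "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"])
        return json.loads(out)

    df = ceph("df")
    quota = ceph("osd", "pool", "get-quota", "volumes")
    print(df["stats"]["total_avail_bytes"], quota.get("quota_max_bytes"))
    # quota_max_bytes of 0 means no quota is set on the pool.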
Feb 1 05:01:21 localhost nova_compute[274651]: 2026-02-01 10:01:21.786 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:21 localhost nova_compute[274651]: 2026-02-01 10:01:21.787 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e255 do_prune osdmap full prune enabled Feb 1 05:01:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e256 e256: 6 total, 6 up, 6 in Feb 1 05:01:21 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in Feb 1 05:01:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e256 do_prune osdmap full prune enabled Feb 1 05:01:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e257 e257: 6 total, 6 up, 6 in Feb 1 05:01:22 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in Feb 1 05:01:23 localhost sshd[327267]: main: sshd: ssh-rsa algorithm is disabled Feb 1 05:01:23 localhost podman[236886]: time="2026-02-01T10:01:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:01:23 localhost podman[236886]: @ - - [01/Feb/2026:10:01:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 05:01:24 localhost podman[236886]: @ - - [01/Feb/2026:10:01:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18849 "" "Go-http-client/1.1" Feb 1 05:01:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e257 do_prune osdmap full prune enabled Feb 1 05:01:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e258 e258: 6 total, 6 up, 6 in Feb 1 05:01:25 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e258: 6 total, 6 up, 6 in Feb 1 05:01:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:01:25 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4212286376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:01:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:01:25 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/4212286376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:01:26 localhost nova_compute[274651]: 2026-02-01 10:01:26.788 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:26 localhost nova_compute[274651]: 2026-02-01 10:01:26.790 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:26 localhost nova_compute[274651]: 2026-02-01 10:01:26.791 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:01:26 localhost nova_compute[274651]: 2026-02-01 10:01:26.791 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:26 localhost nova_compute[274651]: 2026-02-01 10:01:26.815 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:26 localhost nova_compute[274651]: 2026-02-01 10:01:26.816 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e258 do_prune osdmap full prune enabled Feb 1 05:01:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e259 e259: 6 total, 6 up, 6 in Feb 1 05:01:26 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in Feb 1 05:01:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:28 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:01:28 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/663981012' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:01:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:01:28 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/663981012' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:01:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e259 do_prune osdmap full prune enabled Feb 1 05:01:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e260 e260: 6 total, 6 up, 6 in Feb 1 05:01:29 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in Feb 1 05:01:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. 
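Each 'Started /usr/bin/podman healthcheck run <id>' line in this log is a transient systemd unit executing a container's configured health check; the matching 'health_status=healthy' and 'exec_died' podman events, then 'Deactivated successfully', close the cycle. The same check can be run by hand; a sketch assuming podman is on PATH, with the container name taken from the log:

    import subprocess

    def healthy(container: str) -> bool:
        # "podman healthcheck run" exits 0 when the check passes and
        # non-zero otherwise, which is what the transient units report.
        result = subprocess.run(
            ["podman", "healthcheck", "run", container],
            capture_output=True, text=True)
        return result.returncode == 0

    print("healthy" if healthy("ceilometer_agent_compute") else "unhealthy")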
Feb 1 05:01:30 localhost systemd[1]: tmp-crun.DNl4Eu.mount: Deactivated successfully. Feb 1 05:01:30 localhost podman[327269]: 2026-02-01 10:01:30.749630789 +0000 UTC m=+0.104063061 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true) Feb 1 05:01:30 localhost podman[327269]: 2026-02-01 10:01:30.763410253 +0000 UTC m=+0.117842585 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 05:01:30 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 05:01:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:01:30 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/461966862' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:01:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:01:30 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/461966862' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:01:31 localhost openstack_network_exporter[239441]: ERROR 10:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:01:31 localhost openstack_network_exporter[239441]: Feb 1 05:01:31 localhost openstack_network_exporter[239441]: ERROR 10:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:01:31 localhost openstack_network_exporter[239441]: Feb 1 05:01:31 localhost nova_compute[274651]: 2026-02-01 10:01:31.818 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:31 localhost nova_compute[274651]: 2026-02-01 10:01:31.820 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:31 localhost nova_compute[274651]: 2026-02-01 10:01:31.821 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:01:31 localhost nova_compute[274651]: 2026-02-01 10:01:31.821 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:31 localhost nova_compute[274651]: 2026-02-01 10:01:31.855 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:31 localhost nova_compute[274651]: 2026-02-01 10:01:31.856 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e260 do_prune osdmap full prune enabled Feb 1 05:01:31 localhost ceph-mon[286721]: 
mon.np0005604212@0(leader).osd e261 e261: 6 total, 6 up, 6 in Feb 1 05:01:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in Feb 1 05:01:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e261 do_prune osdmap full prune enabled Feb 1 05:01:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e262 e262: 6 total, 6 up, 6 in Feb 1 05:01:32 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in Feb 1 05:01:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e262 do_prune osdmap full prune enabled Feb 1 05:01:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e263 e263: 6 total, 6 up, 6 in Feb 1 05:01:33 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e263: 6 total, 6 up, 6 in Feb 1 05:01:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:01:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/212262338' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:01:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:01:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/212262338' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:01:35 localhost ovn_controller[152492]: 2026-02-01T10:01:35Z|00485|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Feb 1 05:01:36 localhost nova_compute[274651]: 2026-02-01 10:01:36.857 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:36 localhost nova_compute[274651]: 2026-02-01 10:01:36.859 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:36 localhost nova_compute[274651]: 2026-02-01 10:01:36.859 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:01:36 localhost nova_compute[274651]: 2026-02-01 10:01:36.859 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:36 localhost nova_compute[274651]: 2026-02-01 10:01:36.883 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:36 localhost nova_compute[274651]: 2026-02-01 10:01:36.884 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e263 do_prune osdmap full prune enabled Feb 1 05:01:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e264 e264: 6 total, 6 up, 6 in Feb 1 05:01:36 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e264: 6 total, 6 up, 6 in Feb 1 
05:01:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:01:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3606390078' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:01:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:01:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3606390078' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:01:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e264 do_prune osdmap full prune enabled Feb 1 05:01:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e265 e265: 6 total, 6 up, 6 in Feb 1 05:01:37 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e265: 6 total, 6 up, 6 in Feb 1 05:01:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 05:01:38 localhost podman[327288]: 2026-02-01 10:01:38.724618078 +0000 UTC m=+0.082967943 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:01:38 localhost podman[327288]: 2026-02-01 10:01:38.759358146 +0000 UTC m=+0.117707931 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:01:38 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 05:01:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e265 do_prune osdmap full prune enabled Feb 1 05:01:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e266 e266: 6 total, 6 up, 6 in Feb 1 05:01:38 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e266: 6 total, 6 up, 6 in Feb 1 05:01:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e266 do_prune osdmap full prune enabled Feb 1 05:01:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e267 e267: 6 total, 6 up, 6 in Feb 1 05:01:39 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e267: 6 total, 6 up, 6 in Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0. Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.153724) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100153778, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2399, "num_deletes": 265, "total_data_size": 2527870, "memory_usage": 2583792, "flush_reason": "Manual Compaction"} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100169171, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2461212, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34568, "largest_seqno": 36966, "table_properties": {"data_size": 2450128, "index_size": 7013, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 27370, "raw_average_key_size": 22, "raw_value_size": 2426588, "raw_average_value_size": 2018, "num_data_blocks": 299, "num_entries": 1202, "num_filter_entries": 1202, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939991, "oldest_key_time": 1769939991, "file_creation_time": 1769940100, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 65, "seqno_to_time_mapping": 
"N/A"}} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 15516 microseconds, and 7209 cpu microseconds. Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.169234) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2461212 bytes OK Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.169264) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.171278) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.171303) EVENT_LOG_v1 {"time_micros": 1769940100171295, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.171327) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 2516908, prev total WAL file size 2517203, number of live WAL files 2. Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.172252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. 
'7061786F73003132353531' seq:0, type:0; will stop at (end) Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2403KB)], [63(20MB)] Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100172307, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 23916803, "oldest_snapshot_seqno": -1} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 13771 keys, 22547087 bytes, temperature: kUnknown Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100276839, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 22547087, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22465545, "index_size": 46049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34437, "raw_key_size": 370007, "raw_average_key_size": 26, "raw_value_size": 22228213, "raw_average_value_size": 1614, "num_data_blocks": 1725, "num_entries": 13771, "num_filter_entries": 13771, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769940100, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.277216) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 22547087 bytes Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.279076) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 228.6 rd, 215.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 20.5 +0.0 blob) out(21.5 +0.0 blob), read-write-amplify(18.9) write-amplify(9.2) OK, records in: 14313, records dropped: 542 output_compression: NoCompression Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.279105) EVENT_LOG_v1 {"time_micros": 1769940100279092, "job": 38, "event": "compaction_finished", "compaction_time_micros": 104615, "compaction_time_cpu_micros": 54515, "output_level": 6, "num_output_files": 1, "total_output_size": 22547087, "num_input_records": 14313, "num_output_records": 13771, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100279561, "job": 38, "event": "table_file_deletion", "file_number": 65} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100282898, "job": 38, "event": "table_file_deletion", "file_number": 63} Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.172122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.283014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.283020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.283022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.283024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:01:40 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:01:40.283026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:01:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
Feb 1 05:01:40 localhost podman[327311]: 2026-02-01 10:01:40.696827418 +0000 UTC m=+0.056136878 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 1 05:01:40 localhost podman[327311]: 2026-02-01 10:01:40.729259075 +0000 UTC m=+0.088568515 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127) Feb 1 05:01:40 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 05:01:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:41.723 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:01:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:41.723 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:01:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:41.724 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:01:41 localhost nova_compute[274651]: 2026-02-01 10:01:41.885 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:41 localhost nova_compute[274651]: 2026-02-01 10:01:41.887 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:01:41 localhost nova_compute[274651]: 2026-02-01 10:01:41.887 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:01:41 localhost nova_compute[274651]: 2026-02-01 10:01:41.888 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e267 do_prune osdmap full prune enabled Feb 1 05:01:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e268 e268: 6 total, 6 up, 6 in Feb 1 05:01:41 localhost nova_compute[274651]: 2026-02-01 10:01:41.916 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:41 localhost nova_compute[274651]: 2026-02-01 10:01:41.917 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:01:41 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e268: 6 total, 6 up, 6 in Feb 1 05:01:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e268 do_prune osdmap full prune enabled Feb 1 05:01:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e269 e269: 6 total, 6 up, 6 in Feb 1 05:01:42 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e269: 6 total, 6 up, 6 in Feb 1 05:01:43 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:43.644 259320 INFO neutron.agent.linux.ip_lib [None req-ead0879e-8f73-4a1d-a695-30e14604ad25 - - - - - -] Device tapde3cb694-ac cannot be used as it has no MAC address#033[00m Feb 1 05:01:43 localhost nova_compute[274651]: 2026-02-01 10:01:43.670 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:43 localhost kernel: device tapde3cb694-ac entered promiscuous mode Feb 1 05:01:43 localhost nova_compute[274651]: 2026-02-01 10:01:43.679 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:43 localhost ovn_controller[152492]: 2026-02-01T10:01:43Z|00486|binding|INFO|Claiming lport de3cb694-acf4-483f-aae0-7fd2405ea0d5 for this chassis. Feb 1 05:01:43 localhost ovn_controller[152492]: 2026-02-01T10:01:43Z|00487|binding|INFO|de3cb694-acf4-483f-aae0-7fd2405ea0d5: Claiming unknown Feb 1 05:01:43 localhost NetworkManager[5964]: [1769940103.6827] manager: (tapde3cb694-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/80) Feb 1 05:01:43 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:43.689 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ac827ffb-bff9-4af3-81fa-58c97ca0d85e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac827ffb-bff9-4af3-81fa-58c97ca0d85e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99be6110dc5844a985bf9d57e5e9ba74', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d759574c-646c-47ca-8e56-0f2188cd82b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=de3cb694-acf4-483f-aae0-7fd2405ea0d5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:01:43 localhost systemd-udevd[327339]: Network interface NamePolicy= disabled on kernel command line. 
Feb 1 05:01:43 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:43.691 158365 INFO neutron.agent.ovn.metadata.agent [-] Port de3cb694-acf4-483f-aae0-7fd2405ea0d5 in datapath ac827ffb-bff9-4af3-81fa-58c97ca0d85e bound to our chassis#033[00m Feb 1 05:01:43 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:43.694 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 62d55a0f-5e96-4ed5-bba0-36c4a4aae16b IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 05:01:43 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:43.695 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac827ffb-bff9-4af3-81fa-58c97ca0d85e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 05:01:43 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:43.696 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[9496fe46-3f49-4001-88b5-46a44ac09ae4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost ovn_controller[152492]: 2026-02-01T10:01:43Z|00488|binding|INFO|Setting lport de3cb694-acf4-483f-aae0-7fd2405ea0d5 ovn-installed in OVS Feb 1 05:01:43 localhost ovn_controller[152492]: 2026-02-01T10:01:43Z|00489|binding|INFO|Setting lport de3cb694-acf4-483f-aae0-7fd2405ea0d5 up in Southbound Feb 1 05:01:43 localhost nova_compute[274651]: 2026-02-01 10:01:43.724 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost journal[217584]: ethtool ioctl error on tapde3cb694-ac: No such device Feb 1 05:01:43 localhost nova_compute[274651]: 2026-02-01 10:01:43.764 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:43 localhost nova_compute[274651]: 2026-02-01 10:01:43.794 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
Feb 1 05:01:44 localhost podman[327416]: 2026-02-01 10:01:44.759210895 +0000 UTC m=+0.152993946 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 05:01:44 localhost podman[327416]: 2026-02-01 10:01:44.769287935 +0000 UTC m=+0.163070966 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 05:01:44 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
Feb 1 05:01:44 localhost podman[327455]: Feb 1 05:01:44 localhost podman[327455]: 2026-02-01 10:01:44.794112739 +0000 UTC m=+0.097329075 container create 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 05:01:44 localhost podman[327455]: 2026-02-01 10:01:44.745187574 +0000 UTC m=+0.048403930 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 05:01:44 localhost systemd[1]: Started libpod-conmon-4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7.scope. Feb 1 05:01:44 localhost systemd[1]: tmp-crun.bswvOz.mount: Deactivated successfully. Feb 1 05:01:44 localhost systemd[1]: Started libcrun container. Feb 1 05:01:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f4eced30c9e701c86edad4fae535f89ac8a2dd2f73fdaa558b2f753d4c695e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 05:01:44 localhost podman[327455]: 2026-02-01 10:01:44.893449644 +0000 UTC m=+0.196665980 container init 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 05:01:44 localhost podman[327455]: 2026-02-01 10:01:44.903674768 +0000 UTC m=+0.206891094 container start 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 1 05:01:44 localhost dnsmasq[327485]: started, version 2.85 cachesize 150 Feb 1 05:01:44 localhost dnsmasq[327485]: DNS service limited to local subnets Feb 1 05:01:44 localhost dnsmasq[327485]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 05:01:44 localhost dnsmasq[327485]: warning: no upstream servers configured Feb 1 05:01:44 localhost dnsmasq-dhcp[327485]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 05:01:44 localhost dnsmasq[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/addn_hosts - 0 addresses Feb 1 05:01:44 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/host Feb 1 05:01:44 localhost 
dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/opts Feb 1 05:01:44 localhost ovn_controller[152492]: 2026-02-01T10:01:44Z|00490|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 05:01:44 localhost nova_compute[274651]: 2026-02-01 10:01:44.975 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:45 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:45.133 259320 INFO neutron.agent.dhcp.agent [None req-75b4975c-c682-443d-8d09-d6f28064ab37 - - - - - -] DHCP configuration for ports {'b811ef52-2aaf-44b7-b92b-e457c9d2dfe5'} is completed#033[00m Feb 1 05:01:45 localhost nova_compute[274651]: 2026-02-01 10:01:45.274 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 05:01:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:01:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:46.022 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:01:45Z, description=, device_id=3177b873-e444-4c78-ba71-bb19476b9552, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bc4a5675-344d-42aa-8e86-bf28912c2e63, ip_allocation=immediate, mac_address=fa:16:3e:5a:a6:2c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:01:40Z, description=, dns_domain=, id=ac827ffb-bff9-4af3-81fa-58c97ca0d85e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1446371679-network, port_security_enabled=True, project_id=99be6110dc5844a985bf9d57e5e9ba74, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19859, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3409, status=ACTIVE, subnets=['c2f3328e-de5b-4864-ba67-79725a249c80'], tags=[], tenant_id=99be6110dc5844a985bf9d57e5e9ba74, updated_at=2026-02-01T10:01:41Z, vlan_transparent=None, network_id=ac827ffb-bff9-4af3-81fa-58c97ca0d85e, port_security_enabled=False, project_id=99be6110dc5844a985bf9d57e5e9ba74, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3432, status=DOWN, tags=[], tenant_id=99be6110dc5844a985bf9d57e5e9ba74, updated_at=2026-02-01T10:01:45Z on network ac827ffb-bff9-4af3-81fa-58c97ca0d85e#033[00m Feb 1 05:01:46 localhost dnsmasq[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/addn_hosts - 1 addresses Feb 1 05:01:46 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/host Feb 1 05:01:46 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/opts Feb 1 05:01:46 localhost podman[327551]: 2026-02-01 10:01:46.249236824 +0000 UTC m=+0.058519181 container kill 
4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:01:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:46.487 259320 INFO neutron.agent.dhcp.agent [None req-6c9845cc-1022-4de9-bea4-9261e047f895 - - - - - -] DHCP configuration for ports {'bc4a5675-344d-42aa-8e86-bf28912c2e63'} is completed#033[00m Feb 1 05:01:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 05:01:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:01:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 05:01:46 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:01:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:46.801 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:01:45Z, description=, device_id=3177b873-e444-4c78-ba71-bb19476b9552, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bc4a5675-344d-42aa-8e86-bf28912c2e63, ip_allocation=immediate, mac_address=fa:16:3e:5a:a6:2c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:01:40Z, description=, dns_domain=, id=ac827ffb-bff9-4af3-81fa-58c97ca0d85e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1446371679-network, port_security_enabled=True, project_id=99be6110dc5844a985bf9d57e5e9ba74, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=19859, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3409, status=ACTIVE, subnets=['c2f3328e-de5b-4864-ba67-79725a249c80'], tags=[], tenant_id=99be6110dc5844a985bf9d57e5e9ba74, updated_at=2026-02-01T10:01:41Z, vlan_transparent=None, network_id=ac827ffb-bff9-4af3-81fa-58c97ca0d85e, port_security_enabled=False, project_id=99be6110dc5844a985bf9d57e5e9ba74, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3432, status=DOWN, tags=[], tenant_id=99be6110dc5844a985bf9d57e5e9ba74, updated_at=2026-02-01T10:01:45Z on network ac827ffb-bff9-4af3-81fa-58c97ca0d85e#033[00m Feb 1 05:01:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e269 do_prune osdmap full prune enabled Feb 1 05:01:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e270 
e270: 6 total, 6 up, 6 in Feb 1 05:01:46 localhost nova_compute[274651]: 2026-02-01 10:01:46.951 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:46 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e270: 6 total, 6 up, 6 in Feb 1 05:01:47 localhost dnsmasq[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/addn_hosts - 1 addresses Feb 1 05:01:47 localhost podman[327587]: 2026-02-01 10:01:47.019263378 +0000 UTC m=+0.049810733 container kill 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true) Feb 1 05:01:47 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/host Feb 1 05:01:47 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/opts Feb 1 05:01:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 05:01:47 localhost systemd[1]: tmp-crun.A1ydYI.mount: Deactivated successfully. Feb 1 05:01:47 localhost podman[327602]: 2026-02-01 10:01:47.128087935 +0000 UTC m=+0.078743642 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 05:01:47 localhost podman[327602]: 2026-02-01 10:01:47.160147811 +0000 UTC m=+0.110803598 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller) Feb 1 05:01:47 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 05:01:47 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:01:47.258 259320 INFO neutron.agent.dhcp.agent [None req-3b1bcece-eff7-48ed-b205-6635ed40bbb0 - - - - - -] DHCP configuration for ports {'bc4a5675-344d-42aa-8e86-bf28912c2e63'} is completed#033[00m Feb 1 05:01:47 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:01:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e270 do_prune osdmap full prune enabled Feb 1 05:01:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e271 e271: 6 total, 6 up, 6 in Feb 1 05:01:47 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e271: 6 total, 6 up, 6 in Feb 1 05:01:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:48 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:01:48 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:01:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e271 do_prune osdmap full prune enabled Feb 1 05:01:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e272 e272: 6 total, 6 up, 6 in Feb 1 05:01:50 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e272: 6 total, 6 up, 6 in Feb 1 05:01:50 localhost nova_compute[274651]: 2026-02-01 10:01:50.120 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:50 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:50.120 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 
'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:01:50 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:50.122 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 05:01:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:01:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e272 do_prune osdmap full prune enabled Feb 1 05:01:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e273 e273: 6 total, 6 up, 6 in Feb 1 05:01:51 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e273: 6 total, 6 up, 6 in Feb 1 05:01:51 localhost nova_compute[274651]: 2026-02-01 10:01:51.987 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:52 localhost nova_compute[274651]: 2026-02-01 10:01:52.603 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:01:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 05:01:52 localhost systemd[1]: tmp-crun.XhGmn4.mount: Deactivated successfully. Feb 1 05:01:52 localhost podman[327634]: 2026-02-01 10:01:52.714041514 +0000 UTC m=+0.073227634 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 1 05:01:52 localhost podman[327634]: 2026-02-01 10:01:52.730683135 +0000 UTC m=+0.089869305 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 05:01:52 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 05:01:53 localhost podman[236886]: time="2026-02-01T10:01:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:01:53 localhost podman[236886]: @ - - [01/Feb/2026:10:01:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158361 "" "Go-http-client/1.1"
Feb 1 05:01:54 localhost podman[236886]: @ - - [01/Feb/2026:10:01:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19321 "" "Go-http-client/1.1"
Feb 1 05:01:55 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:55.124 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 1 05:01:55 localhost nova_compute[274651]: 2026-02-01 10:01:55.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:01:55 localhost nova_compute[274651]: 2026-02-01 10:01:55.503 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:01:55 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e61: np0005604215.uhhqtv(active, since 12m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.747 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.748 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.748 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.749 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 05:01:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:01:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e273 do_prune osdmap full prune enabled
Feb 1 05:01:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 e274: 6 total, 6 up, 6 in
Feb 1 05:01:56 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e274: 6 total, 6 up, 6 in
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.989 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:01:56 localhost nova_compute[274651]: 2026-02-01 10:01:56.992 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:01:57 localhost nova_compute[274651]: 2026-02-01 10:01:57.349 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 05:01:57 localhost nova_compute[274651]: 2026-02-01 10:01:57.375 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 05:01:57 localhost nova_compute[274651]: 2026-02-01 10:01:57.376 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 05:01:59 localhost nova_compute[274651]: 2026-02-01 10:01:59.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:01:59 localhost nova_compute[274651]: 2026-02-01 10:01:59.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:01:59 localhost nova_compute[274651]: 2026-02-01 10:01:59.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 05:01:59 localhost podman[327670]: 2026-02-01 10:01:59.657123183 +0000 UTC m=+0.053462445 container kill 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 05:01:59 localhost dnsmasq[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/addn_hosts - 0 addresses
Feb 1 05:01:59 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/host
Feb 1 05:01:59 localhost dnsmasq-dhcp[327485]: read /var/lib/neutron/dhcp/ac827ffb-bff9-4af3-81fa-58c97ca0d85e/opts
Feb 1 05:01:59 localhost ovn_controller[152492]: 2026-02-01T10:01:59Z|00491|binding|INFO|Releasing lport de3cb694-acf4-483f-aae0-7fd2405ea0d5 from this chassis (sb_readonly=0)
Feb 1 05:01:59 localhost ovn_controller[152492]: 2026-02-01T10:01:59Z|00492|binding|INFO|Setting lport de3cb694-acf4-483f-aae0-7fd2405ea0d5 down in Southbound
Feb 1 05:01:59 localhost nova_compute[274651]: 2026-02-01 10:01:59.918 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:01:59 localhost kernel: device tapde3cb694-ac left promiscuous mode
Feb 1 05:01:59 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:59.927 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-ac827ffb-bff9-4af3-81fa-58c97ca0d85e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac827ffb-bff9-4af3-81fa-58c97ca0d85e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99be6110dc5844a985bf9d57e5e9ba74', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d759574c-646c-47ca-8e56-0f2188cd82b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=de3cb694-acf4-483f-aae0-7fd2405ea0d5) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 1 05:01:59 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:59.930 158365 INFO neutron.agent.ovn.metadata.agent [-] Port de3cb694-acf4-483f-aae0-7fd2405ea0d5 in datapath ac827ffb-bff9-4af3-81fa-58c97ca0d85e unbound from our chassis
Feb 1 05:01:59 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:59.934 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac827ffb-bff9-4af3-81fa-58c97ca0d85e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 1 05:01:59 localhost ovn_metadata_agent[158360]: 2026-02-01 10:01:59.935 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[e002c339-0ccd-422c-bcf2-4451841633ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 1 05:01:59 localhost nova_compute[274651]: 2026-02-01 10:01:59.945 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:00 localhost nova_compute[274651]: 2026-02-01 10:02:00.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:02:00 localhost nova_compute[274651]: 2026-02-01 10:02:00.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:02:01 localhost ovn_controller[152492]: 2026-02-01T10:02:01Z|00493|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 05:02:01 localhost nova_compute[274651]: 2026-02-01 10:02:01.275 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:01 localhost openstack_network_exporter[239441]: ERROR 10:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:02:01 localhost openstack_network_exporter[239441]:
Feb 1 05:02:01 localhost openstack_network_exporter[239441]: ERROR 10:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:02:01 localhost openstack_network_exporter[239441]:
Feb 1 05:02:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 05:02:01 localhost systemd[1]: tmp-crun.LkOd2N.mount: Deactivated successfully.
Feb 1 05:02:01 localhost podman[327692]: 2026-02-01 10:02:01.741923126 +0000 UTC m=+0.098687696 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:02:01 localhost podman[327692]: 2026-02-01 10:02:01.781279536 +0000 UTC m=+0.138044086 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 1 05:02:01 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 05:02:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:02:02 localhost nova_compute[274651]: 2026-02-01 10:02:02.027 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:02 localhost dnsmasq[327485]: exiting on receipt of SIGTERM
Feb 1 05:02:02 localhost podman[327726]: 2026-02-01 10:02:02.050225078 +0000 UTC m=+0.091181255 container kill 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 1 05:02:02 localhost systemd[1]: libpod-4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7.scope: Deactivated successfully.
Feb 1 05:02:02 localhost podman[327740]: 2026-02-01 10:02:02.112128183 +0000 UTC m=+0.049027929 container died 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 05:02:02 localhost podman[327740]: 2026-02-01 10:02:02.142532677 +0000 UTC m=+0.079432443 container cleanup 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 05:02:02 localhost systemd[1]: libpod-conmon-4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7.scope: Deactivated successfully.
Feb 1 05:02:02 localhost podman[327742]: 2026-02-01 10:02:02.207261638 +0000 UTC m=+0.132039972 container remove 4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ac827ffb-bff9-4af3-81fa-58c97ca0d85e, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 05:02:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:02:02.246 259320 INFO neutron.agent.dhcp.agent [None req-9f83cffa-2a9b-4c4a-ba29-f53bec29188c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 05:02:02 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:02:02.265 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 1 05:02:02 localhost systemd[1]: var-lib-containers-storage-overlay-d8f4eced30c9e701c86edad4fae535f89ac8a2dd2f73fdaa558b2f753d4c695e-merged.mount: Deactivated successfully.
Feb 1 05:02:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4654936c072c119eac5a9e70dcbd1992576d973011f66a57d3ad4230ef64a5a7-userdata-shm.mount: Deactivated successfully.
Feb 1 05:02:02 localhost systemd[1]: run-netns-qdhcp\x2dac827ffb\x2dbff9\x2d4af3\x2d81fa\x2d58c97ca0d85e.mount: Deactivated successfully.
Feb 1 05:02:03 localhost nova_compute[274651]: 2026-02-01 10:02:03.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.302 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.302 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.303 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.303 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.303 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 05:02:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:02:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1742243905' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.772 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.847 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 05:02:04 localhost nova_compute[274651]: 2026-02-01 10:02:04.847 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.040 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.041 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11187MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.041 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.042 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.156 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.156 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.157 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 05:02:05 localhost ovn_controller[152492]: 2026-02-01T10:02:05Z|00494|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.212 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.232 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:02:05 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1867099647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.601 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.389s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.607 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.628 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.630 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 05:02:05 localhost nova_compute[274651]: 2026-02-01 10:02:05.630 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 05:02:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:02:07 localhost nova_compute[274651]: 2026-02-01 10:02:07.076 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 05:02:09 localhost podman[327813]: 2026-02-01 10:02:09.738907301 +0000 UTC m=+0.088749721 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 05:02:09 localhost podman[327813]: 2026-02-01 10:02:09.776471106 +0000 UTC m=+0.126313496 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 05:02:09 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 05:02:10 localhost nova_compute[274651]: 2026-02-01 10:02:10.627 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:02:10 localhost nova_compute[274651]: 2026-02-01 10:02:10.649 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:02:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 05:02:11 localhost podman[327836]: 2026-02-01 10:02:11.727194986 +0000 UTC m=+0.086697388 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent)
Feb 1 05:02:11 localhost podman[327836]: 2026-02-01 10:02:11.734424458 +0000 UTC m=+0.093926870 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 1 05:02:11 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 05:02:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:02:12 localhost nova_compute[274651]: 2026-02-01 10:02:12.078 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:02:12 localhost nova_compute[274651]: 2026-02-01 10:02:12.079 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:02:12 localhost nova_compute[274651]: 2026-02-01 10:02:12.080 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:02:12 localhost nova_compute[274651]: 2026-02-01 10:02:12.080 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:02:12 localhost nova_compute[274651]: 2026-02-01 10:02:12.111 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:12 localhost nova_compute[274651]: 2026-02-01 10:02:12.112 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:02:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:02:14 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:02:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:02:14 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:02:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:02:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:02:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:02:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 05:02:15 localhost podman[327854]: 2026-02-01 10:02:15.701643269 +0000 UTC m=+0.065121865 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 05:02:15 localhost podman[327854]: 2026-02-01 10:02:15.712375798 +0000 UTC m=+0.075854364 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 05:02:15 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:02:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:02:17 localhost nova_compute[274651]: 2026-02-01 10:02:17.113 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:02:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 05:02:17 localhost systemd[1]: tmp-crun.JUtFsB.mount: Deactivated successfully.
Feb 1 05:02:17 localhost podman[327876]: 2026-02-01 10:02:17.72692394 +0000 UTC m=+0.091467384 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 1 05:02:17 localhost podman[327876]: 2026-02-01 10:02:17.76361525 +0000 UTC m=+0.128158744 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 1 05:02:17 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:02:20 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:02:20 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:02:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:02:22 localhost nova_compute[274651]: 2026-02-01 10:02:22.115 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:02:22 localhost nova_compute[274651]: 2026-02-01 10:02:22.116 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:02:22 localhost nova_compute[274651]: 2026-02-01 10:02:22.116 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:02:22 localhost nova_compute[274651]: 2026-02-01 10:02:22.117 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:02:22 localhost nova_compute[274651]: 2026-02-01 10:02:22.118 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:02:22 localhost nova_compute[274651]: 2026-02-01 10:02:22.118 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:02:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:02:23 localhost podman[327901]: 2026-02-01 10:02:23.737559361 +0000 UTC m=+0.094053134 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git) Feb 1 05:02:23 localhost podman[327901]: 2026-02-01 10:02:23.755393339 +0000 UTC m=+0.111887182 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z) Feb 1 05:02:23 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
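Each "Started /usr/bin/podman healthcheck run <id>" / "Deactivated successfully" pair above is a systemd transient unit running a one-shot health probe; the container emits a health_status event while it runs, followed by exec_died when the probe process exits. A minimal sketch of driving the same probe by hand, assuming podman is on PATH (the container ID is copied from the log):

    import subprocess

    CID = "ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb"

    # `podman healthcheck run` exits 0 when the container's configured test passes.
    result = subprocess.run(["podman", "healthcheck", "run", CID])
    print("healthy" if result.returncode == 0 else "unhealthy")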
Feb 1 05:02:23 localhost podman[236886]: time="2026-02-01T10:02:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:02:23 localhost podman[236886]: @ - - [01/Feb/2026:10:02:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 05:02:24 localhost podman[236886]: @ - - [01/Feb/2026:10:02:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18845 "" "Go-http-client/1.1" Feb 1 05:02:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e274 do_prune osdmap full prune enabled Feb 1 05:02:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e275 e275: 6 total, 6 up, 6 in Feb 1 05:02:24 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e275: 6 total, 6 up, 6 in Feb 1 05:02:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.119 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.120 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.120 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.120 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.121 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.121 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:27 localhost nova_compute[274651]: 2026-02-01 10:02:27.123 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:02:28 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:02:31 localhost openstack_network_exporter[239441]: ERROR 10:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:02:31 localhost openstack_network_exporter[239441]: Feb 1 05:02:31 localhost openstack_network_exporter[239441]: ERROR 10:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:02:31 localhost openstack_network_exporter[239441]: Feb 1 05:02:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 
full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:02:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e275 do_prune osdmap full prune enabled Feb 1 05:02:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e276 e276: 6 total, 6 up, 6 in Feb 1 05:02:31 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e276: 6 total, 6 up, 6 in Feb 1 05:02:32 localhost nova_compute[274651]: 2026-02-01 10:02:32.124 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:32 localhost nova_compute[274651]: 2026-02-01 10:02:32.126 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:32 localhost nova_compute[274651]: 2026-02-01 10:02:32.127 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:02:32 localhost nova_compute[274651]: 2026-02-01 10:02:32.127 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:32 localhost nova_compute[274651]: 2026-02-01 10:02:32.146 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:32 localhost nova_compute[274651]: 2026-02-01 10:02:32.147 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:02:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:02:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:02:32 localhost systemd[1]: tmp-crun.BjVu7G.mount: Deactivated successfully. 
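The mon_command records above show the mgr authorizing a CephFS client ("client.alice") on a Manila share path: an "auth get-or-create" with mds/osd/mon caps scoped to the share's subvolume path and pool namespace, later followed by "auth rm" and an MDS "session evict" when access is revoked. A sketch of the equivalent CLI invocation, with the entity, caps, and share path copied verbatim from the log; it assumes a reachable cluster and an admin keyring:

    import subprocess

    out = subprocess.run(
        [
            "ceph", "auth", "get-or-create", "client.alice",
            "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37",
            "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88",
            "mon", "allow r",
            "--format", "json",
        ],
        check=True, capture_output=True, text=True,
    )
    print(out.stdout)  # keyring entry for client.alice, as JSON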
Feb 1 05:02:32 localhost podman[327919]: 2026-02-01 10:02:32.736086871 +0000 UTC m=+0.089956937 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 05:02:32 localhost podman[327919]: 2026-02-01 10:02:32.745818771 +0000 UTC m=+0.099688887 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 1 05:02:32 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 05:02:32 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:02:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:32 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:02:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:02:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:02:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 1 05:02:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1468791672' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 1 05:02:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 1 05:02:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1468791672' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 1 05:02:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 1 05:02:36 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 5563 writes, 37K keys, 5563 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.06 MB/s#012Cumulative WAL: 5563 writes, 5563 syncs, 1.00 writes per sync, written: 0.07 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2437 writes, 11K keys, 2437 commit groups, 1.0 writes per commit group, ingest: 12.20 MB, 0.02 MB/s#012Interval WAL: 2437 writes, 2437 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.1 0.1 0.0 1.0 0.0 64.7 0.83 0.14 19 0.044 0 0 0.0 0.0#012 L6 1/0 21.50 MB 0.0 0.4 0.1 0.3 0.4 0.0 0.0 6.7 207.7 190.6 1.88 0.89 18 0.105 221K 9300 0.0 0.0#012 Sum 1/0 21.50 MB 0.0 0.4 0.1 0.3 0.4 0.1 0.0 7.7 144.1 152.1 2.71 1.04 37 0.073 221K 9300 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 14.0 203.2 204.2 0.75 0.38 14 0.054 93K 3761 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.4 0.1 0.3 0.4 0.0 0.0 0.0 207.7 190.6 1.88 0.89 18 0.105 221K 9300 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.1 0.1 0.0 0.0 0.0 64.8 0.83 0.14 18 0.046 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.9 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.052, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.40 GB write, 0.34 MB/s write, 0.38 GB read, 0.33 MB/s read, 2.7 seconds#012Interval compaction: 0.15 GB write, 0.26 MB/s write, 0.15 GB read, 0.25 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x558a91fa9350#2 capacity: 304.00 MB usage: 33.70 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000271 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1570,32.24 
MB,10.6043%) FilterBlock(37,642.67 KB,0.20645%) IndexBlock(37,852.31 KB,0.273795%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 1 05:02:36 localhost ovn_controller[152492]: 2026-02-01T10:02:36Z|00495|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory Feb 1 05:02:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:02:37 localhost nova_compute[274651]: 2026-02-01 10:02:37.148 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:37 localhost nova_compute[274651]: 2026-02-01 10:02:37.150 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:37 localhost nova_compute[274651]: 2026-02-01 10:02:37.150 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:02:37 localhost nova_compute[274651]: 2026-02-01 10:02:37.150 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:37 localhost nova_compute[274651]: 2026-02-01 10:02:37.190 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:37 localhost nova_compute[274651]: 2026-02-01 10:02:37.191 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:02:39 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:02:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:02:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:02:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:02:39 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:02:40 localhost ceph-osd[32376]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
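The RocksDB stats block a few records above is a single syslog record in which the control characters were escaped as #NNN octal sequences (#012 is newline, #011 is tab, #033 is the ESC that introduces the ANSI color codes trailing the nova_compute lines). A minimal decoder sketch to restore such a record to its original multi-line form:

    import re

    def unescape_syslog(record: str) -> str:
        # Replace #NNN octal escapes with the control characters rsyslog
        # substituted when it flattened the record onto one line.
        return re.sub(r"#(\d{3})", lambda m: chr(int(m.group(1), 8)), record)

    print(unescape_syslog("** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval"))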
Feb 1 05:02:40 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:02:40 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:02:40 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:02:40 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:02:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 05:02:40 localhost podman[327939]: 2026-02-01 10:02:40.721116519 +0000 UTC m=+0.081739896 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 05:02:40 localhost podman[327939]: 2026-02-01 10:02:40.734604003 +0000 UTC m=+0.095227400 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 05:02:40 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
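The podman_exporter container above publishes Prometheus metrics on host port 9882 (per 'ports': ['9882:9882'] in its config_data). A quick scrape sketch; the "podman_" metric-name prefix is an assumption about prometheus-podman-exporter's naming, not something taken from the log:

    from urllib.request import urlopen

    body = urlopen("http://127.0.0.1:9882/metrics", timeout=5).read().decode()
    for line in body.splitlines():
        if line.startswith("podman_"):   # assumed prefix for this exporter's metrics
            print(line)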
Feb 1 05:02:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:02:41.724 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:02:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:02:41.725 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:02:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:02:41.726 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:02:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:02:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:02:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:02:42 localhost nova_compute[274651]: 2026-02-01 10:02:42.192 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:42 localhost nova_compute[274651]: 2026-02-01 10:02:42.193 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:42 localhost nova_compute[274651]: 2026-02-01 10:02:42.193 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:02:42 localhost nova_compute[274651]: 2026-02-01 10:02:42.193 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:42 localhost nova_compute[274651]: 2026-02-01 10:02:42.220 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:42 localhost nova_compute[274651]: 2026-02-01 10:02:42.221 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 05:02:42 localhost podman[327962]: 2026-02-01 10:02:42.728446978 +0000 UTC m=+0.090673469 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 1 05:02:42 localhost podman[327962]: 2026-02-01 10:02:42.759883065 +0000 UTC m=+0.122109536 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible) Feb 1 05:02:42 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 05:02:42 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:02:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:42 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:42 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:02:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e276 do_prune osdmap full prune enabled Feb 1 05:02:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e277 e277: 6 total, 6 up, 6 in Feb 1 05:02:44 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e277: 6 total, 6 up, 6 in Feb 1 05:02:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e277 do_prune osdmap full prune enabled Feb 1 05:02:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e278 e278: 6 total, 6 up, 6 in Feb 1 05:02:45 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e278: 6 total, 6 up, 6 in Feb 1 05:02:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:02:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:02:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth 
rm", "entity": "client.alice"}]': finished Feb 1 05:02:45 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 05:02:46 localhost podman[327999]: 2026-02-01 10:02:46.013982872 +0000 UTC m=+0.092551177 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 05:02:46 localhost podman[327999]: 2026-02-01 10:02:46.024436964 +0000 UTC m=+0.103005309 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 05:02:46 localhost systemd[1]: 
385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 05:02:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:02:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:02:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:02:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:02:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 05:02:46 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:02:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:02:47 localhost nova_compute[274651]: 2026-02-01 10:02:47.222 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:47 localhost nova_compute[274651]: 2026-02-01 10:02:47.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:47 localhost nova_compute[274651]: 2026-02-01 10:02:47.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:02:47 localhost nova_compute[274651]: 2026-02-01 10:02:47.224 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:47 localhost nova_compute[274651]: 2026-02-01 10:02:47.245 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:47 localhost nova_compute[274651]: 2026-02-01 10:02:47.246 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:47 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 05:02:47 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:02:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
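The node_exporter container above is launched with --collector.systemd and --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service, so only matching systemd units are exported. A small check of which unit names that pattern admits; node_exporter anchors include patterns as a full match, which fullmatch reproduces here (the sample unit names are illustrative):

    import re

    unit_include = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")

    for unit in ["edpm_nova_compute.service", "ovsdb-server.service",
                 "virtqemud.service", "sshd.service"]:
        print(unit, "->", bool(unit_include.fullmatch(unit)))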
Feb 1 05:02:48 localhost podman[328091]: 2026-02-01 10:02:48.722113257 +0000 UTC m=+0.083845820 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Feb 1 05:02:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:02:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:02:48 localhost podman[328091]: 2026-02-01 10:02:48.786548639 +0000 UTC m=+0.148281202 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 1 05:02:48 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 05:02:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:02:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:02:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:02:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 05:02:51 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:02:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:02:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e278 do_prune osdmap full prune enabled Feb 1 05:02:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e279 e279: 6 total, 6 up, 6 in Feb 1 05:02:51 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e279: 6 total, 6 up, 6 in Feb 1 05:02:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 1 05:02:52 localhost 
ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:02:52 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:02:52 localhost nova_compute[274651]: 2026-02-01 10:02:52.247 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:52 localhost nova_compute[274651]: 2026-02-01 10:02:52.248 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:02:52 localhost nova_compute[274651]: 2026-02-01 10:02:52.249 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:02:52 localhost nova_compute[274651]: 2026-02-01 10:02:52.249 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:52 localhost nova_compute[274651]: 2026-02-01 10:02:52.285 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:02:52 localhost nova_compute[274651]: 2026-02-01 10:02:52.286 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:02:52 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:02:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:02:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:02:52 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:02:52 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:02:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:02:53 localhost podman[236886]: time="2026-02-01T10:02:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:02:53 localhost podman[236886]: @ - - [01/Feb/2026:10:02:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 05:02:54 localhost podman[236886]: @ - - [01/Feb/2026:10:02:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18842 "" "Go-http-client/1.1" Feb 1 05:02:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. 
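The podman[236886] records above are the podman system service answering libpod REST calls (podman_exporter's Go-http-client listing containers and fetching stats). A minimal stdlib sketch of the same GET over the podman socket; the socket path comes from podman_exporter's CONTAINER_HOST in the log, and the API version segment matches the logged URL:

    import http.client
    import json
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTPConnection over an AF_UNIX socket (the host name is a dummy)."""
        def __init__(self, path: str):
            super().__init__("localhost")
            self._path = path

        def connect(self):
            sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            sock.connect(self._path)
            self.sock = sock

    conn = UnixHTTPConnection("/run/podman/podman.sock")
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
    containers = json.loads(conn.getresponse().read())
    for c in containers:
        print(c["Names"], c["State"])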
Feb 1 05:02:54 localhost podman[328115]: 2026-02-01 10:02:54.720689087 +0000 UTC m=+0.081743445 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z) Feb 1 05:02:54 localhost podman[328115]: 2026-02-01 10:02:54.73835183 +0000 UTC m=+0.099406228 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.7, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 1 05:02:54 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
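Note: the health_status → exec_died pair above, with systemd then deactivating the transient ceac27d2… unit, is one complete podman healthcheck cycle for the openstack_network_exporter container: a systemd timer starts a transient unit, podman execs the container's configured test command (the truncated 'test' value in config_data), stores the verdict in the container state, and the unit exits. The same status can be queried from the host; the sketch below is illustrative only — it assumes podman is on PATH, takes the container name from the events above, and defensively handles the State.Health vs. State.Healthcheck key rename between podman releases. It is not part of the deployment's own tooling.

#!/usr/bin/env python3
# Minimal sketch: query the health state that the events above record.
import json
import subprocess

NAME = "openstack_network_exporter"  # container name from the events above

# Run the container's configured healthcheck once; exit code 0 == healthy.
rc = subprocess.run(["podman", "healthcheck", "run", NAME]).returncode
print("healthcheck exit code:", rc)

# Read the last stored verdict; recent podman exposes it as State.Health,
# older releases as State.Healthcheck, so probe both keys.
out = subprocess.run(["podman", "inspect", NAME],
                     capture_output=True, text=True, check=True).stdout
state = json.loads(out)[0]["State"]
health = state.get("Health") or state.get("Healthcheck") or {}
print("last recorded status:", health.get("Status", "unknown"))

The exec_died event corresponds to the healthcheck exec session finishing; a healthy run ends the cycle exactly as logged here, with the transient service unit deactivating afterwards.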
Feb 1 05:02:55 localhost nova_compute[274651]: 2026-02-01 10:02:55.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:02:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:02:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:02:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:02:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 1 05:02:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:02:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:02:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.338 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.338 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.338 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.339 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.895 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.928 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 05:02:56 localhost nova_compute[274651]: 2026-02-01 10:02:56.929 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 05:02:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:02:57 localhost nova_compute[274651]: 2026-02-01 10:02:57.286 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:02:57 localhost nova_compute[274651]: 2026-02-01 10:02:57.288 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:02:57 localhost nova_compute[274651]: 2026-02-01 10:02:57.289 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:02:57 localhost nova_compute[274651]: 2026-02-01 10:02:57.289 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:02:57 localhost nova_compute[274651]: 2026-02-01 10:02:57.333 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:02:57 localhost nova_compute[274651]: 2026-02-01 10:02:57.334 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:02:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:02:58 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:02:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 1 05:02:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 1 05:02:59 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 1 05:02:59 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...)
Feb 1 05:02:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:02:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:02:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:02:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:02:59 localhost nova_compute[274651]: 2026-02-01 10:02:59.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:02:59 localhost nova_compute[274651]: 2026-02-01 10:02:59.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 05:02:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:02:59 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:00 localhost nova_compute[274651]: 2026-02-01 10:03:00.265 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:03:00 localhost nova_compute[274651]: 2026-02-01 10:03:00.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:03:00 localhost nova_compute[274651]: 2026-02-01 10:03:00.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:03:01 localhost openstack_network_exporter[239441]: ERROR 10:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:03:01 localhost openstack_network_exporter[239441]: Feb 1 05:03:01 localhost openstack_network_exporter[239441]: ERROR 10:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:03:01 localhost openstack_network_exporter[239441]: Feb 1 05:03:01 localhost ovn_metadata_agent[158360]: 2026-02-01 10:03:01.629 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 
'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:03:01 localhost nova_compute[274651]: 2026-02-01 10:03:01.629 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:01 localhost ovn_metadata_agent[158360]: 2026-02-01 10:03:01.631 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 05:03:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:02 localhost nova_compute[274651]: 2026-02-01 10:03:02.375 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:02 localhost ovn_metadata_agent[158360]: 2026-02-01 10:03:02.632 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 05:03:02 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:02 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 
05:03:02 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:02 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.531 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.532 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.562 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '78eb8bc5-135d-4061-ba01-0923665bcb63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.532341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32a7b240-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '9cba32ffb77be150f9acbb4397369ce674cf9b862df7e5deb722fee959800148'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.532341', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32a7c35c-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '53b1470b9b4b8c9d8b5a7b073cdb6dd759c9647b89bba4c4ede930e686c202ec'}]}, 'timestamp': '2026-02-01 10:03:03.564528', '_unique_id': '711097d705f9425eb8a008b1e13d25c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.565 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.566 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.571 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4d4059e0-9e26-42b6-9948-8584ebd4b874', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.566606', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32a8dcf6-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '4b80c242477d50fe7313d9cbf790a6630345a5d8ff39d4c152aba24587d0181c'}]}, 'timestamp': '2026-02-01 10:03:03.571759', '_unique_id': '97abe55d424f4c2b9dbec77b3260809d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.572 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.573 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.585 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5dc74e8d-de85-42d9-b7e2-bdb33a7aa94d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.573378', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32aaebe0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.792802119, 'message_signature': '81c48e70b6be8cd11cb2d566292273bb09da2dcc4584f580477b88d73d6f3487'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.573378', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32ab0120-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.792802119, 'message_signature': '6e3191098a5626018563c0eac1d0f3293bccee4a09bf56cf16aaefc364aeb593'}]}, 'timestamp': '2026-02-01 10:03:03.585862', '_unique_id': '9662645b7f0240a28dea6793564714a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.588 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '500c6f65-56dc-4e44-9ff8-96924991ad1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.588808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32ab95a4-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '3a187a758418a3eb90a8ec36b4b226a7bb9fc85d44df3d750a2263582c8d37c4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.588808', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32aba88c-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '412c2ab85f62871a5506fab683b6ad4e8f67d8a0d646119d1de746cbd14c9c41'}]}, 'timestamp': '2026-02-01 10:03:03.590191', '_unique_id': 'e7fc9b947ddc465388ecf1a32de02325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.592 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.592 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '5ee18e35-d45f-4427-8e55-156c25034d6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.592582', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32ac1dd0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '9fd32f53a2fe1d400c769fdf9a4d9be6058223cb54fc5a7b9ee875f834d965f9'}]}, 'timestamp': '2026-02-01 10:03:03.593225', '_unique_id': '2d28227edcf34639822e45794f19130a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.595 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.596 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.597 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '7412d655-df9e-483c-8ff8-1b6c03c2b62b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.596273', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32acae44-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '881c43d96ff0d0a5a9dd6a54b98ec670ce2cda37226e5e960236d1ed15433a1f'}]}, 'timestamp': '2026-02-01 10:03:03.596877', '_unique_id': 'd835943cbee64a29a3698a5495bb83e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.599 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.599 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.601 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '529ee939-7360-44fd-b3bb-3f33429a63db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.599480', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32ad2c20-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '252071f7cc53dfcf89b32ff64a492f1b2625d9b3527725e312754b4786de9ed2'}]}, 'timestamp': '2026-02-01 10:03:03.600194', '_unique_id': 'e3b40f57258a42a5b6d5ee654f3253d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.602 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.603 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'd535a850-444e-4a72-9e39-5abb740ba0b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.603047', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32adb7f8-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': 'b673473cc216b66a0fc2140cbde8d775bc02088d13506bde60ee50527cb9fdc7'}]}, 'timestamp': '2026-02-01 10:03:03.603713', '_unique_id': 'bac6d582f0c54334ac5162046f558865'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.604 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.606 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.606 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bf0f2018-6738-409d-8893-3ee40f839fdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.606414', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32ae3ae8-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '038da6b94050cf169a2bd2205b0394715d7f3e249fe0c451102f3bb9970b0ec0'}]}, 'timestamp': '2026-02-01 10:03:03.607162', '_unique_id': '1e066f937e8e46a691fb071bb51d05d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.609 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eea711f7-7507-493b-80ce-913de8f7be18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.609623', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32aeb806-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.792802119, 'message_signature': '4b0e4d211d2b7e30700710b32f3bb8009914871a32cd646f97263d2653d9c1d3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.609623', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32aecf94-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.792802119, 'message_signature': 'fd7f5eca027e6b6ba1a1b96a48d0dfc2bfea593764e55b6773bde1864150c14e'}]}, 'timestamp': '2026-02-01 10:03:03.610809', '_unique_id': '7daf6319a4c14e39823151177479936a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.611 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.613 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.614 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0bd69159-06e1-4fa1-9fc8-16394fe65d6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.613478', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32af4eec-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.792802119, 'message_signature': '9e187eb3333b8bb3510c7aec32a9f08aeed597b68c45392be5f3d1f509d61047'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.613478', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32af66e8-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.792802119, 'message_signature': '9999aec319186a0a1d3750e5582f2b2e9b4e9d9a8148350ebb193b25e0e19c03'}]}, 'timestamp': '2026-02-01 10:03:03.614697', '_unique_id': '549c0cb7bae6403298b22c4150e98110'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.615 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.617 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.617 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '73f483f4-7701-46cc-97c7-5427c54885ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.617544', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32afedf2-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '8bcf300d70a91b1455f70d87e1af014fafe23f547dd2d5798a998c4302462a43'}]}, 'timestamp': '2026-02-01 10:03:03.618242', '_unique_id': 'fcbd5cac821c4aba82355cdbfe829cb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.619 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.621 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8e27ba72-62bc-4d90-b91d-b24db5b49b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.620981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32b07538-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '105ec5b811605ab65fd0fa152ccf158d76935bd8ca4b5414a69597788daf57b3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.620981', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32b08a46-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '5c5780ee1f4492766b217a16f7d0c8d0cb3ed99631492d957eb2f0da799dddec'}]}, 'timestamp': '2026-02-01 10:03:03.622190', '_unique_id': '21f390518ea14997b67f6eeeb3b1a01a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.623 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.624 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5e70514c-ca78-4b73-8130-453b3271c743', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.624349', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32b0f27e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '266566ac563b564f67ddca993190a080fbfbc28b780eabc13fc3cb3dc216d544'}]}, 'timestamp': '2026-02-01 10:03:03.624783', '_unique_id': '3178dd76508b4be9b5b38f6c6abb6536'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.626 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 05:03:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f1781223-570a-4f29-838b-624538731f7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:03:03.626297', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '32b3ace4-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.86139097, 'message_signature': 'fcc4885658c16e4b624a9c5b78ab7b4cc51ccbdf582c29b4744560cdef6b30d0'}]}, 'timestamp': '2026-02-01 10:03:03.642830', '_unique_id': 'f17c9897d3df433fbbed4529f7d6b9c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.644 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95eb5ad8-146e-4114-aaa7-d7a6482568c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.644714', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32b40c52-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': 'f4f858acf79b8e02c305a00eedd7031d367d582041c7c385019f4de7f0658618'}]}, 'timestamp': '2026-02-01 10:03:03.645114', '_unique_id': 'b38b6f22b54c452fabfc5603412be38b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.646 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.646 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 18130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '313af3c5-6f3f-4354-98d5-7e9ac7e27a3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18130000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:03:03.646524', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '32b45220-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.86139097, 'message_signature': '680a50282726edb86784c9f3b32413a4e6aa49f00b9b48c042c24ae1e7cb41ab'}]}, 'timestamp': '2026-02-01 10:03:03.646820', '_unique_id': '576d87df0db247cab8eee51e981a17e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:03:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.647 12 ERROR oslo_messaging.notify.messaging Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.648 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.648 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.648 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.648 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
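The traceback above is the root failure behind every "Could not send notification" error in this burst: amqp's TCP transport gets ECONNREFUSED from sock.connect(), kombu re-raises it as kombu.exceptions.OperationalError, and oslo.messaging drops the notification. A minimal reachability probe along these lines can confirm whether anything is listening on the broker port; the host and port below are assumptions, since the transport URL never appears in this log.

    import socket

    # Hypothetical broker endpoint -- the transport URL is not shown in this
    # log, so both host and port are assumptions (5672 is the RabbitMQ default).
    BROKER = ("rabbit.ctlplane.example.com", 5672)

    def broker_reachable(addr, timeout=3.0):
        """Return True if a TCP connection to the broker succeeds.

        ECONNREFUSED (errno 111), as in the traceback above, means the host
        answered with a RST: nothing is listening on that port.
        """
        try:
            with socket.create_connection(addr, timeout=timeout):
                return True
        except ConnectionRefusedError:
            return False  # errno 111: port closed / broker not listening

    if __name__ == "__main__":
        print("broker reachable:", broker_reachable(BROKER))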
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.649 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91bd08e5-2fe4-487e-97dd-304d243a2d6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.648370', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32b49a8c-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': 'b34ee581ff985344a9ef2f8809dc25cee053bff131ae076502074d906a1dde85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.648370', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32b4a572-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '7c8ac5469563a9b7d9ddd9b4d7ef5098515e099579ea7e0441b692bd40d09ee7'}]}, 'timestamp': '2026-02-01 10:03:03.648940', '_unique_id': 'ea4e84339cc64c558e9cf704c2f5d166'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.650 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.650 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.650 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
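The middle frames of the traceback (ensure_connection -> _ensure_connection -> retry_over_time) are kombu's reconnect loop, and _reraise_as_library_errors is the context manager that converts the raw socket error into OperationalError. A sketch of the same path using public kombu API against an unreachable broker; the URL and retry counts here are assumptions, not values taken from this deployment.

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Assumed transport URL; this log never prints the real one.
    url = "amqp://guest:guest@127.0.0.1:5672//"

    def on_retry(exc, interval):
        # Invoked between attempts by kombu's retry_over_time().
        print(f"broker unreachable ({exc}); retrying in {interval}s")

    conn = Connection(url)
    try:
        # Bounded retries for the sketch; the oslo.messaging connection
        # pool seen above keeps retrying and logs an ERROR per failure.
        conn.ensure_connection(errback=on_retry, max_retries=3)
    except OperationalError as exc:
        print("gave up:", exc)  # e.g. "[Errno 111] Connection refused"
    finally:
        conn.release()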
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.651 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8f4f8bc-9722-4984-a437-fba679ab7f32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.650408', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32b4ebae-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': 'd62a70754ebd0c78ae3c271cfd845cc421bb9da33a792521066881e7093f1dff'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.650408', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32b4f7de-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '7d9238b4044763cf3ee5244b63feb3c37576739650117bebb774470a5462fa32'}]}, 'timestamp': '2026-02-01 10:03:03.651074', '_unique_id': 'ba39398732e54e74bec3f2cbdeef2172'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.652 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.652 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
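Each failed notification above wraps a 'samples' list: one sample per counter per device, with counter_name/type/unit/volume plus a resource_id that concatenates the instance UUID with the device or VNIC name. A toy consumer, shown with one sample copied from the network.outgoing.packets payload above:

    # Minimal sketch of walking a ceilometer telemetry.polling payload;
    # the dict below is a trimmed copy of the sample logged above.
    payload = {
        "event_type": "telemetry.polling",
        "payload": {
            "samples": [
                {"counter_name": "network.outgoing.packets",
                 "counter_type": "cumulative",
                 "counter_unit": "packet",
                 "counter_volume": 114,
                 "resource_id": "instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46"},
            ],
        },
    }

    for s in payload["payload"]["samples"]:
        print(f'{s["resource_id"]}: {s["counter_name"]} = '
              f'{s["counter_volume"]} {s["counter_unit"]} ({s["counter_type"]})')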
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.653 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57cadb89-7233-415d-a96a-9403223a5dde', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:03:03.652497', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '32b53b86-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.786044272, 'message_signature': '0d868aac7f8d7ecf8fec5af89688eec7d5c3ffe2720b89e889194b4c220d4050'}]}, 'timestamp': '2026-02-01 10:03:03.652801', '_unique_id': '285e8998f52d4bf38602d8f910836566'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.654 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.654 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.654 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.655 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
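The counter_type in these samples is 'cumulative': counter_volume is a running total since instance start, so downstream consumers derive rates from deltas between successive polls. A sketch with one reading taken from the vda sample above and a second, hypothetical reading one polling interval later:

    # The 35560448 figure is the disk.device.read.bytes volume for vda from
    # the payload above; the later reading and the 300 s interval are
    # hypothetical, for illustration only.
    def rate(prev_volume, curr_volume, interval_s):
        """Bytes/second between two cumulative readings taken interval_s apart."""
        return max(curr_volume - prev_volume, 0) / interval_s

    prev = 35_560_448   # cumulative volume at this poll
    curr = 35_565_000   # assumed cumulative volume one interval later
    print(f"{rate(prev, curr, 300):.1f} B/s over a 300 s polling interval")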
Feb 1 05:03:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:03:03.656 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9cff071-fa86-4985-9bab-5ecc15a02dae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:03:03.654732', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '32b59568-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '48535ffdc206114eaab8e5938078be2b7e36219bb4a1730cc43034c2c013b129'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:03:03.654732', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '32b5a59e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12277.751753747, 'message_signature': '6d35b761de5c34b9ffb6e64e733a01cc46a922ed4e05fd11e455b7839ff55754'}]}, 'timestamp': '2026-02-01 10:03:03.655576', '_unique_id': 'e6fbe1ccd4a944c7becbf18395b2c9fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:03:03 localhost podman[328138]: 2026-02-01 10:03:03.707862137 +0000 UTC m=+0.064673870 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z',
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Feb 1 05:03:03 localhost podman[328138]: 2026-02-01 10:03:03.721374723 +0000 UTC m=+0.078186446 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 05:03:03 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
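The chained traceback above is oslo.messaging's notifier failing to reach its RabbitMQ broker: amqp's raw socket connect raises ConnectionRefusedError, and kombu re-raises it as OperationalError once retry_over_time gives up. A minimal sketch of that same retry path, assuming a broker URL on localhost:5672 with illustrative credentials and retry counts (none of these values are taken from this host's oslo.messaging config):

# Sketch of the connection path seen in the traceback: kombu retries the
# AMQP connect and, when retries are exhausted, re-raises the socket error
# as kombu.exceptions.OperationalError.
from kombu import Connection
from kombu.exceptions import OperationalError

conn = Connection("amqp://guest:guest@localhost:5672//", connect_timeout=5)
try:
    # ensure_connection() wraps retry_over_time(), the frame visible at
    # kombu/connection.py:433 in the log.
    conn.ensure_connection(max_retries=3, interval_start=1, interval_step=2)
except OperationalError as exc:
    # With no broker listening this prints "[Errno 111] Connection refused",
    # matching the final exception in the ceilometer traceback.
    print(f"broker unreachable: {exc}")
finally:
    conn.release()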
Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.297 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.297 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.298 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.298 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.298 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 05:03:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:03:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.?
172.18.0.106:0/3906417371' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.757 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.818 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:03:04 localhost nova_compute[274651]: 2026-02-01 10:03:04.819 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.040 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.041 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11168MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", 
"vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.042 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.042 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.118 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.119 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.119 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.154 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:03:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:03:05 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/477873194' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.605 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.611 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.639 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.640 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 05:03:05 localhost nova_compute[274651]: 2026-02-01 10:03:05.641 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:03:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 1 05:03:05 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:05 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:05 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
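The resource-tracker audit above shells out twice to "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" (via oslo_concurrency.processutils) to size the RBD-backed disk pool; each run shows up on the mon side as the audited "df" mon_command. A sketch of the same probe; the JSON keys used below match current Ceph releases but are an assumption, not verified against this cluster:

# Run the same capacity probe nova's resource tracker issues above and
# pull the cluster-wide totals out of the JSON report.
import json
import subprocess

out = subprocess.run(
    ["ceph", "df", "--format=json",
     "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"],
    capture_output=True, text=True, check=True,
)
df = json.loads(out.stdout)
total_gib = df["stats"]["total_bytes"] / 2**30
avail_gib = df["stats"]["total_avail_bytes"] / 2**30
print(f"cluster capacity: {avail_gib:.1f} GiB free of {total_gib:.1f} GiB")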
Feb 1 05:03:06 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:03:06 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 1 05:03:06 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:03:06 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 1 05:03:06 localhost nova_compute[274651]: 2026-02-01 10:03:06.640 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:03:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:03:07 localhost nova_compute[274651]: 2026-02-01 10:03:07.377 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:03:07 localhost nova_compute[274651]: 2026-02-01 10:03:07.379 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:03:07 localhost nova_compute[274651]: 2026-02-01 10:03:07.379 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:03:07 localhost nova_compute[274651]: 2026-02-01 10:03:07.380 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:03:07 localhost nova_compute[274651]: 2026-02-01 10:03:07.419 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:03:07 localhost nova_compute[274651]: 2026-02-01 10:03:07.420 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:03:09 localhost nova_compute[274651]: 2026-02-01 10:03:09.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:03:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e279 do_prune osdmap full prune enabled
Feb 1 05:03:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e280 e280: 6 total, 6 up, 6 in
Feb 1 05:03:10 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e280: 6 total, 6 up, 6 in
Feb 1 05:03:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
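The ovsdbapp records above repeat the OVS reconnect cycle: after roughly 5 s with no traffic on the OVSDB session to tcp:127.0.0.1:6640 the client sends an inactivity probe and moves to IDLE; the next POLLIN (the probe reply) returns it to ACTIVE. A toy model of that loop follows; the real logic lives in ovs/reconnect.py, so the class and names here are illustrative, not the actual API:

# Simplified model of the idle/probe cycle logged by ovsdbapp; the 5000 ms
# threshold mirrors the ~5003 ms idle times in the log.
import time

PROBE_INTERVAL_MS = 5000

class Reconnect:
    def __init__(self):
        self.state = "ACTIVE"
        self.last_activity = time.monotonic()

    def run(self, send_probe):
        idle_ms = (time.monotonic() - self.last_activity) * 1000
        if self.state == "ACTIVE" and idle_ms >= PROBE_INTERVAL_MS:
            send_probe()          # "sending inactivity probe"
            self.state = "IDLE"   # "entering IDLE"

    def received(self):
        # Any inbound data (the POLLIN in the log) counts as activity.
        self.last_activity = time.monotonic()
        self.state = "ACTIVE"     # "entering ACTIVE"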
Feb 1 05:03:11 localhost podman[328201]: 2026-02-01 10:03:11.725220349 +0000 UTC m=+0.084817130 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:03:11 localhost podman[328201]: 2026-02-01 10:03:11.739658893 +0000 UTC m=+0.099255694 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 05:03:11 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
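Each "Started /usr/bin/podman healthcheck run <id>" record is a systemd transient unit executing the container's configured healthcheck; the following podman health_status and exec_died events report the outcome, and the unit then deactivates. The same check can be driven by hand, as sketched below with the container ID taken from the log (podman healthcheck run exits 0 when healthy):

# Run one container's healthcheck the way the transient units above do.
import subprocess

CID = "06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d"

result = subprocess.run(["podman", "healthcheck", "run", CID],
                        capture_output=True, text=True)
status = "healthy" if result.returncode == 0 else "unhealthy"
print(f"{CID[:12]}: {status}")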
Feb 1 05:03:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:12 localhost nova_compute[274651]: 2026-02-01 10:03:12.421 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:12 localhost nova_compute[274651]: 2026-02-01 10:03:12.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:12 localhost nova_compute[274651]: 2026-02-01 10:03:12.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:12 localhost nova_compute[274651]: 2026-02-01 10:03:12.423 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:12 localhost nova_compute[274651]: 2026-02-01 10:03:12.461 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:12 localhost nova_compute[274651]: 2026-02-01 10:03:12.461 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 05:03:13 localhost podman[328225]: 2026-02-01 10:03:13.721492409 +0000 UTC m=+0.085370037 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:03:13 localhost podman[328225]: 2026-02-01 10:03:13.72511594 +0000 UTC m=+0.088993568 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Feb 1 05:03:13 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
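The ceph-mon audit records surrounding this point show manila's mgr module cycling cephx identities for a CephFS share: "auth get-or-create" with mds/osd caps scoped to the share's /volumes path on grant, then "auth rm" plus an MDS "session evict" on revoke. The same mon command can be issued through the python-rados binding, as sketched below; the JSON body copies the caps from the audit log, while the conffile path and client.admin identity are assumptions for illustration:

# Issue the "auth get-or-create" seen in the nearby audit records via
# python-rados. mon_command() returns (errno, output bytes, status string).
import json
import rados

cmd = {
    "prefix": "auth get-or-create",
    "entity": "client.alice bob",
    "caps": [
        "mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37",
        "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88",
        "mon", "allow r",
    ],
    "format": "json",
}

cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
cluster.connect()
try:
    ret, out, errs = cluster.mon_command(json.dumps(cmd), b"")
    print(ret, out.decode() or errs)
finally:
    cluster.shutdown()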
Feb 1 05:03:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:14 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:15 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:15 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:15 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:15 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 1 05:03:16 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 
05:03:16 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:16 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:03:16 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:16 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:16 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:16 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 05:03:16 localhost podman[328243]: 2026-02-01 10:03:16.719974984 +0000 UTC m=+0.080860418 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 05:03:16 localhost podman[328243]: 2026-02-01 10:03:16.754183636 +0000 UTC m=+0.115069090 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 05:03:16 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 05:03:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e280 do_prune osdmap full prune enabled Feb 1 05:03:16 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e281 e281: 6 total, 6 up, 6 in Feb 1 05:03:17 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e281: 6 total, 6 up, 6 in Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0. Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.013902) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67 Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197014071, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1779, "num_deletes": 259, "total_data_size": 2019439, "memory_usage": 2056144, "flush_reason": "Manual Compaction"} Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197025686, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1603814, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36967, "largest_seqno": 38745, "table_properties": {"data_size": 1597125, "index_size": 3518, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18887, "raw_average_key_size": 22, "raw_value_size": 1581988, "raw_average_value_size": 1878, "num_data_blocks": 153, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": 
"NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940100, "oldest_key_time": 1769940100, "file_creation_time": 1769940197, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}} Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 11733 microseconds, and 5141 cpu microseconds. Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.025739) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1603814 bytes OK Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.025763) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.028129) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.028152) EVENT_LOG_v1 {"time_micros": 1769940197028145, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.028176) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 2011239, prev total WAL file size 2011239, number of live WAL files 2. Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.029014) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303130' seq:72057594037927935, type:22 .. 
'6D6772737461740034323631' seq:0, type:0; will stop at (end) Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1566KB)], [66(21MB)] Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197029057, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 24150901, "oldest_snapshot_seqno": -1} Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14115 keys, 22384835 bytes, temperature: kUnknown Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197131797, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 22384835, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22303684, "index_size": 44762, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35333, "raw_key_size": 378413, "raw_average_key_size": 26, "raw_value_size": 22063070, "raw_average_value_size": 1563, "num_data_blocks": 1670, "num_entries": 14115, "num_filter_entries": 14115, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769940197, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}} Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.132182) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 22384835 bytes
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.133900) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 234.9 rd, 217.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 21.5 +0.0 blob) out(21.3 +0.0 blob), read-write-amplify(29.0) write-amplify(14.0) OK, records in: 14613, records dropped: 498 output_compression: NoCompression
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.133928) EVENT_LOG_v1 {"time_micros": 1769940197133916, "job": 40, "event": "compaction_finished", "compaction_time_micros": 102827, "compaction_time_cpu_micros": 51864, "output_level": 6, "num_output_files": 1, "total_output_size": 22384835, "num_input_records": 14613, "num_output_records": 14115, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197134306, "job": 40, "event": "table_file_deletion", "file_number": 68}
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197137264, "job": 40, "event": "table_file_deletion", "file_number": 66}
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.028909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.137356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.137363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.137367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.137371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 05:03:17 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:17.137375) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 1 05:03:17 localhost nova_compute[274651]: 2026-02-01 10:03:17.463 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:03:17 localhost nova_compute[274651]: 2026-02-01 10:03:17.464 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:17 localhost nova_compute[274651]: 2026-02-01 10:03:17.465 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:17 localhost nova_compute[274651]: 2026-02-01 10:03:17.465 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:17 localhost nova_compute[274651]: 2026-02-01 10:03:17.495 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:17 localhost nova_compute[274651]: 2026-02-01 10:03:17.495 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
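The JOB 40 compaction summary a few records above is internally consistent: its amplification and throughput figures follow directly from the logged byte counts and the 102,827 µs compaction time, as the short recomputation below shows (all inputs are values copied from the EVENT_LOG_v1 records):

# Recompute the JOB 40 figures from the logged byte counts.
l0_in    = 1_603_814    # table #68, the freshly flushed L0 input
total_in = 24_150_901   # "input_data_size": L0 + L6 inputs combined
out      = 22_384_835   # table #69, the single L6 output
t_us     = 102_827      # "compaction_time_micros"

print(out / l0_in)               # 13.96 -> "write-amplify(14.0)"
print((total_in + out) / l0_in)  # 29.02 -> "read-write-amplify(29.0)"
print(total_in / t_us)           # 234.9 -> "MB/sec: 234.9 rd" (bytes/us == MB/s)
print(out / t_us)                # 217.7 -> "217.7 wr"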
Feb 1 05:03:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:19 localhost podman[328266]: 2026-02-01 10:03:19.726440224 +0000 UTC m=+0.085954845 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:03:19 localhost podman[328266]: 2026-02-01 10:03:19.768546139 +0000 UTC m=+0.128060730 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, 
tcib_managed=true) Feb 1 05:03:19 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 05:03:20 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:03:20 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:20 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:20 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:22 localhost nova_compute[274651]: 2026-02-01 10:03:22.497 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:22 localhost nova_compute[274651]: 2026-02-01 10:03:22.538 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:22 localhost nova_compute[274651]: 2026-02-01 10:03:22.539 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:22 localhost nova_compute[274651]: 2026-02-01 10:03:22.539 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:22 localhost nova_compute[274651]: 2026-02-01 10:03:22.540 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:22 localhost nova_compute[274651]: 2026-02-01 10:03:22.540 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:03:23 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:03:23 localhost ceph-mon[286721]: 
log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:03:23 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:03:23 localhost podman[236886]: time="2026-02-01T10:03:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:03:23 localhost podman[236886]: @ - - [01/Feb/2026:10:03:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 05:03:24 localhost podman[236886]: @ - - [01/Feb/2026:10:03:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18831 "" "Go-http-client/1.1" Feb 1 05:03:24 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:03:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:03:24 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:03:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:03:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 05:03:25 localhost podman[328292]: 2026-02-01 10:03:25.727795968 +0000 UTC m=+0.086076887 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter) Feb 1 05:03:25 localhost podman[328292]: 2026-02-01 10:03:25.769406038 +0000 UTC m=+0.127686987 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, architecture=x86_64, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git) Feb 1 05:03:25 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 05:03:26 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:03:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:26 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:26 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:27 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:27 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:27 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:27 localhost nova_compute[274651]: 2026-02-01 10:03:27.542 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:27 localhost nova_compute[274651]: 2026-02-01 10:03:27.544 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:27 localhost nova_compute[274651]: 2026-02-01 10:03:27.544 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:27 localhost nova_compute[274651]: 2026-02-01 10:03:27.544 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:27 localhost nova_compute[274651]: 2026-02-01 10:03:27.569 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:27 localhost nova_compute[274651]: 2026-02-01 10:03:27.569 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:03:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:03:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:03:29 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:03:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e281 do_prune osdmap full prune enabled Feb 1 05:03:30 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e282 e282: 6 total, 6 up, 6 in Feb 1 05:03:30 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e282: 6 total, 6 up, 6 in Feb 1 05:03:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:03:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:03:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:03:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:03:31 localhost openstack_network_exporter[239441]: ERROR 10:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:03:31 localhost openstack_network_exporter[239441]: Feb 1 05:03:31 localhost openstack_network_exporter[239441]: ERROR 10:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:03:31 localhost openstack_network_exporter[239441]: Feb 1 05:03:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:32 localhost nova_compute[274651]: 2026-02-01 10:03:32.570 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:32 localhost nova_compute[274651]: 2026-02-01 10:03:32.572 274655 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:32 localhost nova_compute[274651]: 2026-02-01 10:03:32.572 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:32 localhost nova_compute[274651]: 2026-02-01 10:03:32.573 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:32 localhost nova_compute[274651]: 2026-02-01 10:03:32.595 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:32 localhost nova_compute[274651]: 2026-02-01 10:03:32.596 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:03:33 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:33 localhost ceph-mon[286721]: from='mgr.34541 ' 
entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:03:34 localhost podman[328313]: 2026-02-01 10:03:34.723267224 +0000 UTC m=+0.075775452 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 05:03:34 localhost podman[328313]: 2026-02-01 10:03:34.737208313 +0000 UTC m=+0.089716551 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 1 05:03:34 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 05:03:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e282 do_prune osdmap full prune enabled Feb 1 05:03:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e283 e283: 6 total, 6 up, 6 in Feb 1 05:03:35 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e283: 6 total, 6 up, 6 in Feb 1 05:03:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 1 05:03:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:03:36 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:03:36 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
Feb 1 05:03:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:36 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e283 do_prune osdmap full prune enabled Feb 1 05:03:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e284 e284: 6 total, 6 up, 6 in Feb 1 05:03:37 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e284: 6 total, 6 up, 6 in Feb 1 05:03:37 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:03:37 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:03:37 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:03:37 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:03:37 localhost sshd[328330]: main: sshd: ssh-rsa algorithm is disabled Feb 1 05:03:37 localhost nova_compute[274651]: 2026-02-01 10:03:37.631 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:37 localhost nova_compute[274651]: 2026-02-01 10:03:37.634 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:37 localhost nova_compute[274651]: 2026-02-01 10:03:37.634 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:37 localhost nova_compute[274651]: 2026-02-01 10:03:37.634 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:37 localhost nova_compute[274651]: 2026-02-01 10:03:37.635 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:37 localhost nova_compute[274651]: 2026-02-01 10:03:37.639 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:37 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:39 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:40 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:03:40 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:40 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:40 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:41 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:41 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:03:41.726 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:03:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:03:41.726 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:03:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:03:41.727 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:03:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e284 do_prune osdmap full prune enabled Feb 1 05:03:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e285 e285: 6 total, 6 up, 6 in Feb 1 05:03:42 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e285: 6 total, 6 up, 6 in Feb 1 05:03:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 05:03:42 localhost nova_compute[274651]: 2026-02-01 10:03:42.637 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:42 localhost nova_compute[274651]: 2026-02-01 10:03:42.639 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:42 localhost nova_compute[274651]: 2026-02-01 10:03:42.639 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:42 localhost nova_compute[274651]: 2026-02-01 10:03:42.639 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:42 localhost nova_compute[274651]: 2026-02-01 10:03:42.658 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:42 localhost nova_compute[274651]: 2026-02-01 10:03:42.659 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:42 localhost systemd[1]: tmp-crun.VlpHCe.mount: Deactivated successfully. 
Feb 1 05:03:42 localhost podman[328332]: 2026-02-01 10:03:42.732158986 +0000 UTC m=+0.086894884 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 05:03:42 localhost podman[328332]: 2026-02-01 10:03:42.76610923 +0000 UTC m=+0.120845128 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 05:03:42 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 05:03:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 1 05:03:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:03:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:03:42 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
Feb 1 05:03:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e285 do_prune osdmap full prune enabled Feb 1 05:03:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e286 e286: 6 total, 6 up, 6 in Feb 1 05:03:43 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e286: 6 total, 6 up, 6 in Feb 1 05:03:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:03:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:03:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:03:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:03:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 05:03:44 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:44 localhost podman[328356]: 2026-02-01 10:03:44.721234034 +0000 UTC m=+0.079303429 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 1 05:03:44 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:44 localhost podman[328356]: 2026-02-01 10:03:44.756602383 +0000 UTC m=+0.114671808 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 1 05:03:44 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
Feb 1 05:03:45 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:03:45 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:45 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:45 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:46 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:46 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": 
"json"} : dispatch Feb 1 05:03:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 05:03:47 localhost podman[328392]: 2026-02-01 10:03:47.243686499 +0000 UTC m=+0.093250129 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 05:03:47 localhost podman[328392]: 2026-02-01 10:03:47.254253854 +0000 UTC m=+0.103817494 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', 
'--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 05:03:47 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 05:03:47 localhost nova_compute[274651]: 2026-02-01 10:03:47.660 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:47 localhost nova_compute[274651]: 2026-02-01 10:03:47.662 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:47 localhost nova_compute[274651]: 2026-02-01 10:03:47.662 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:47 localhost nova_compute[274651]: 2026-02-01 10:03:47.663 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:47 localhost nova_compute[274651]: 2026-02-01 10:03:47.688 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:47 localhost nova_compute[274651]: 2026-02-01 10:03:47.689 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 05:03:47 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:03:48 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 05:03:48 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:03:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0) Feb 1 05:03:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:03:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:03:48 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict 
{filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122],prefix=session evict} (starting...) Feb 1 05:03:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:48 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:03:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:03:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:03:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:03:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 1 05:03:49 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:49 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:49 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:03:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:50 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:50 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e286 do_prune osdmap full prune enabled Feb 1 05:03:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e287 e287: 6 total, 6 up, 6 in Feb 1 05:03:50 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e287: 6 total, 6 up, 6 in Feb 1 05:03:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 05:03:50 localhost podman[328485]: 2026-02-01 10:03:50.727229494 +0000 UTC m=+0.087739189 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:03:50 localhost podman[328485]: 2026-02-01 10:03:50.76939435 +0000 UTC m=+0.129904025 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller) Feb 1 05:03:50 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 05:03:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:51 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 05:03:51 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:03:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e287 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e287 do_prune osdmap full prune enabled Feb 1 05:03:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e288 e288: 6 total, 6 up, 6 in Feb 1 05:03:52 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e288: 6 total, 6 up, 6 in Feb 1 05:03:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:03:52 localhost nova_compute[274651]: 2026-02-01 10:03:52.690 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:52 localhost nova_compute[274651]: 2026-02-01 10:03:52.692 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:52 localhost nova_compute[274651]: 2026-02-01 10:03:52.693 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:52 localhost nova_compute[274651]: 2026-02-01 10:03:52.693 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:52 localhost nova_compute[274651]: 2026-02-01 10:03:52.714 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:52 localhost nova_compute[274651]: 2026-02-01 10:03:52.715 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:52 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:52 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' 
entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0. Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.840208) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232840303, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 975, "num_deletes": 253, "total_data_size": 747996, "memory_usage": 766752, "flush_reason": "Manual Compaction"} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232849380, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 726391, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38746, "largest_seqno": 39720, "table_properties": {"data_size": 721755, "index_size": 2107, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12343, "raw_average_key_size": 21, "raw_value_size": 711657, "raw_average_value_size": 1235, "num_data_blocks": 92, "num_entries": 576, "num_filter_entries": 576, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940197, "oldest_key_time": 1769940197, "file_creation_time": 1769940232, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 9237 microseconds, and 4289 cpu microseconds. Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.849451) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 726391 bytes OK Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.849489) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.851615) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.851641) EVENT_LOG_v1 {"time_micros": 1769940232851633, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.851674) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 742994, prev total WAL file size 742994, number of live WAL files 2. Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.852426) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end) Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(709KB)], [69(21MB)] Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232852508, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 23111226, "oldest_snapshot_seqno": -1} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14163 keys, 21263553 bytes, temperature: kUnknown Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232948543, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 21263553, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21183001, "index_size": 44035, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35461, "raw_key_size": 380468, "raw_average_key_size": 26, "raw_value_size": 20942472, "raw_average_value_size": 1478, "num_data_blocks": 1634, "num_entries": 14163, "num_filter_entries": 14163, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769940232, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.950372) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 21263553 bytes Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.952548) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 240.5 rd, 221.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 21.3 +0.0 blob) out(20.3 +0.0 blob), read-write-amplify(61.1) write-amplify(29.3) OK, records in: 14691, records dropped: 528 output_compression: NoCompression Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.952601) EVENT_LOG_v1 {"time_micros": 1769940232952579, "job": 42, "event": "compaction_finished", "compaction_time_micros": 96113, "compaction_time_cpu_micros": 54149, "output_level": 6, "num_output_files": 1, "total_output_size": 21263553, "num_input_records": 14691, "num_output_records": 14163, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232952968, "job": 42, "event": "table_file_deletion", "file_number": 71} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232956577, "job": 42, "event": "table_file_deletion", "file_number": 69} Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.852314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.956753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.956768) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.956772) [db/db_impl/db_impl_compaction_flush.cc:1903] 
[default] Manual compaction starting Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.956775) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:03:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:03:52.956782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:03:53 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:53 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:53 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:53 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:53 localhost podman[236886]: time="2026-02-01T10:03:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:03:53 localhost podman[236886]: @ - - [01/Feb/2026:10:03:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 05:03:54 localhost podman[236886]: @ - - [01/Feb/2026:10:03:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18846 "" "Go-http-client/1.1" Feb 1 05:03:54 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:54 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:54 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth 
get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:03:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:55 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:55 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 05:03:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 1 05:03:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.496 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 05:03:56 localhost 
nova_compute[274651]: 2026-02-01 10:03:56.497 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.497 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.497 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 05:03:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:56 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:03:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 05:03:56 localhost podman[328510]: 2026-02-01 10:03:56.725485002 +0000 UTC m=+0.084332674 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, version=9.7, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal) Feb 1 05:03:56 localhost podman[328510]: 2026-02-01 10:03:56.743441834 +0000 UTC m=+0.102289546 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 1 05:03:56 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.956 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 05:03:56 localhost nova_compute[274651]: 2026-02-01 10:03:56.976 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 05:03:56 localhost 
nova_compute[274651]: 2026-02-01 10:03:56.977 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 05:03:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:03:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e288 do_prune osdmap full prune enabled Feb 1 05:03:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e289 e289: 6 total, 6 up, 6 in Feb 1 05:03:57 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e289: 6 total, 6 up, 6 in Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:03:57 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:57 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:03:57 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:03:57 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.716 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.745 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.746 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.746 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.747 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:03:57 localhost nova_compute[274651]: 2026-02-01 10:03:57.748 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:03:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0) Feb 1 05:03:57 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:03:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:03:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e289 do_prune osdmap full prune enabled Feb 1 05:03:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e290 e290: 6 total, 6 up, 6 in Feb 1 05:03:58 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e290: 6 total, 6 up, 6 in Feb 1 05:03:58 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659],prefix=session evict} (starting...) Feb 1 05:03:58 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:03:58 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:03:58 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:03:58 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:03:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:03:58 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:03:59 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:03:59 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:59 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:03:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : 
dispatch Feb 1 05:03:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:03:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e290 do_prune osdmap full prune enabled Feb 1 05:04:00 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e291 e291: 6 total, 6 up, 6 in Feb 1 05:04:00 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e291: 6 total, 6 up, 6 in Feb 1 05:04:00 localhost nova_compute[274651]: 2026-02-01 10:04:00.266 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:04:01 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:04:01 localhost nova_compute[274651]: 2026-02-01 10:04:01.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:01 localhost nova_compute[274651]: 2026-02-01 10:04:01.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 05:04:01 localhost openstack_network_exporter[239441]: ERROR 10:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:04:01 localhost openstack_network_exporter[239441]: Feb 1 05:04:01 localhost openstack_network_exporter[239441]: ERROR 10:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:04:01 localhost openstack_network_exporter[239441]: Feb 1 05:04:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.749 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.751 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.751 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.752 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.777 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:02 localhost nova_compute[274651]: 2026-02-01 10:04:02.778 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:04:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:02 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:04:02 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
Feb 1 05:04:03 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:03 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:04:03 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:03 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.288 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.288 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.288 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.289 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.289 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:04:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", 
"allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:04 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:04 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:04:04 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3350087035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.756 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.832 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:04:04 localhost nova_compute[274651]: 2026-02-01 10:04:04.833 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.084 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.086 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11150MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.086 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.087 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:04:05 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:04:05 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", 
"allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:05 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:05 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.148 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.148 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.149 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.185 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:04:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:04:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:04:05 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1664883928' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.692 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.698 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.721 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.724 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 05:04:05 localhost nova_compute[274651]: 2026-02-01 10:04:05.725 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:04:05 localhost systemd[1]: tmp-crun.qMl2h1.mount: Deactivated successfully. 
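
The resource-tracker pass above serializes on the "compute_resources" lock and then shells out to ceph df through oslo.concurrency (the log shows the command returning 0 in 0.507s while the lock is held 0.638s). A minimal sketch of those two primitives, assuming oslo.concurrency is installed and that the same 'openstack' keyring and /etc/ceph/ceph.conf exist on the host; the lock name and command line are copied from the log:

    import json

    from oslo_concurrency import lockutils, processutils

    # Same named in-process lock the tracker acquires above.
    with lockutils.lock("compute_resources"):
        # Same command the log shows processutils running as a subprocess.
        out, err = processutils.execute(
            "ceph", "df", "--format=json",
            "--id", "openstack", "--conf", "/etc/ceph/ceph.conf")
        free_gb = json.loads(out)["stats"]["total_avail_bytes"] / 1024 ** 3

Holding the lock across the whole update is what keeps the usage report and the placement sync from interleaving with a concurrent resource claim.
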
Feb 1 05:04:05 localhost podman[328572]: 2026-02-01 10:04:05.73330186 +0000 UTC m=+0.091412943 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 1 05:04:05 localhost podman[328572]: 2026-02-01 10:04:05.746297189 +0000 UTC m=+0.104408232 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Feb 1 05:04:05 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 05:04:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:06 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:06 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:06.433 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:04:06 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:06.435 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 05:04:06 localhost nova_compute[274651]: 2026-02-01 10:04:06.473 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:06 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:04:06 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:06 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:06 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:06 localhost nova_compute[274651]: 2026-02-01 10:04:06.726 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e291 do_prune osdmap full prune enabled Feb 1 05:04:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e292 e292: 6 total, 6 up, 6 in Feb 1 05:04:07 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e292: 6 total, 6 up, 6 in Feb 1 05:04:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0) Feb 1 05:04:07 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:04:07 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:04:07 localhost nova_compute[274651]: 2026-02-01 10:04:07.819 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:07 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc],prefix=session evict} (starting...) 
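
The mgr audit entries above are the cephx access-allow flow for a CephFS share: an auth get-or-create scoped to one subvolume path and one RADOS namespace, later mirrored by auth rm when access is revoked. A sketch of the equivalent one-shot call through the stock ceph CLI from Python; the entity, path, and namespace are placeholders copied from the log entries above, not fixed values:

    import subprocess

    # Placeholders taken from the log; real deployments generate these per share.
    entity = "client.alice"
    path = "/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37"
    ns = "fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"

    key = subprocess.run(
        ["ceph", "auth", "get-or-create", entity,
         "mds", "allow r path=" + path,
         "osd", "allow r pool=manila_data namespace=" + ns,
         "mon", "allow r",
         "--format", "json"],
        check=True, capture_output=True, text=True).stdout
    # Revocation is the mirror image, as in the 'auth rm' entries further down:
    # subprocess.run(["ceph", "auth", "rm", entity], check=True)
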
Feb 1 05:04:08 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:04:08 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:04:08 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:04:08 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:04:09 localhost nova_compute[274651]: 2026-02-01 10:04:09.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:04:09 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:09 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:04:09 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
Feb 1 05:04:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e292 do_prune osdmap full prune enabled Feb 1 05:04:10 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:10 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:04:10 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:10 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:04:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e293 e293: 6 total, 6 up, 6 in Feb 1 05:04:10 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e293: 6 total, 6 up, 6 in Feb 1 05:04:10 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:10.436 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 05:04:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:10 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:10 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:11 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:04:11 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:11 
localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:11 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:11 localhost nova_compute[274651]: 2026-02-01 10:04:11.265 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:11 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:11.320 259320 INFO neutron.agent.linux.ip_lib [None req-d1a5b758-0939-4274-8d11-5c1c9633d166 - - - - - -] Device tap7ca81b74-4c cannot be used as it has no MAC address#033[00m Feb 1 05:04:11 localhost nova_compute[274651]: 2026-02-01 10:04:11.351 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:11 localhost kernel: device tap7ca81b74-4c entered promiscuous mode Feb 1 05:04:11 localhost ovn_controller[152492]: 2026-02-01T10:04:11Z|00496|binding|INFO|Claiming lport 7ca81b74-4c9d-4558-96dc-af9238952a95 for this chassis. Feb 1 05:04:11 localhost ovn_controller[152492]: 2026-02-01T10:04:11Z|00497|binding|INFO|7ca81b74-4c9d-4558-96dc-af9238952a95: Claiming unknown Feb 1 05:04:11 localhost nova_compute[274651]: 2026-02-01 10:04:11.363 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:11 localhost NetworkManager[5964]: [1769940251.3637] manager: (tap7ca81b74-4c): new Generic device (/org/freedesktop/NetworkManager/Devices/81) Feb 1 05:04:11 localhost systemd-udevd[328604]: Network interface NamePolicy= disabled on kernel command line. 
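
Around the tap-device creation above, ovn-controller claims the logical port for this chassis, marks it ovn-installed in OVS, and sets it up in the Southbound DB. One way to verify that claim by hand is to query the same Port_Binding row; a sketch using ovn-sbctl's generic find command via subprocess, with the logical port UUID taken from the 'Claiming lport' entries (assumes ovn-sbctl can reach the Southbound DB from where it runs):

    import subprocess

    lport = "7ca81b74-4c9d-4558-96dc-af9238952a95"
    # Shows which chassis (if any) currently binds this logical port.
    subprocess.run(
        ["ovn-sbctl", "--columns=logical_port,up,chassis",
         "find", "Port_Binding", "logical_port=" + lport],
        check=True)
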
Feb 1 05:04:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:11.375 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-022c2b60-1612-4d98-8994-449e57a8b11f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-022c2b60-1612-4d98-8994-449e57a8b11f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5569a72dd5a049758cc5843d905a3d74', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04e03501-b853-4a62-b799-e359e1d45707, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ca81b74-4c9d-4558-96dc-af9238952a95) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:04:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:11.377 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca81b74-4c9d-4558-96dc-af9238952a95 in datapath 022c2b60-1612-4d98-8994-449e57a8b11f bound to our chassis#033[00m Feb 1 05:04:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:11.378 158365 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 022c2b60-1612-4d98-8994-449e57a8b11f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 1 05:04:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:11.380 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[e94ce6d7-eeb3-4ee0-9d9a-9742f59d21d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost ovn_controller[152492]: 2026-02-01T10:04:11Z|00498|binding|INFO|Setting lport 7ca81b74-4c9d-4558-96dc-af9238952a95 ovn-installed in OVS Feb 1 05:04:11 localhost ovn_controller[152492]: 2026-02-01T10:04:11Z|00499|binding|INFO|Setting lport 7ca81b74-4c9d-4558-96dc-af9238952a95 up in Southbound Feb 1 05:04:11 localhost nova_compute[274651]: 2026-02-01 10:04:11.398 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost 
journal[217584]: ethtool ioctl error on tap7ca81b74-4c: No such device Feb 1 05:04:11 localhost nova_compute[274651]: 2026-02-01 10:04:11.471 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:11 localhost nova_compute[274651]: 2026-02-01 10:04:11.488 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:12 localhost podman[328675]: Feb 1 05:04:12 localhost podman[328675]: 2026-02-01 10:04:12.380846001 +0000 UTC m=+0.087444229 container create a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 1 05:04:12 localhost systemd[1]: Started libpod-conmon-a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf.scope. Feb 1 05:04:12 localhost podman[328675]: 2026-02-01 10:04:12.338143849 +0000 UTC m=+0.044742137 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 05:04:12 localhost systemd[1]: Started libcrun container. Feb 1 05:04:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a452331cc169b0bec8d9bd535ee7a3849117be051d994d3d4fab631a27d26ae1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 05:04:12 localhost podman[328675]: 2026-02-01 10:04:12.45398425 +0000 UTC m=+0.160582448 container init a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 1 05:04:12 localhost podman[328675]: 2026-02-01 10:04:12.463656947 +0000 UTC m=+0.170255155 container start a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:04:12 localhost dnsmasq[328693]: started, version 2.85 cachesize 150 Feb 1 05:04:12 localhost dnsmasq[328693]: DNS service limited to local subnets Feb 1 05:04:12 localhost dnsmasq[328693]: compile time options: IPv6 GNU-getopt 
DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 05:04:12 localhost dnsmasq[328693]: warning: no upstream servers configured Feb 1 05:04:12 localhost dnsmasq-dhcp[328693]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 05:04:12 localhost dnsmasq[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/addn_hosts - 0 addresses Feb 1 05:04:12 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/host Feb 1 05:04:12 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/opts Feb 1 05:04:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:12 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:12 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:12.610 259320 INFO neutron.agent.dhcp.agent [None req-cbe0c806-1ebd-49da-9a0f-6136b624254b - - - - - -] DHCP configuration for ports {'ac12b8f6-c442-49a2-bf00-8c5fb3c58e4d'} is completed#033[00m Feb 1 05:04:12 localhost nova_compute[274651]: 2026-02-01 10:04:12.853 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:13 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:04:13 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:13 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": 
"json"} : dispatch Feb 1 05:04:13 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:13 localhost nova_compute[274651]: 2026-02-01 10:04:13.259 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 05:04:13 localhost podman[328694]: 2026-02-01 10:04:13.453107232 +0000 UTC m=+0.059666296 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 1 05:04:13 localhost podman[328694]: 2026-02-01 10:04:13.457881338 +0000 UTC m=+0.064440392 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 1 05:04:13 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 05:04:13 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:13.636 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:04:13Z, description=, device_id=c1277150-2084-4fe9-b7f9-00741bcb7f54, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1984ba8f-353b-45ce-a031-b4d7f1971ecb, ip_allocation=immediate, mac_address=fa:16:3e:21:24:52, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:04:09Z, description=, dns_domain=, id=022c2b60-1612-4d98-8994-449e57a8b11f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-587625830-network, port_security_enabled=True, project_id=5569a72dd5a049758cc5843d905a3d74, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15567, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3721, status=ACTIVE, subnets=['c7e973c3-fcb8-4e3a-b6e5-e40ff54076e4'], tags=[], tenant_id=5569a72dd5a049758cc5843d905a3d74, updated_at=2026-02-01T10:04:10Z, vlan_transparent=None, network_id=022c2b60-1612-4d98-8994-449e57a8b11f, port_security_enabled=False, project_id=5569a72dd5a049758cc5843d905a3d74, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3728, status=DOWN, tags=[], tenant_id=5569a72dd5a049758cc5843d905a3d74, updated_at=2026-02-01T10:04:13Z on network 022c2b60-1612-4d98-8994-449e57a8b11f#033[00m Feb 1 05:04:14 localhost systemd[1]: tmp-crun.qeQ7Bw.mount: Deactivated successfully. 
Feb 1 05:04:14 localhost dnsmasq[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/addn_hosts - 1 addresses Feb 1 05:04:14 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/host Feb 1 05:04:14 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/opts Feb 1 05:04:14 localhost podman[328734]: 2026-02-01 10:04:14.427577796 +0000 UTC m=+0.128161262 container kill a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 1 05:04:14 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0) Feb 1 05:04:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:04:14 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:04:14 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:04:14 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch Feb 1 05:04:14 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch Feb 1 05:04:14 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished Feb 1 05:04:14 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934],prefix=session evict} (starting...) 
Feb 1 05:04:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:14.664 259320 INFO neutron.agent.dhcp.agent [None req-7d2a3133-610d-4d84-9bc4-9daa8f005f43 - - - - - -] DHCP configuration for ports {'1984ba8f-353b-45ce-a031-b4d7f1971ecb'} is completed#033[00m Feb 1 05:04:14 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:14.937 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:04:13Z, description=, device_id=c1277150-2084-4fe9-b7f9-00741bcb7f54, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1984ba8f-353b-45ce-a031-b4d7f1971ecb, ip_allocation=immediate, mac_address=fa:16:3e:21:24:52, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:04:09Z, description=, dns_domain=, id=022c2b60-1612-4d98-8994-449e57a8b11f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPITest-587625830-network, port_security_enabled=True, project_id=5569a72dd5a049758cc5843d905a3d74, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15567, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3721, status=ACTIVE, subnets=['c7e973c3-fcb8-4e3a-b6e5-e40ff54076e4'], tags=[], tenant_id=5569a72dd5a049758cc5843d905a3d74, updated_at=2026-02-01T10:04:10Z, vlan_transparent=None, network_id=022c2b60-1612-4d98-8994-449e57a8b11f, port_security_enabled=False, project_id=5569a72dd5a049758cc5843d905a3d74, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3728, status=DOWN, tags=[], tenant_id=5569a72dd5a049758cc5843d905a3d74, updated_at=2026-02-01T10:04:13Z on network 022c2b60-1612-4d98-8994-449e57a8b11f#033[00m Feb 1 05:04:15 localhost dnsmasq[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/addn_hosts - 1 addresses Feb 1 05:04:15 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/host Feb 1 05:04:15 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/opts Feb 1 05:04:15 localhost podman[328771]: 2026-02-01 10:04:15.156571501 +0000 UTC m=+0.062884735 container kill a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 1 05:04:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 05:04:15 localhost systemd[1]: tmp-crun.JMJLyd.mount: Deactivated successfully. 
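
Each 'container kill' event above is the DHCP agent signalling that dnsmasq instance rather than stopping it: a HUP makes dnsmasq re-read the host/opts/addn_hosts files (the 'read .../addn_hosts' lines that follow each kill), while the SIGTERM further down ends it. A sketch of both signals, assuming podman and the container name from the log:

    import subprocess

    name = "neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f"
    # Reload allocations: dnsmasq re-reads its config files on SIGHUP.
    subprocess.run(["podman", "kill", "--signal", "HUP", name], check=True)
    # Teardown sends TERM instead ('exiting on receipt of SIGTERM' below):
    # subprocess.run(["podman", "kill", "--signal", "TERM", name], check=True)
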
Feb 1 05:04:15 localhost podman[328786]: 2026-02-01 10:04:15.277424107 +0000 UTC m=+0.096155827 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Feb 1 05:04:15 localhost podman[328786]: 2026-02-01 10:04:15.306641605 +0000 UTC m=+0.125373345 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 1 05:04:15 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 05:04:15 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:15.416 259320 INFO neutron.agent.dhcp.agent [None req-a16c6cbf-103c-4dd1-8083-28d5e63930a3 - - - - - -] DHCP configuration for ports {'1984ba8f-353b-45ce-a031-b4d7f1971ecb'} is completed#033[00m Feb 1 05:04:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 1 05:04:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:04:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:04:15 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:04:16 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:04:16 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:04:16 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:04:16 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:04:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e293 do_prune osdmap full prune enabled Feb 1 05:04:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 e294: 6 total, 6 up, 6 in Feb 1 05:04:17 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e294: 6 total, 6 up, 6 in Feb 1 05:04:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 05:04:17 localhost systemd[1]: tmp-crun.tsvPsx.mount: Deactivated successfully. 
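
After each auth rm, the MDS log shows a matching asok 'session evict' so a revoked client cannot keep using its cached key. A sketch of issuing the same admin-socket command by hand with ceph daemon; this must run on the MDS host, the daemon name and filters are copied from the log, and the filter syntax shown is the one the log itself records:

    import subprocess

    subprocess.run(
        ["ceph", "daemon", "mds.mds.np0005604212.tkdkxt",
         "session", "evict",
         # Same filters as the asok_command entry above: match by auth name
         # and by the client's mounted root.
         "auth_name=alice_bob",
         "client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37"],
        check=True)
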
Feb 1 05:04:17 localhost podman[328812]: 2026-02-01 10:04:17.736769658 +0000 UTC m=+0.098239491 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 1 05:04:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:17 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:17 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:17 localhost podman[328812]: 2026-02-01 10:04:17.775064105 +0000 UTC m=+0.136533988 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', 
'--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 05:04:17 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:04:17 localhost nova_compute[274651]: 2026-02-01 10:04:17.890 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:18 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 1 05:04:18 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:18 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:18 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:19 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 1 05:04:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:04:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:19 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:19 localhost podman[328850]: 2026-02-01 10:04:19.402268419 +0000 UTC m=+0.062627986 container kill a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 1 05:04:19 localhost dnsmasq[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/addn_hosts - 0 addresses
Feb 1 05:04:19 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/host
Feb 1 05:04:19 localhost dnsmasq-dhcp[328693]: read /var/lib/neutron/dhcp/022c2b60-1612-4d98-8994-449e57a8b11f/opts
Feb 1 05:04:19 localhost nova_compute[274651]: 2026-02-01 10:04:19.594 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:19 localhost ovn_controller[152492]: 2026-02-01T10:04:19Z|00500|binding|INFO|Releasing lport 7ca81b74-4c9d-4558-96dc-af9238952a95 from this chassis (sb_readonly=0)
Feb 1 05:04:19 localhost kernel: device tap7ca81b74-4c left promiscuous mode
Feb 1 05:04:19 localhost ovn_controller[152492]: 2026-02-01T10:04:19Z|00501|binding|INFO|Setting lport 7ca81b74-4c9d-4558-96dc-af9238952a95 down in Southbound
Feb 1 05:04:19 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:19.604 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-022c2b60-1612-4d98-8994-449e57a8b11f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-022c2b60-1612-4d98-8994-449e57a8b11f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5569a72dd5a049758cc5843d905a3d74', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04e03501-b853-4a62-b799-e359e1d45707, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ca81b74-4c9d-4558-96dc-af9238952a95) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 05:04:19 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:19.605 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca81b74-4c9d-4558-96dc-af9238952a95 in datapath 022c2b60-1612-4d98-8994-449e57a8b11f unbound from our chassis#033[00m
Feb 1 05:04:19 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:19.607 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 022c2b60-1612-4d98-8994-449e57a8b11f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 05:04:19 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:19.608 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[ceecb4da-f56a-4639-b6f2-26257efb96b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 05:04:19 localhost nova_compute[274651]: 2026-02-01 10:04:19.620 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:20 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:20 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:20 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:20 localhost ovn_controller[152492]: 2026-02-01T10:04:20Z|00502|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 05:04:20 localhost nova_compute[274651]: 2026-02-01 10:04:20.638 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:21 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 1 05:04:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:21 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 1 05:04:21 localhost podman[328889]: 2026-02-01 10:04:21.073514248 +0000 UTC m=+0.067825566 container kill a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 1 05:04:21 localhost dnsmasq[328693]: exiting on receipt of SIGTERM
Feb 1 05:04:21 localhost systemd[1]: libpod-a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf.scope: Deactivated successfully.
Feb 1 05:04:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 05:04:21 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:21 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 1 05:04:21 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:21 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 1 05:04:21 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934],prefix=session evict} (starting...)
Feb 1 05:04:21 localhost podman[328905]: 2026-02-01 10:04:21.176533296 +0000 UTC m=+0.073274905 container died a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 1 05:04:21 localhost systemd[1]: tmp-crun.s2R5Hh.mount: Deactivated successfully.
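The `auth get-or-create` payloads in the audit entries above all share one shape: an mds cap restricted to the share path, an osd cap restricted to the manila_data pool plus a per-share RADOS namespace, and read-only mon access. A minimal Python sketch of how such a mon-command payload can be assembled (the helper and its argument names are hypothetical; the field values mirror the entries above):

```python
import json


def manila_share_caps(client_id, share_path, namespace,
                      pool="manila_data", access="rw"):
    """Build an 'auth get-or-create' payload shaped like the audit log above.

    Hypothetical helper: the caps list alternates daemon name and cap string,
    exactly as in the logged commands.
    """
    return {
        "prefix": "auth get-or-create",
        "entity": "client." + client_id,
        "caps": [
            "mds", "allow {} path={}".format(access, share_path),
            "osd", "allow {} pool={} namespace={}".format(access, pool, namespace),
            "mon", "allow r",
        ],
        "format": "json",
    }


payload = manila_share_caps(
    "tempest-cephx-id-2018707573",
    "/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/"
    "c5a01dcc-4cc5-42a5-9fd3-97ead336e934",
    "fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c",
)
print(json.dumps(payload))
```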
Feb 1 05:04:21 localhost podman[328911]: 2026-02-01 10:04:21.177606399 +0000 UTC m=+0.073020477 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 05:04:21 localhost podman[328905]: 2026-02-01 10:04:21.280541304 +0000 UTC m=+0.177282873 container remove a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 05:04:21 localhost systemd[1]: libpod-conmon-a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf.scope: Deactivated successfully.
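The config_data blob logged with each health_status event is the edpm_ansible container definition. As a rough illustration only (edpm_ansible drives podman through its own Ansible modules, not this command line), the ovn_controller definition above maps onto a podman invocation along these lines:

```python
# Illustrative translation of the ovn_controller config_data dict into a
# podman command line; an approximation, not the call edpm_ansible makes.
config = {
    "image": "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified",
    "net": "host",
    "privileged": True,
    "restart": "always",
    "user": "root",
    "volumes": ["/lib/modules:/lib/modules:ro", "/run:/run"],  # list truncated here
}

argv = ["podman", "run", "--name", "ovn_controller",
        "--net", config["net"], "--user", config["user"],
        "--restart", config["restart"]]
if config["privileged"]:
    argv.append("--privileged")
for vol in config["volumes"]:
    argv += ["--volume", vol]
argv.append(config["image"])
print(" ".join(argv))
```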
Feb 1 05:04:21 localhost podman[328911]: 2026-02-01 10:04:21.308871735 +0000 UTC m=+0.204285863 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 05:04:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:21.312 259320 INFO neutron.agent.dhcp.agent [None req-38739e5d-10c6-4768-b143-e9a044fe2be1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 05:04:21 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:04:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:04:21.427 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 05:04:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:04:22 localhost systemd[1]: var-lib-containers-storage-overlay-a452331cc169b0bec8d9bd535ee7a3849117be051d994d3d4fab631a27d26ae1-merged.mount: Deactivated successfully.
Feb 1 05:04:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0511f01d614f34f9c6b1384a2fbed577c9da88b8e41d331dd3ce597b040dcbf-userdata-shm.mount: Deactivated successfully.
Feb 1 05:04:22 localhost systemd[1]: run-netns-qdhcp\x2d022c2b60\x2d1612\x2d4d98\x2d8994\x2d449e57a8b11f.mount: Deactivated successfully.
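The mount unit names above, such as run-netns-qdhcp\x2d022c2b60\x2d1612\x2d4d98\x2d8994\x2d449e57a8b11f.mount, are systemd's escaped form of the underlying paths: "/" separators become "-", and characters like "-" inside a path component are encoded as \x2d. A simplified sketch of that escaping (systemd-escape(1) implements the full rules; this only covers what appears in the log):

```python
def systemd_escape_path(path: str) -> str:
    # "/" separators join components with "-"; within a component, anything
    # outside alphanumerics, "_" and "." is hex-escaped as \xNN.
    components = [c for c in path.split("/") if c]
    escaped = []
    for comp in components:
        escaped.append("".join(
            ch if ch.isalnum() or ch in "_." else "\\x%02x" % ord(ch)
            for ch in comp))
    return "-".join(escaped)

print(systemd_escape_path(
    "/run/netns/qdhcp-022c2b60-1612-4d98-8994-449e57a8b11f") + ".mount")
# -> run-netns-qdhcp\x2d022c2b60\x2d1612\x2d4d98\x2d8994\x2d449e57a8b11f.mount
```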
Feb 1 05:04:22 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e62: np0005604215.uhhqtv(active, since 15m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 1 05:04:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 1 05:04:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 1 05:04:22 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 1 05:04:22 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...)
Feb 1 05:04:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:04:22 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:04:22 localhost nova_compute[274651]: 2026-02-01 10:04:22.935 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:23 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 1 05:04:23 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 1 05:04:23 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 1 05:04:23 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 1 05:04:23 localhost podman[236886]: time="2026-02-01T10:04:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:04:23 localhost podman[236886]: @ - - [01/Feb/2026:10:04:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 05:04:23 localhost podman[236886]: @ - - [01/Feb/2026:10:04:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18834 "" "Go-http-client/1.1"
Feb 1 05:04:24 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:04:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:24 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:24 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 1 05:04:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:24 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:24 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:24 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : mgrmap e63: np0005604215.uhhqtv(active, since 15m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 1 05:04:25 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:04:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:25 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:26 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 1 05:04:26 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:26 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:26 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:04:27 localhost podman[328959]: 2026-02-01 10:04:27.720038048 +0000 UTC m=+0.081501157 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, version=9.7, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 05:04:27 localhost podman[328959]: 2026-02-01 10:04:27.736449883 +0000 UTC m=+0.097912982 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1769056855, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z)
Feb 1 05:04:27 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
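Each "Started /usr/bin/podman healthcheck run <id>" line is a transient systemd unit firing the container's configured healthcheck (the 'test' key in config_data); the health_status=healthy attribute on the following podman event reflects its result. From the caller's side this amounts to checking an exit status, as in this minimal sketch:

```python
import subprocess

# Minimal sketch of what each timer-driven "podman healthcheck run <id>"
# invocation amounts to for the caller: exit status 0 means the container's
# configured test command reported healthy.
container_id = "ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb"
result = subprocess.run(["podman", "healthcheck", "run", container_id])
print("healthy" if result.returncode == 0
      else "unhealthy (rc=%d)" % result.returncode)
```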
Feb 1 05:04:27 localhost nova_compute[274651]: 2026-02-01 10:04:27.938 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:27 localhost nova_compute[274651]: 2026-02-01 10:04:27.940 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:27 localhost nova_compute[274651]: 2026-02-01 10:04:27.940 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:04:27 localhost nova_compute[274651]: 2026-02-01 10:04:27.940 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:27 localhost nova_compute[274651]: 2026-02-01 10:04:27.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:27 localhost nova_compute[274651]: 2026-02-01 10:04:27.965 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:28 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 1 05:04:28 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:28 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 1 05:04:28 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934],prefix=session evict} (starting...)
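The nova_compute/ovsdbapp DEBUG lines above trace the OVS reconnect state machine's keepalive on tcp:127.0.0.1:6640: after roughly 5000 ms of idle time it sends an inactivity probe and enters IDLE, and the next incoming data (the POLLIN) returns it to ACTIVE. A condensed sketch of just those two transitions (the real implementation lives in ovs/reconnect.py):

```python
import time

PROBE_INTERVAL_MS = 5000  # matches the ~5000 ms idle threshold in the log


class Connection:
    def __init__(self):
        self.state = "ACTIVE"
        self.last_activity = time.monotonic()

    def tick(self):
        # Called from the poll loop; fires the probe once idle long enough.
        idle_ms = (time.monotonic() - self.last_activity) * 1000
        if self.state == "ACTIVE" and idle_ms >= PROBE_INTERVAL_MS:
            print("idle %d ms, sending inactivity probe" % idle_ms)
            self.state = "IDLE"  # waiting for the probe reply

    def received(self):
        # Any inbound data (the POLLIN wakeup) counts as liveness.
        self.last_activity = time.monotonic()
        if self.state == "IDLE":
            self.state = "ACTIVE"
```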
Feb 1 05:04:29 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:29 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 1 05:04:29 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:29 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 1 05:04:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 1 05:04:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:04:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 1 05:04:29 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...)
Feb 1 05:04:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:04:29 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:04:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:04:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 1 05:04:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:04:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 1 05:04:31 localhost openstack_network_exporter[239441]: ERROR 10:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:04:31 localhost openstack_network_exporter[239441]:
Feb 1 05:04:31 localhost openstack_network_exporter[239441]: ERROR 10:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:04:31 localhost openstack_network_exporter[239441]:
Feb 1 05:04:31 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:04:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:31 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:04:32 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 1 05:04:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:32 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:32 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:04:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:32 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:32 localhost nova_compute[274651]: 2026-02-01 10:04:32.967 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:32 localhost nova_compute[274651]: 2026-02-01 10:04:32.968 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:32 localhost nova_compute[274651]: 2026-02-01 10:04:32.968 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:04:32 localhost nova_compute[274651]: 2026-02-01 10:04:32.968 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:32 localhost nova_compute[274651]: 2026-02-01 10:04:32.969 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:32 localhost nova_compute[274651]: 2026-02-01 10:04:32.972 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 1 05:04:33 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:33 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:33 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 1 05:04:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 1 05:04:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:35 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 1 05:04:35 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 1 05:04:35 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 1 05:04:35 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=tempest-cephx-id-2018707573,client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934],prefix=session evict} (starting...)
Feb 1 05:04:35 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 1 05:04:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:04:35 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 1 05:04:35 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...)
Feb 1 05:04:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:04:36 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 1 05:04:36 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 1 05:04:36 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 1 05:04:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 05:04:36 localhost podman[328980]: 2026-02-01 10:04:36.732197528 +0000 UTC m=+0.091887245 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 1 05:04:36 localhost podman[328980]: 2026-02-01 10:04:36.775358646 +0000 UTC m=+0.135048323 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 05:04:36 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 05:04:36 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:04:36 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:04:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:04:37 localhost nova_compute[274651]: 2026-02-01 10:04:37.973 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:37 localhost nova_compute[274651]: 2026-02-01 10:04:37.975 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:37 localhost nova_compute[274651]: 2026-02-01 10:04:37.975 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:04:37 localhost nova_compute[274651]: 2026-02-01 10:04:37.976 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:38 localhost nova_compute[274651]: 2026-02-01 10:04:38.012 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:38 localhost nova_compute[274651]: 2026-02-01 10:04:38.013 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:38 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 1 05:04:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:38 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:38 localhost ceph-osd[31431]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 1 05:04:39 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 1 05:04:39 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:39 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 1 05:04:39 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 1 05:04:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:41.727 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:04:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:41.727 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:04:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:04:41.728 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:04:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:04:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 1 05:04:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 1 05:04:42 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 1 05:04:42 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...)
Feb 1 05:04:43 localhost nova_compute[274651]: 2026-02-01 10:04:43.014 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:43 localhost nova_compute[274651]: 2026-02-01 10:04:43.016 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:04:43 localhost nova_compute[274651]: 2026-02-01 10:04:43.016 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:04:43 localhost nova_compute[274651]: 2026-02-01 10:04:43.016 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:43 localhost nova_compute[274651]: 2026-02-01 10:04:43.046 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:04:43 localhost nova_compute[274651]: 2026-02-01 10:04:43.046 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:04:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 1 05:04:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 1 05:04:43 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 1 05:04:43 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 1 05:04:43 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:04:43 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:04:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 05:04:43 localhost systemd[1]: tmp-crun.ZVZo58.mount: Deactivated successfully.
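The _check_child_processes lines above are oslo.concurrency's standard lock trace: the synchronized wrapper logs how long the caller waited for the lock and how long it was held. A minimal sketch of the same pattern (the function body is illustrative, not Neutron's actual code):

```python
from oslo_concurrency import lockutils


# The "inner" wrapper named in the log lines is what this decorator adds:
# it acquires the named lock, logs waited/held durations at DEBUG, and
# releases the lock when the function returns.
@lockutils.synchronized("_check_child_processes")
def check_child_processes():
    pass  # e.g. inspect monitored child processes, respawn dead ones


check_child_processes()
```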
Feb 1 05:04:43 localhost podman[328999]: 2026-02-01 10:04:43.712240365 +0000 UTC m=+0.073331996 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 05:04:43 localhost podman[328999]: 2026-02-01 10:04:43.723300835 +0000 UTC m=+0.084392486 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 05:04:43 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
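podman_exporter above publishes on host port 9882 ('ports': ['9882:9882'] with 'net': 'host'), reading the podman API socket named by CONTAINER_HOST. Assuming the conventional Prometheus /metrics endpoint path (an assumption, not shown in the log), a quick scrape from the host looks like:

```python
import urllib.request

# Fetch the first few exporter metrics lines from the host-published port.
# The /metrics path is the usual Prometheus convention and is assumed here.
with urllib.request.urlopen("http://127.0.0.1:9882/metrics", timeout=5) as resp:
    for line in resp.read().decode().splitlines()[:10]:
        print(line)
```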
Feb 1 05:04:45 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:45 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 05:04:45 localhost podman[329023]: 2026-02-01 10:04:45.715352708 +0000 UTC m=+0.081805166 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:04:45 localhost podman[329023]: 2026-02-01 10:04:45.724504799 +0000 UTC m=+0.090957267 container exec_died 
728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:04:45 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. 
Feb 1 05:04:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:04:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:46 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:46 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0. 
Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.077421) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287077501, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1406, "num_deletes": 261, "total_data_size": 1808268, "memory_usage": 1835056, "flush_reason": "Manual Compaction"} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287089479, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1775829, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39721, "largest_seqno": 41126, "table_properties": {"data_size": 1769395, "index_size": 3391, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16685, "raw_average_key_size": 21, "raw_value_size": 1755307, "raw_average_value_size": 2244, "num_data_blocks": 147, "num_entries": 782, "num_filter_entries": 782, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940233, "oldest_key_time": 1769940233, "file_creation_time": 1769940287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 12118 microseconds, and 4023 cpu microseconds. Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.089543) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1775829 bytes OK Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.089580) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.091751) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.091774) EVENT_LOG_v1 {"time_micros": 1769940287091767, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.091805) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1801285, prev total WAL file size 1801609, number of live WAL files 2. Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.092626) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323634' seq:72057594037927935, type:22 .. '6C6F676D0034353136' seq:0, type:0; will stop at (end) Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(1734KB)], [72(20MB)] Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287092670, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 23039382, "oldest_snapshot_seqno": -1} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14400 keys, 22821350 bytes, temperature: kUnknown Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287200801, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 22821350, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22737501, "index_size": 46756, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36037, "raw_key_size": 387390, "raw_average_key_size": 26, "raw_value_size": 22491199, "raw_average_value_size": 1561, "num_data_blocks": 1741, "num_entries": 14400, "num_filter_entries": 14400, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769940287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.201187) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 22821350 bytes Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.202967) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.8 rd, 210.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 20.3 +0.0 blob) out(21.8 +0.0 blob), read-write-amplify(25.8) write-amplify(12.9) OK, records in: 14945, records dropped: 545 output_compression: NoCompression Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.203003) EVENT_LOG_v1 {"time_micros": 1769940287202978, "job": 44, "event": "compaction_finished", "compaction_time_micros": 108247, "compaction_time_cpu_micros": 57440, "output_level": 6, "num_output_files": 1, "total_output_size": 22821350, "num_input_records": 14945, "num_output_records": 14400, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287203309, "job": 44, "event": "table_file_deletion", "file_number": 74} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287205333, "job": 44, "event": "table_file_deletion", "file_number": 72} Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.092503) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.205383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.205390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.205394) [db/db_impl/db_impl_compaction_flush.cc:1903] 
[default] Manual compaction starting Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.205397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:04:47 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:04:47.205400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:04:48 localhost nova_compute[274651]: 2026-02-01 10:04:48.047 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:48 localhost nova_compute[274651]: 2026-02-01 10:04:48.049 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:48 localhost nova_compute[274651]: 2026-02-01 10:04:48.050 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:04:48 localhost nova_compute[274651]: 2026-02-01 10:04:48.050 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:48 localhost nova_compute[274651]: 2026-02-01 10:04:48.051 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:48 localhost nova_compute[274651]: 2026-02-01 10:04:48.053 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. 
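[editor's note] The RocksDB flush/compaction pass above (jobs 43 and 44 on the ceph-mon store) reports its measurements twice: once as prose and once as EVENT_LOG_v1 entries whose payload is a single JSON object (flush_started, table_file_creation, flush_finished, compaction_started, compaction_finished, table_file_deletion). Assuming one log entry per line, a small sketch to recover the structured form; the helper names are illustrative, not part of any Ceph or RocksDB API:

import json
import re

# RocksDB emits structured events inline, e.g.
#   rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287077501, "job": 43,
#            "event": "flush_started", ...}
EVENT_RE = re.compile(r"EVENT_LOG_v1 (\{.*\})")

def rocksdb_events(lines):
    """Yield the decoded JSON payload of every EVENT_LOG_v1 entry."""
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            yield json.loads(m.group(1))

# Example: summarize finished compactions (fields as seen in the log above).
# for ev in rocksdb_events(log_lines):
#     if ev.get("event") == "compaction_finished":
#         print(ev["job"], ev["total_output_size"], ev["compaction_time_micros"])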
Feb 1 05:04:48 localhost podman[329058]: 2026-02-01 10:04:48.43326651 +0000 UTC m=+0.070844459 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 05:04:48 localhost podman[329058]: 2026-02-01 10:04:48.446335552 +0000 UTC m=+0.083913501 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 1 05:04:48 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
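[editor's note] The ovsdbapp probe cycle recurring through the nova_compute entries (idle ~5000 ms, "sending inactivity probe", entering IDLE, then entering ACTIVE once a reply arrives) is routine keepalive traffic against the local ovsdb-server on tcp:127.0.0.1:6640, not an error; a probe that went unanswered would typically be followed by a reconnect instead of "entering ACTIVE". A sketch for pulling the reported idle intervals out of a capture, to confirm they cluster near the ~5 s probe interval:

import re

# Matches lines like:
#   tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe
IDLE_RE = re.compile(r"tcp:[\d.]+:\d+: idle (\d+) ms, sending inactivity probe")

def probe_intervals(lines):
    """Yield each reported idle interval (ms) before an inactivity probe."""
    for line in lines:
        m = IDLE_RE.search(line)
        if m:
            yield int(m.group(1))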
Feb 1 05:04:48 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 1 05:04:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:48 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:04:48 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) Feb 1 05:04:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 1 05:04:49 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 1 05:04:49 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 1 05:04:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 05:04:49 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:04:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:04:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:04:50 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 05:04:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:04:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. 
Feb 1 05:04:51 localhost ovn_controller[152492]: 2026-02-01T10:04:51Z|00503|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Feb 1 05:04:51 localhost podman[329151]: 2026-02-01 10:04:51.73714586 +0000 UTC m=+0.088024438 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2) Feb 1 05:04:51 localhost podman[329151]: 2026-02-01 10:04:51.77846952 +0000 UTC m=+0.129348128 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 05:04:51 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. 
Feb 1 05:04:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 05:04:51 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:04:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:51 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:51 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:04:52 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:04:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:52 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:53 localhost nova_compute[274651]: 2026-02-01 10:04:53.055 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms 
timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:53 localhost nova_compute[274651]: 2026-02-01 10:04:53.056 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:53 localhost nova_compute[274651]: 2026-02-01 10:04:53.056 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:04:53 localhost nova_compute[274651]: 2026-02-01 10:04:53.056 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:53 localhost nova_compute[274651]: 2026-02-01 10:04:53.073 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:53 localhost nova_compute[274651]: 2026-02-01 10:04:53.074 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:53 localhost podman[236886]: time="2026-02-01T10:04:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:04:53 localhost podman[236886]: @ - - [01/Feb/2026:10:04:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1" Feb 1 05:04:54 localhost podman[236886]: @ - - [01/Feb/2026:10:04:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18842 "" "Go-http-client/1.1" Feb 1 05:04:55 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 1 05:04:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:04:55 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:04:55 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
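[editor's note] Each "auth rm" for a share client above is paired with an MDS admin-socket "session evict" filtered on the client's auth name and the share's root path, evidently so that revoking CephFS access to the manila share also cuts off clients that already have it mounted: the key is removed, then any live sessions matching both filters are evicted. A hypothetical extractor for those eviction entries (regex mirrors the asok_command lines above):

import re

# Matches MDS entries like:
#   mds.mds.np0005604212.tkdkxt asok_command: session evict
#   {filters=[auth_name=alice,client_metadata.root=/volumes/...],
#    prefix=session evict} (starting...)
EVICT_RE = re.compile(
    r"asok_command: session evict \{filters=\[auth_name=([^,]+),"
    r"client_metadata\.root=([^\]]+)\]"
)

def parse_evict(line):
    """Return (auth_name, share_root) for a session-evict entry, or None."""
    m = EVICT_RE.search(line)
    return m.groups() if m else None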
Feb 1 05:04:56 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:04:56 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:04:56 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:04:56 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:04:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:04:57 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:04:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.372 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.373 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.373 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.373 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 
2026-02-01 10:04:57.930 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.944 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.944 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 1 05:04:57 localhost nova_compute[274651]: 2026-02-01 10:04:57.944 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:04:58 localhost nova_compute[274651]: 2026-02-01 10:04:58.075 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:58 localhost nova_compute[274651]: 2026-02-01 10:04:58.077 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:04:58 localhost nova_compute[274651]: 2026-02-01 10:04:58.077 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:04:58 localhost nova_compute[274651]: 2026-02-01 10:04:58.077 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:58 localhost nova_compute[274651]: 2026-02-01 10:04:58.113 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:04:58 localhost nova_compute[274651]: 
2026-02-01 10:04:58.114 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:04:58 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:04:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:58 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:04:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 05:04:58 localhost podman[329176]: 2026-02-01 10:04:58.725167761 +0000 UTC m=+0.078942628 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, release=1769056855, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal) Feb 1 05:04:58 localhost podman[329176]: 2026-02-01 10:04:58.763058416 +0000 UTC m=+0.116833323 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, release=1769056855, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container) Feb 1 05:04:58 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. 
Feb 1 05:04:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:04:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:59 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:04:59 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:01 localhost openstack_network_exporter[239441]: ERROR 10:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:05:01 localhost openstack_network_exporter[239441]: Feb 1 05:05:01 localhost openstack_network_exporter[239441]: ERROR 10:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:05:01 localhost openstack_network_exporter[239441]: Feb 1 05:05:01 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 1 05:05:01 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:05:01 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:05:01 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
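[editor's note] The openstack_network_exporter ERRORs above (appctl calls failing with "please specify an existing datapath") and the ceilometer AMQP connection failure a little further below are the only ERROR-level entries in this window, against a backdrop of DEBUG/INFO chatter. When skimming a capture like this, a per-process tally of ERROR lines is often the quickest triage step; a hypothetical helper, keyed on the syslog "host process[pid]:" prefix used throughout:

import re
from collections import Counter

# Matches e.g.
#   localhost openstack_network_exporter[239441]: ERROR 10:05:01 appctl.go:174: ...
#   localhost ceilometer_agent_compute[232240]: ... 12 ERROR oslo_messaging ...
PROC_RE = re.compile(r"localhost (\w+)\[\d+\]:.*\bERROR\b")

def error_counts(lines):
    """Count ERROR-level log entries per emitting process."""
    counts = Counter()
    for line in lines:
        m = PROC_RE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts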
Feb 1 05:05:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:02 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:05:02 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 1 05:05:02 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 1 05:05:02 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 1 05:05:02 localhost nova_compute[274651]: 2026-02-01 10:05:02.940 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.159 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.161 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.161 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5047 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.161 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.163 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.164 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:03 localhost nova_compute[274651]: 2026-02-01 10:05:03.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.532 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.532 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.532 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.536 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '20181101-b1eb-4dd5-b119-e801f9cf0a1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.533002', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a2a22b0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '91e9f3ad8b8d530acc84a02e12e47374920827c3785f6f37c518f1289883711a'}]}, 'timestamp': '2026-02-01 10:05:03.537205', '_unique_id': 'c6a609b565714717af97071bd60656a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.538 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.545 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.545 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '731f6dde-d0f5-4e3d-905b-f0ef287be3c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.539052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a2b6efe-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.75845915, 'message_signature': '729cfbb9fc0617246489f90535caffd95868b3d2574a865d83b2460d4f301bc1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.539052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a2b7782-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.75845915, 'message_signature': 'f7af8b5ca27f2d55c3ee40a51b43dfe077a2125daec13fffe050bf99a9ce8691'}]}, 'timestamp': '2026-02-01 10:05:03.545841', '_unique_id': 'cf9e1d5c288642e2ba0af7fa9aef4163'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.546 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b5ee448f-e5a3-49af-b445-47230f468e52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.547118', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a2bb1b6-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '6f164469dbf2168b2c88d07ec0c028325ab71f0ae2eea5da6d31c835fb4a6f22'}]}, 'timestamp': '2026-02-01 10:05:03.547341', '_unique_id': 'bda4f53724b2418cbe9ae415ae67f398'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.547 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.548 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.548 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c69f357d-5802-4717-b881-379f21fe9733', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.548405', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a2be3c0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '88298a1b76890e030adabf7c21be187586eac8fac0ac8517596063c3bfd32498'}]}, 'timestamp': '2026-02-01 10:05:03.548622', '_unique_id': '4321ca89715e48118a1ea119ff0cf985'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.549 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6a2b7cb0-f3e5-4a8c-aa34-b0a5b321a74a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.549588', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a2c11e2-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': 'aecea6bcf722350a655fe95f7ad910a1eae9da00f44d96ca0b2c3ccb21fbb1c7'}]}, 'timestamp': '2026-02-01 10:05:03.549803', '_unique_id': '0763e3c348a14ef5adc66c3a0643b61d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.550 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd30f13ee-71b4-4d0b-9387-1e7697ff546e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.550866', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a2c44e6-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '72a1a3b31b433e01309869c9108fe42b15fe2cb65a70f8e48e1d2753350ea937'}]}, 'timestamp': '2026-02-01 10:05:03.551130', '_unique_id': '938e0bdf229b4bf38262d7c2bb0adb9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.551 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fd13eaf2-db11-4589-b440-763767d7f38f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.552166', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a2c766e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': 'd9f6b8ca073da0379946acfd3ce22c3d84a066a51dd1a66025cf71be45694382'}]}, 'timestamp': '2026-02-01 10:05:03.552375', '_unique_id': '57c65a457d1246e9a6aeed9924446859'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.552 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.553 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.562 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 18760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e3b122df-c618-4b1b-82ae-bb05970987c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18760000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:05:03.553537', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7a2e0a60-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.781786267, 'message_signature': '0616304e5ed6067acd66198102d2c820c7a83d50524ec960ccd9d37da9d1f5e9'}]}, 'timestamp': '2026-02-01 10:05:03.562791', '_unique_id': 'aedfbaab96184b4ea17576930eea546c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.563 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '80410753-34ad-4879-9d6f-44f5cf8b7e1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.563892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a2e4214-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.75845915, 'message_signature': 'd421e1769818ebeb747e87d7d44af1b3150a27b8b8d9a9fc4b6a239f230f42b0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.563892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a2e4ade-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.75845915, 'message_signature': '803e409774a10d0069fa9d014f81563ebcef513a8d1a3d44b64bf03f3204b0cf'}]}, 'timestamp': '2026-02-01 10:05:03.564355', '_unique_id': '904183ccb3464a25bf536dec275684e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.564 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.565 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3ac8da1-ec6e-413b-bf88-73e8c89939df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.565338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a30c0fc-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '6012e916d29ee8e1bcbf7a96f2ad49955ea4e7d73ca18ba8ed63125dc32e1b21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.565338', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a30c98a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '31b469a66f45227352468933bf7400df2ac3e350e377cf3890a09d05ca4e0312'}]}, 'timestamp': '2026-02-01 10:05:03.580706', '_unique_id': '671d332e91444b368f19b6ca58b43465'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.581 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4555818a-634a-43f6-bf2d-882871c661d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.581787', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a30fbbc-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '390d193567eff2f04f8b4e5161df76af4d1854a98274bc7ecb297e767a66e9db'}]}, 'timestamp': '2026-02-01 10:05:03.582021', '_unique_id': 'ed6420c4a69243cd836930cabaaea6de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.582 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c22ad76-362a-4c54-ba94-31fab2b1c5f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.583010', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a312b6e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '4ef71977389cf01def92ec8d0a4d8af5430e13e72e0366890508f14703ff122c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.583010', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a31329e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': 'b313c3baef5198e86e48d495f474846d6c6fb3ced3caf60644a24f658a5faccd'}]}, 'timestamp': '2026-02-01 10:05:03.583392', '_unique_id': '1a7c6edd81b44aa790b01ea157b1507c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.583 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.584 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24327947-d187-424d-b1c7-2155007d0257', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.584357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a315fe4-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '1336a5ec26493684ad2ccbf76f445de57cfaab0c22a37c96037d4add4a537d8e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.584357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a31670a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': 'eae0d7605f862196d2fe45f97d49493b11b1c1a82f59cf8c8be73fc1a93e5778'}]}, 'timestamp': '2026-02-01 10:05:03.584734', '_unique_id': '56ced9f97d4d4d758f53e8e98ba9f6aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.585 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e2ae20a-3673-4364-9378-1da6bb591a2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.585732', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a3195b8-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': 'a5fcd1cf77db70bc89545ba9c1395df2f59f6f4b6f2880a5d5a407eec489990d'}]}, 'timestamp': '2026-02-01 10:05:03.585944', '_unique_id': '24b597a4ba764717aeb08c98651c2188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in
_ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.586 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
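Underneath all of the retries, the innermost frame in each of these tracebacks is amqp's transport doing an ordinary TCP connect (self.sock.connect(sa)) and getting errno 111 back from the kernel, meaning the peer actively refused the connection because nothing is accepting on that port. The stdlib-only sketch below reproduces just that step; host and port are hypothetical stand-ins for the broker endpoint:

import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    # 5672 is the conventional AMQP port; with no broker listening the
    # kernel answers the SYN with a RST and connect() raises errno 111.
    s.connect(("127.0.0.1", 5672))
except ConnectionRefusedError as exc:
    print(exc)  # [Errno 111] Connection refused
finally:
    s.close()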
Payload={'message_id': 'b817befb-85bc-4f31-a6ed-498904cf15b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.586902', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a31c40c-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '33f01d5379f46754e47f2f26b43829df0617a9a8f68149f14523436a4268c5ee'}]}, 'timestamp': '2026-02-01 10:05:03.587131', '_unique_id': '35a53e5b32924e00916e66432ed0a9a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.587 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e0aa90fe-3bda-4d54-b201-7638a15f8519', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:05:03.588311', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7a31fa6c-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.781786267, 'message_signature': '7701022248ae6c59c88b325744acef247cf43b63abcb17e4b6db91b065e22c34'}]}, 'timestamp': '2026-02-01 10:05:03.588519', '_unique_id': 'c267d5f3379e4491a31b3a73082d564e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:05:03.588 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 
05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.588 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.589 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
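Each failed send logs the full Payload=... dict, and those dicts are Python literals rather than JSON (note the single quotes and None values), so they can be recovered from the log with ast.literal_eval for offline inspection while the bus is down. A small sketch, using a string abbreviated from the payloads in this log but with the same field names and sample values:

import ast

# Abbreviated from the logged Payload=... text; it is a single-quoted
# Python literal, not JSON, so json.loads() would reject it as-is.
raw = ("{'event_type': 'telemetry.polling', 'payload': {'samples': ["
       "{'counter_name': 'disk.device.read.requests',"
       " 'counter_type': 'cumulative', 'counter_unit': 'request',"
       " 'counter_volume': 1272,"
       " 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda'}]}}")

msg = ast.literal_eval(raw)
for sample in msg["payload"]["samples"]:
    print(sample["counter_name"], sample["counter_volume"],
          sample["counter_unit"])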
Payload={'message_id': 'cfa17a54-e7c7-47ab-8286-e15382c0b2fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.589472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a3227d0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': 'dbfe9eac3aed3d33cff331b5ac0c2a0c4e7fcbe9a83dbaf5b1d78f47aa45c7f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.589472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a322f0a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '0033b241ed6f068c6614b82413c11ff58fb6bcd90c0e034dc0cddc0bc2fe796e'}]}, 'timestamp': '2026-02-01 10:05:03.589855', '_unique_id': '61833efb2099453bb60ac4167b7b7731'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.590 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
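One layer up, the component emitting these errors is oslo.messaging's notifier: ceilometer hands each batch of samples to a Notifier configured for the "notifications" topic, and when the driver cannot obtain a pooled connection it logs the "Could not send notification" error seen throughout this log instead of crashing the poller. A rough sketch of that wiring, as an approximation only: publisher_id and event_type are taken from the payloads above, while the transport URL is a hypothetical stand-in with no listener behind it.

from oslo_config import cfg
import oslo_messaging

conf = cfg.ConfigOpts()
# Hypothetical broker URL, mirroring the refused connection above.
transport = oslo_messaging.get_notification_transport(
    conf, url="rabbit://guest:guest@127.0.0.1:5672/")
notifier = oslo_messaging.Notifier(
    transport, publisher_id="ceilometer.polling",
    driver="messaging", topics=["notifications"], retry=1)
# SAMPLE-priority notification, like the telemetry.polling events above;
# on delivery failure oslo.messaging logs the error rather than raising
# into the caller, which is why the polling loop keeps running.
notifier.sample({}, "telemetry.polling", {"samples": []})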
Payload={'message_id': 'ad26bd61-c9bb-4237-b121-6ec84042ce9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.590961', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a3262b8-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': 'f0040a94c60db2d5363a7536e3449f541c03079618a063019229b2d7b0f7573a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.590961', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a326bf0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '0f99660951654cb1ce8d67cec4d0f84f8ecf32c1d81c8d8529476ea3e73e979b'}]}, 'timestamp': '2026-02-01 10:05:03.591417', '_unique_id': '34b9659298c349958f106ea89ca03b44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.591 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.592 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.592 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.592 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7a5e0f02-bedc-4222-8c8b-4cb7f1a1725c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.592385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a329986-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': 'aed2306df965e0ee41ffd21168f2aa53c52b613cdba72429567ce3e16510ab5b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.592385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a32a106-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.784740048, 'message_signature': '9daa3cebf3ce5cbdebbf622d906badb213546028f54d75a009108e6737d33959'}]}, 'timestamp': '2026-02-01 10:05:03.592773', '_unique_id': '4149920d23b64b76b54eeaceb52d0464'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.593 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:05:03.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ff6b1a6-9ff6-4104-bad3-da0918129841', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:05:03.593859', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': '7a32d310-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.752408453, 'message_signature': '8aeac92b2100320067cdf97dafff6a068d7422acc0732917d9fdf42ac0bbe728'}]}, 'timestamp': '2026-02-01 10:05:03.594091', '_unique_id': 'ac1de60074844a7ea4fbedac6945d1e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.594 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '17d6a648-751e-4587-a8c8-0e13acca86b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:05:03.595068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7a33024a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.75845915, 'message_signature': '703e763d34f720a5a507e07fc82b9611d7bac378eaed710e1842c4b07eae1deb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:05:03.595068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7a330984-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12397.75845915, 'message_signature': '62b6948b31b23278e453cc329b4d5e23f3ace790cbb344fd834016ac0c399b35'}]}, 'timestamp': '2026-02-01 10:05:03.595449', '_unique_id': '31943c07d5c14c6792d42404e1ec8ec5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:05:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:05:03.595 12 ERROR oslo_messaging.notify.messaging Feb 1 05:05:04 localhost nova_compute[274651]: 2026-02-01 10:05:04.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:04 localhost nova_compute[274651]: 2026-02-01 10:05:04.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:05 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:05:05 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' 
entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:05 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:05 localhost nova_compute[274651]: 2026-02-01 10:05:05.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:05 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:05:05 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:05 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:05 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.292 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.293 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock 
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.293 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.294 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.294 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:05:06 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:05:06 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1409088241' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.729 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.809 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:05:06 localhost nova_compute[274651]: 2026-02-01 10:05:06.810 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.048 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.049 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11127MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.050 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.050 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:05:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.146 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed 
on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.147 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.147 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.200 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:05:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:05:07 localhost podman[329219]: 2026-02-01 10:05:07.325925092 +0000 UTC m=+0.058330983 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127) Feb 1 05:05:07 localhost podman[329219]: 
2026-02-01 10:05:07.334381362 +0000 UTC m=+0.066787243 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 1 05:05:07 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. Feb 1 05:05:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:05:07 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1480131941' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.647 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.652 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.690 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.692 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 1 05:05:07 localhost nova_compute[274651]: 2026-02-01 10:05:07.692 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:05:08 localhost nova_compute[274651]: 2026-02-01 10:05:08.167 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:08 localhost nova_compute[274651]: 2026-02-01 10:05:08.169 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:08 localhost nova_compute[274651]: 2026-02-01 10:05:08.170 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:05:08 localhost nova_compute[274651]: 2026-02-01 10:05:08.170 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:08 localhost nova_compute[274651]: 2026-02-01 10:05:08.192 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:08 localhost nova_compute[274651]: 2026-02-01 10:05:08.193 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:08 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 1 05:05:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:05:08 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:05:08 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:05:08 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:05:08 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:05:08 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:05:08 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
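[annotation] The `auth rm` dispatches for 'client.alice bob' and the MDS `session evict` above are the two halves of a CephFS access revocation: the mon deletes the client key, then the MDS evicts any sessions still matching the filter (auth_name plus client_metadata.root). A minimal sketch of the same session inspection from the CLI, assuming the `ceph` binary and an admin keyring are available on the host; the daemon name and share root are taken from the log:

    import json
    import subprocess

    MDS = "mds.mds.np0005604212.tkdkxt"  # daemon name as logged above
    SHARE_ROOT = "/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88"

    # List sessions on the MDS and pick out the ones the evict filter would hit.
    sessions = json.loads(subprocess.run(
        ["ceph", "tell", MDS, "session", "ls", "--format", "json"],
        check=True, capture_output=True, text=True).stdout)
    for s in sessions:
        root = s.get("client_metadata", {}).get("root", "")
        if root.startswith(SHARE_ROOT):
            print(s.get("id"), root)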
Feb 1 05:05:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e294 do_prune osdmap full prune enabled Feb 1 05:05:09 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e295 e295: 6 total, 6 up, 6 in Feb 1 05:05:09 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e295: 6 total, 6 up, 6 in Feb 1 05:05:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:11.406 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:05:11 localhost nova_compute[274651]: 2026-02-01 10:05:11.406 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:11.408 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 1 05:05:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:05:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:11 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:11 localhost nova_compute[274651]: 2026-02-01 10:05:11.693 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:11 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:05:11 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow 
r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:11 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:11 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:13 localhost nova_compute[274651]: 2026-02-01 10:05:13.242 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. 
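[annotation] The `auth get-or-create` entries above pass caps as a flat list of alternating cap-type/cap-string pairs, scoped to the share path (mds) and the pool namespace (osd). A sketch of sending the same mon command through the python-rados bindings; the conffile path and client name are assumptions, the command body is copied from the log:

    import json
    import rados

    cmd = {
        "prefix": "auth get-or-create",
        "entity": "client.bob",
        "caps": [
            "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37",
            "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88",
            "mon", "allow r",
        ],
        "format": "json",
    }
    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.admin")
    cluster.connect()
    ret, outbuf, outs = cluster.mon_command(json.dumps(cmd), b"")
    print(ret, outbuf.decode() or outs)
    cluster.shutdown()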
Feb 1 05:05:15 localhost podman[329259]: 2026-02-01 10:05:15.031922881 +0000 UTC m=+0.388725044 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 1 05:05:15 localhost podman[329259]: 2026-02-01 10:05:15.069404544 +0000 UTC m=+0.426206697 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 05:05:15 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. Feb 1 05:05:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 1 05:05:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:05:15 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:05:15 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
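[annotation] Each `Started /usr/bin/podman healthcheck run …` / `Deactivated successfully` pair above is a transient systemd unit firing the container's configured healthcheck, which then surfaces as the health_status=healthy attribute on the container event. The resulting state can be read back with `podman inspect`; a sketch, assuming a podman 4.x schema where the field is `.State.Health` (older releases expose `.State.Healthcheck`):

    import json
    import subprocess

    out = subprocess.run(
        ["podman", "inspect", "podman_exporter",
         "--format", "{{json .State.Health}}"],
        check=True, capture_output=True, text=True).stdout
    health = json.loads(out)
    # Should match the health_status=healthy seen in the events above.
    print(health.get("Status"), "failing streak:", health.get("FailingStreak"))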
Feb 1 05:05:15 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:15 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:16 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:05:16 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 1 05:05:16 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 1 05:05:16 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 1 05:05:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. Feb 1 05:05:16 localhost podman[329281]: 2026-02-01 10:05:16.705531873 +0000 UTC m=+0.067045473 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent) Feb 1 05:05:16 localhost podman[329281]: 2026-02-01 10:05:16.73866403 +0000 UTC m=+0.100177670 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Feb 1 05:05:16 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 05:05:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e295 do_prune osdmap full prune enabled Feb 1 05:05:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e296 e296: 6 total, 6 up, 6 in Feb 1 05:05:17 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e296: 6 total, 6 up, 6 in Feb 1 05:05:17 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:17.639 259320 INFO neutron.agent.linux.ip_lib [None req-4635b439-412e-4df8-b72d-9b702af19e02 - - - - - -] Device tap715babe2-1b cannot be used as it has no MAC address#033[00m Feb 1 05:05:17 localhost nova_compute[274651]: 2026-02-01 10:05:17.660 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:17 localhost kernel: device tap715babe2-1b entered promiscuous mode Feb 1 05:05:17 localhost ovn_controller[152492]: 2026-02-01T10:05:17Z|00504|binding|INFO|Claiming lport 715babe2-1bae-4d19-8d18-f51857ccdf59 for this chassis. Feb 1 05:05:17 localhost ovn_controller[152492]: 2026-02-01T10:05:17Z|00505|binding|INFO|715babe2-1bae-4d19-8d18-f51857ccdf59: Claiming unknown Feb 1 05:05:17 localhost NetworkManager[5964]: [1769940317.6736] manager: (tap715babe2-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/82) Feb 1 05:05:17 localhost nova_compute[274651]: 2026-02-01 10:05:17.673 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:17 localhost systemd-udevd[329309]: Network interface NamePolicy= disabled on kernel command line. 
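[annotation] The DHCP agent's "Device tap715babe2-1b cannot be used as it has no MAC address" message fires in the window between udev creating the tap and OVN wiring it up (the ethtool ioctl errors below are the same race). The check amounts to reading the link's address; a rough equivalent using iproute2's JSON output, device name taken from the log:

    import json
    import subprocess

    def mac_of(dev: str):
        """Return the device's MAC address, or None while it is still unset."""
        links = json.loads(subprocess.run(
            ["ip", "-json", "link", "show", "dev", dev],
            check=True, capture_output=True, text=True).stdout)
        return links[0].get("address") or None

    print(mac_of("tap715babe2-1b"))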
Feb 1 05:05:17 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:17.681 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1428d90a-5672-462d-b506-14407c75d5b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1428d90a-5672-462d-b506-14407c75d5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '04317eeb5c63484689ca36c31f49888b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbf0b738-d1e9-4604-87ae-f9b4fe329f05, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=715babe2-1bae-4d19-8d18-f51857ccdf59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:05:17 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:17.684 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 715babe2-1bae-4d19-8d18-f51857ccdf59 in datapath 1428d90a-5672-462d-b506-14407c75d5b1 bound to our chassis#033[00m Feb 1 05:05:17 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:17.686 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port 1822ae04-22b7-47d4-867f-da1eb754a827 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 1 05:05:17 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:17.686 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1428d90a-5672-462d-b506-14407c75d5b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 05:05:17 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:17.688 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb65914-63de-4781-93b6-99352b063eb8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost ovn_controller[152492]: 2026-02-01T10:05:17Z|00506|binding|INFO|Setting lport 715babe2-1bae-4d19-8d18-f51857ccdf59 ovn-installed in OVS Feb 1 05:05:17 localhost ovn_controller[152492]: 2026-02-01T10:05:17Z|00507|binding|INFO|Setting lport 715babe2-1bae-4d19-8d18-f51857ccdf59 up in Southbound Feb 1 05:05:17 localhost nova_compute[274651]: 2026-02-01 10:05:17.703 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 
localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost nova_compute[274651]: 2026-02-01 10:05:17.728 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:17 localhost journal[217584]: ethtool ioctl error on tap715babe2-1b: No such device Feb 1 05:05:17 localhost nova_compute[274651]: 2026-02-01 10:05:17.753 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:18 localhost nova_compute[274651]: 2026-02-01 10:05:18.296 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:18 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 1 05:05:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0) Feb 1 05:05:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:18 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:18 localhost nova_compute[274651]: 2026-02-01 10:05:18.585 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 05:05:18 localhost systemd[1]: tmp-crun.8ebFEx.mount: Deactivated successfully. 
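[annotation] Once the `auth get-or-create` for client.bob above finishes, the granted key and caps can be read back (the mgr itself issues the `auth get` seen in the audit trail). A sketch using the CLI's JSON output, assuming `ceph auth get` returns its usual one-element list per entity:

    import json
    import subprocess

    out = subprocess.run(
        ["ceph", "auth", "get", "client.bob", "--format", "json"],
        check=True, capture_output=True, text=True).stdout
    entry = json.loads(out)[0]
    print(entry["entity"])
    for cap_type, cap in entry["caps"].items():
        print(f"  {cap_type}: {cap}")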
Feb 1 05:05:18 localhost podman[329380]: 2026-02-01 10:05:18.695718147 +0000 UTC m=+0.059623994 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 1 05:05:18 localhost podman[329381]: Feb 1 05:05:18 localhost podman[329380]: 2026-02-01 10:05:18.703721654 +0000 UTC m=+0.067627531 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 05:05:18 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. 
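[annotation] The node_exporter container above publishes on host port 9100 (per the `ports: ['9100:9100']` mapping) with most collectors disabled except systemd, filtered to the edpm/ovs/virt/rsyslog units. A minimal scrape to confirm it is serving, endpoint path assumed to be the standard /metrics:

    from urllib.request import urlopen

    with urlopen("http://localhost:9100/metrics", timeout=5) as resp:
        for line in resp.read().decode().splitlines():
            # Only the units matched by --collector.systemd.unit-include
            # in the config above should appear here.
            if line.startswith("node_systemd_unit_state"):
                print(line)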
Feb 1 05:05:18 localhost podman[329381]: 2026-02-01 10:05:18.760374325 +0000 UTC m=+0.123828688 container create 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:05:18 localhost podman[329381]: 2026-02-01 10:05:18.669955425 +0000 UTC m=+0.033409808 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 1 05:05:18 localhost systemd[1]: Started libpod-conmon-1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384.scope. Feb 1 05:05:18 localhost systemd[1]: Started libcrun container. Feb 1 05:05:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff34c799e3d7be428db3fcc156575bb8ea9e53ec83525757ad23c6bbdf44d917/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 1 05:05:18 localhost podman[329381]: 2026-02-01 10:05:18.832585916 +0000 UTC m=+0.196040319 container init 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:05:18 localhost podman[329381]: 2026-02-01 10:05:18.843059408 +0000 UTC m=+0.206513851 container start 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 1 05:05:18 localhost dnsmasq[329419]: started, version 2.85 cachesize 150 Feb 1 05:05:18 localhost dnsmasq[329419]: DNS service limited to local subnets Feb 1 05:05:18 localhost dnsmasq[329419]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 1 05:05:18 localhost dnsmasq[329419]: warning: no upstream servers configured Feb 1 05:05:18 localhost dnsmasq-dhcp[329419]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 1 05:05:18 localhost dnsmasq[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/addn_hosts - 0 addresses Feb 1 05:05:18 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/host Feb 1 05:05:18 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/opts Feb 1 05:05:19 localhost 
neutron_dhcp_agent[259316]: 2026-02-01 10:05:19.074 259320 INFO neutron.agent.dhcp.agent [None req-ff9f00fc-43c4-489f-939e-5af32c8e23da - - - - - -] DHCP configuration for ports {'89487862-c78c-4ec7-b17c-8d9f68c9cb3e'} is completed#033[00m Feb 1 05:05:19 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:19 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:19 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:19.392 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:05:19Z, description=, device_id=5f8b473c-8ebf-456f-9d1e-1f9a8f8f51c2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e739112c-3c87-4725-943f-84f7977ec397, ip_allocation=immediate, mac_address=fa:16:3e:89:d9:72, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:05:15Z, description=, dns_domain=, id=1428d90a-5672-462d-b506-14407c75d5b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2138049840-network, port_security_enabled=True, project_id=04317eeb5c63484689ca36c31f49888b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39317, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3814, status=ACTIVE, subnets=['9977fd25-c059-4de5-82ee-eea15fe7f29d'], tags=[], tenant_id=04317eeb5c63484689ca36c31f49888b, updated_at=2026-02-01T10:05:15Z, vlan_transparent=None, network_id=1428d90a-5672-462d-b506-14407c75d5b1, port_security_enabled=False, project_id=04317eeb5c63484689ca36c31f49888b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3822, status=DOWN, tags=[], tenant_id=04317eeb5c63484689ca36c31f49888b, updated_at=2026-02-01T10:05:19Z on network 1428d90a-5672-462d-b506-14407c75d5b1#033[00m Feb 1 05:05:19 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:19.409 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 1 05:05:19 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:19 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", 
"allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch Feb 1 05:05:19 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished Feb 1 05:05:19 localhost dnsmasq[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/addn_hosts - 1 addresses Feb 1 05:05:19 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/host Feb 1 05:05:19 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/opts Feb 1 05:05:19 localhost podman[329436]: 2026-02-01 10:05:19.612489547 +0000 UTC m=+0.063111121 container kill 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 1 05:05:20 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:20.133 259320 INFO neutron.agent.dhcp.agent [None req-09e23c4e-998b-40a3-ad43-b8987835a94d - - - - - -] DHCP configuration for ports {'e739112c-3c87-4725-943f-84f7977ec397'} is completed#033[00m Feb 1 05:05:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:21.224 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:05:19Z, description=, device_id=5f8b473c-8ebf-456f-9d1e-1f9a8f8f51c2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e739112c-3c87-4725-943f-84f7977ec397, ip_allocation=immediate, mac_address=fa:16:3e:89:d9:72, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:05:15Z, description=, dns_domain=, id=1428d90a-5672-462d-b506-14407c75d5b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-2138049840-network, port_security_enabled=True, project_id=04317eeb5c63484689ca36c31f49888b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39317, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3814, status=ACTIVE, subnets=['9977fd25-c059-4de5-82ee-eea15fe7f29d'], tags=[], tenant_id=04317eeb5c63484689ca36c31f49888b, updated_at=2026-02-01T10:05:15Z, vlan_transparent=None, network_id=1428d90a-5672-462d-b506-14407c75d5b1, port_security_enabled=False, project_id=04317eeb5c63484689ca36c31f49888b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3822, status=DOWN, tags=[], tenant_id=04317eeb5c63484689ca36c31f49888b, 
updated_at=2026-02-01T10:05:19Z on network 1428d90a-5672-462d-b506-14407c75d5b1#033[00m Feb 1 05:05:21 localhost dnsmasq[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/addn_hosts - 1 addresses Feb 1 05:05:21 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/host Feb 1 05:05:21 localhost podman[329474]: 2026-02-01 10:05:21.462902165 +0000 UTC m=+0.063154604 container kill 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127) Feb 1 05:05:21 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/opts Feb 1 05:05:21 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:21.701 259320 INFO neutron.agent.dhcp.agent [None req-7b2ab3a1-8452-411a-ba90-03f697b899ab - - - - - -] DHCP configuration for ports {'e739112c-3c87-4725-943f-84f7977ec397'} is completed#033[00m Feb 1 05:05:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 05:05:22 localhost podman[329496]: 2026-02-01 10:05:22.713339263 +0000 UTC m=+0.076849713 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 05:05:22 localhost podman[329496]: 2026-02-01 10:05:22.752421595 +0000 UTC m=+0.115931985 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 1 05:05:22 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully. Feb 1 05:05:23 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:23 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:23 localhost nova_compute[274651]: 2026-02-01 10:05:23.328 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:23 localhost podman[236886]: time="2026-02-01T10:05:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 1 05:05:23 localhost podman[236886]: @ - - [01/Feb/2026:10:05:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158361 "" "Go-http-client/1.1" Feb 1 05:05:24 localhost podman[236886]: @ - - [01/Feb/2026:10:05:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19316 "" "Go-http-client/1.1" Feb 1 05:05:26 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} v 0) Feb 1 05:05:26 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw 
pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} : dispatch Feb 1 05:05:26 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]}]': finished Feb 1 05:05:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:27 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 1 05:05:27 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} : dispatch Feb 1 05:05:27 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} : dispatch Feb 1 05:05:27 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]}]': finished Feb 1 05:05:27 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 1 05:05:28 localhost nova_compute[274651]: 2026-02-01 10:05:28.329 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:28 localhost nova_compute[274651]: 2026-02-01 10:05:28.331 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:28 localhost nova_compute[274651]: 2026-02-01 10:05:28.331 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:05:28 localhost nova_compute[274651]: 2026-02-01 10:05:28.332 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:28 localhost nova_compute[274651]: 2026-02-01 10:05:28.364 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:28 localhost nova_compute[274651]: 2026-02-01 10:05:28.365 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:29 localhost nova_compute[274651]: 2026-02-01 10:05:29.508 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb. Feb 1 05:05:29 localhost podman[329522]: 2026-02-01 10:05:29.725300252 +0000 UTC m=+0.087746739 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, release=1769056855, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, version=9.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 1 05:05:29 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} v 0) Feb 1 05:05:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} : dispatch Feb 1 05:05:29 localhost podman[329522]: 2026-02-01 10:05:29.764553319 +0000 UTC m=+0.126999876 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1769056855, build-date=2026-01-22T05:09:47Z, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc) Feb 1 05:05:29 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]}]': finished Feb 1 05:05:29 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully. Feb 1 05:05:29 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=bob,client_metadata.root=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e],prefix=session evict} (starting...) 
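The ovsdbapp vlog entries repeating through this section (idle ~5000 ms, "sending inactivity probe", "entering IDLE", then [POLLIN] and "entering ACTIVE") are the OVS reconnect state machine keeping nova-compute's session to ovsdb-server at tcp:127.0.0.1:6640 alive: after roughly five seconds without traffic it sends an echo probe and drops to IDLE, and the reply wakes the poller and restores ACTIVE. A conceptual sketch of that cycle, with illustrative names (the real logic lives in ovs/reconnect.py; this is not its API):

    import time

    PROBE_INTERVAL_MS = 5000  # matches the ~5002-5003 ms idle gaps logged above

    class ProbeCycle:
        """Toy model of the ACTIVE/IDLE inactivity-probe loop."""
        def __init__(self):
            self.state = "ACTIVE"
            self.last_rx = time.monotonic()

        def on_receive(self):
            # Any inbound data (the [POLLIN] wakeups) restores ACTIVE.
            self.last_rx = time.monotonic()
            self.state = "ACTIVE"

        def tick(self, send_probe):
            # Mirrors "idle N ms, sending inactivity probe" in the log.
            idle_ms = (time.monotonic() - self.last_rx) * 1000
            if self.state == "ACTIVE" and idle_ms >= PROBE_INTERVAL_MS:
                send_probe()          # echo request to the server
                self.state = "IDLE"   # no reply before teardown => reconnect

If a probe reply never arrives, the real state machine drops the connection and reconnects with backoff; in this log the replies always arrive, so the session simply oscillates between ACTIVE and IDLE.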
Feb 1 05:05:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} : dispatch Feb 1 05:05:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 1 05:05:30 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} : dispatch Feb 1 05:05:30 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]}]': finished Feb 1 05:05:31 localhost openstack_network_exporter[239441]: ERROR 10:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 1 05:05:31 localhost openstack_network_exporter[239441]: Feb 1 05:05:31 localhost openstack_network_exporter[239441]: ERROR 10:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 1 05:05:31 localhost openstack_network_exporter[239441]: Feb 1 05:05:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:33 localhost nova_compute[274651]: 2026-02-01 10:05:33.405 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:33 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0) Feb 1 05:05:33 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 1 05:05:33 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Feb 1 05:05:33 localhost ceph-mds[277455]: mds.mds.np0005604212.tkdkxt asok_command: session evict {filters=[auth_name=bob,client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37],prefix=session evict} (starting...) 
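The client.bob traffic above is one Manila CephFS access-rule cycle executed through the mgr: "auth caps" grants rw on the share path and its fsvolumens_* RADOS namespace in the manila_data pool, the MDS evicts bob's open sessions, and "auth rm" deletes the key once access is revoked. A sketch of the equivalent ceph CLI calls, assuming admin credentials on the node and reusing the share path from the audit entries:

    import subprocess

    SHARE = ("/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88"
             "/53cdb057-d7f8-43f2-812c-305c99393a37")

    def allow_rw(entity="client.bob"):
        # Same caps the mgr set via "auth caps" in the audit log above.
        subprocess.run([
            "ceph", "auth", "caps", entity,
            "mds", f"allow rw path={SHARE}",
            "osd", "allow rw pool=manila_data "
                   "namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88",
        ], check=True)

    def deny(entity="client.bob"):
        # "auth rm" as logged below; the MDS session eviction is issued
        # separately by the driver (the asok "session evict" lines).
        subprocess.run(["ceph", "auth", "rm", entity], check=True)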
Feb 1 05:05:33 localhost nova_compute[274651]: 2026-02-01 10:05:33.685 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:34 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 1 05:05:34 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 1 05:05:34 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 1 05:05:34 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Feb 1 05:05:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:05:37 localhost podman[329542]: 2026-02-01 10:05:37.758457099 +0000 UTC m=+0.116401490 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ceilometer_agent_compute) Feb 1 05:05:37 localhost podman[329542]: 2026-02-01 10:05:37.774685128 +0000 UTC m=+0.132629459 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 05:05:37 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully. 
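Each "Started /usr/bin/podman healthcheck run <id>" / health_status / exec_died / "Deactivated successfully" group in this log is one pass of podman's timer-driven healthcheck: a transient systemd unit execs the container's configured test (the 'healthcheck' entry in config_data) and exits. The same check can be run by hand; a small sketch using the container name from the entries above:

    import subprocess

    def is_healthy(container="ceilometer_agent_compute"):
        # `podman healthcheck run` exits 0 when the container's configured
        # test succeeds and non-zero otherwise; output is usually empty.
        result = subprocess.run(
            ["podman", "healthcheck", "run", container],
            capture_output=True, text=True,
        )
        return result.returncode == 0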
Feb 1 05:05:38 localhost nova_compute[274651]: 2026-02-01 10:05:38.446 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e296 do_prune osdmap full prune enabled Feb 1 05:05:39 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e297 e297: 6 total, 6 up, 6 in Feb 1 05:05:39 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e297: 6 total, 6 up, 6 in Feb 1 05:05:39 localhost nova_compute[274651]: 2026-02-01 10:05:39.340 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 1 05:05:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e297 do_prune osdmap full prune enabled Feb 1 05:05:40 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e298 e298: 6 total, 6 up, 6 in Feb 1 05:05:40 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e298: 6 total, 6 up, 6 in Feb 1 05:05:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:41.728 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 1 05:05:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:41.728 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 1 05:05:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:41.729 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:05:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:42 localhost dnsmasq[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/addn_hosts - 0 addresses Feb 1 05:05:42 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/host Feb 1 05:05:42 localhost podman[329576]: 2026-02-01 10:05:42.322180357 +0000 UTC m=+0.065493005 container kill 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 1 05:05:42 localhost dnsmasq-dhcp[329419]: read /var/lib/neutron/dhcp/1428d90a-5672-462d-b506-14407c75d5b1/opts Feb 1 05:05:42 localhost nova_compute[274651]: 2026-02-01 10:05:42.518 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:42 localhost kernel: device tap715babe2-1b left promiscuous mode Feb 1 05:05:42 localhost ovn_controller[152492]: 2026-02-01T10:05:42Z|00508|binding|INFO|Releasing lport 715babe2-1bae-4d19-8d18-f51857ccdf59 from this chassis (sb_readonly=0) Feb 1 05:05:42 localhost ovn_controller[152492]: 2026-02-01T10:05:42Z|00509|binding|INFO|Setting lport 715babe2-1bae-4d19-8d18-f51857ccdf59 down in Southbound Feb 1 05:05:42 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:42.528 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-1428d90a-5672-462d-b506-14407c75d5b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1428d90a-5672-462d-b506-14407c75d5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '04317eeb5c63484689ca36c31f49888b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fbf0b738-d1e9-4604-87ae-f9b4fe329f05, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=715babe2-1bae-4d19-8d18-f51857ccdf59) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 1 05:05:42 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:42.530 158365 INFO neutron.agent.ovn.metadata.agent [-] Port 715babe2-1bae-4d19-8d18-f51857ccdf59 in datapath 1428d90a-5672-462d-b506-14407c75d5b1 unbound from our chassis#033[00m Feb 1 05:05:42 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:42.532 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1428d90a-5672-462d-b506-14407c75d5b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 1 05:05:42 localhost ovn_metadata_agent[158360]: 2026-02-01 10:05:42.533 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[341bf2ca-40a3-403d-b202-67fffc4f6cf1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 1 05:05:42 localhost nova_compute[274651]: 2026-02-01 10:05:42.543 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:43 localhost nova_compute[274651]: 2026-02-01 10:05:43.469 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:43 localhost ovn_controller[152492]: 2026-02-01T10:05:43Z|00510|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 05:05:43 localhost nova_compute[274651]: 2026-02-01 10:05:43.813 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:44 localhost dnsmasq[329419]: exiting on receipt of SIGTERM Feb 1 05:05:44 localhost systemd[1]: tmp-crun.nyRQhY.mount: Deactivated successfully. Feb 1 05:05:44 localhost systemd[1]: libpod-1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384.scope: Deactivated successfully. Feb 1 05:05:44 localhost podman[329617]: 2026-02-01 10:05:44.287145027 +0000 UTC m=+0.064512855 container kill 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Feb 1 05:05:44 localhost podman[329630]: 2026-02-01 10:05:44.33405755 +0000 UTC m=+0.041092675 container died 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Feb 1 05:05:44 localhost systemd[1]: tmp-crun.EZOcW0.mount: Deactivated successfully. Feb 1 05:05:44 localhost podman[329630]: 2026-02-01 10:05:44.371910303 +0000 UTC m=+0.078945398 container cleanup 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127) Feb 1 05:05:44 localhost systemd[1]: libpod-conmon-1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384.scope: Deactivated successfully. 
Feb 1 05:05:44 localhost podman[329637]: 2026-02-01 10:05:44.41178749 +0000 UTC m=+0.099979926 container remove 1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1428d90a-5672-462d-b506-14407c75d5b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4) Feb 1 05:05:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:44.441 259320 INFO neutron.agent.dhcp.agent [None req-305443ef-bf43-4e37-ad27-ba27a483b63b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 05:05:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:05:44.849 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 1 05:05:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d. Feb 1 05:05:45 localhost podman[329660]: 2026-02-01 10:05:45.218515186 +0000 UTC m=+0.079260159 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 1 05:05:45 localhost podman[329660]: 2026-02-01 10:05:45.251875702 +0000 UTC m=+0.112620695 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 1 05:05:45 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully. 
Feb 1 05:05:45 localhost systemd[1]: var-lib-containers-storage-overlay-ff34c799e3d7be428db3fcc156575bb8ea9e53ec83525757ad23c6bbdf44d917-merged.mount: Deactivated successfully. Feb 1 05:05:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1d43078b82d1136d96e048e4e6b8c585f7923bd81f1b7d1f914c14e7fd778384-userdata-shm.mount: Deactivated successfully. Feb 1 05:05:45 localhost systemd[1]: run-netns-qdhcp\x2d1428d90a\x2d5672\x2d462d\x2db506\x2d14407c75d5b1.mount: Deactivated successfully. Feb 1 05:05:46 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:46 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:47 localhost ovn_controller[152492]: 2026-02-01T10:05:47Z|00511|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0) Feb 1 05:05:47 localhost nova_compute[274651]: 2026-02-01 10:05:47.091 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e298 do_prune osdmap full prune enabled Feb 1 05:05:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 e299: 6 total, 6 up, 6 in Feb 1 05:05:47 localhost ceph-mon[286721]: log_channel(cluster) log [DBG] : osdmap e299: 6 total, 6 up, 6 in Feb 1 05:05:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0. 
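The 05:05:42-05:05:45 entries trace a complete qdhcp teardown for network 1428d90a-5672-462d-b506-14407c75d5b1: the tap device leaves promiscuous mode, OVN releases the lport and sets it down in the Southbound DB, dnsmasq receives SIGTERM, its container is killed, cleaned up and removed, and finally the run-netns-qdhcp-... mount is released. A hypothetical helper to confirm no stale qdhcp namespaces survive such a teardown (relies only on `ip netns list`):

    import subprocess

    def stale_qdhcp_namespaces():
        # `ip netns list` prints one namespace per line, e.g.
        # "qdhcp-1428d90a-5672-462d-b506-14407c75d5b1 (id: 3)".
        out = subprocess.run(["ip", "netns", "list"], capture_output=True,
                             text=True, check=True).stdout
        return [line.split()[0] for line in out.splitlines()
                if line.startswith("qdhcp-")]

An empty list after the unmount entry above means the namespace cleanup finished.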
Feb 1 05:05:47 localhost podman[329683]: 2026-02-01 10:05:47.714472963 +0000 UTC m=+0.077867995 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 1 05:05:47 localhost podman[329683]: 2026-02-01 10:05:47.748373435 +0000 UTC m=+0.111768447 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127) Feb 1 05:05:47 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully. Feb 1 05:05:48 localhost nova_compute[274651]: 2026-02-01 10:05:48.513 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:49 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 1 05:05:49 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 1 05:05:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3. Feb 1 05:05:49 localhost systemd[1]: tmp-crun.mcGlC2.mount: Deactivated successfully. Feb 1 05:05:49 localhost podman[329719]: 2026-02-01 10:05:49.678264727 +0000 UTC m=+0.079483316 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 1 05:05:49 localhost podman[329719]: 2026-02-01 10:05:49.690201633 +0000 UTC m=+0.091420242 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', 
'--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 1 05:05:49 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully. Feb 1 05:05:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0) Feb 1 05:05:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0) Feb 1 05:05:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0) Feb 1 05:05:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0) Feb 1 05:05:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0) Feb 1 05:05:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0) Feb 1 05:05:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:50 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 1 05:05:51 localhost 
ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:51 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 1 05:05:51 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:51 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 1 05:05:51 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 1 05:05:52 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0. Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.903908) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76 Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940352903952, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1365, "num_deletes": 253, "total_data_size": 1193122, "memory_usage": 1218416, "flush_reason": "Manual Compaction"} Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940352911627, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1146304, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41127, "largest_seqno": 42491, "table_properties": {"data_size": 1140040, "index_size": 3350, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 15788, "raw_average_key_size": 21, "raw_value_size": 1126669, "raw_average_value_size": 1560, "num_data_blocks": 141, "num_entries": 722, "num_filter_entries": 722, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940287, "oldest_key_time": 1769940287, "file_creation_time": 1769940352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}} Feb 1 05:05:52 localhost 
ceph-mon[286721]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 7767 microseconds, and 4018 cpu microseconds. Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.911674) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1146304 bytes OK Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.911696) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.914050) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.914069) EVENT_LOG_v1 {"time_micros": 1769940352914063, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.914093) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 1186638, prev total WAL file size 1186638, number of live WAL files 2. Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.914656) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. 
'7061786F73003133303533' seq:0, type:0; will stop at (end) Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1119KB)], [75(21MB)] Feb 1 05:05:52 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940352914695, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 23967654, "oldest_snapshot_seqno": -1} Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 14586 keys, 22436385 bytes, temperature: kUnknown Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940353020876, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 22436385, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22352255, "index_size": 46589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36485, "raw_key_size": 392382, "raw_average_key_size": 26, "raw_value_size": 22103683, "raw_average_value_size": 1515, "num_data_blocks": 1727, "num_entries": 14586, "num_filter_entries": 14586, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938956, "oldest_key_time": 0, "file_creation_time": 1769940352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "45378c7f-5201-4192-8849-dfb55e3150db", "db_session_id": "0OACS8BUSD4GZ2BGBVU8", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}} Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
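The job-46 summary just below reports read-write-amplify(40.5) and write-amplify(19.6); both follow from byte counts already logged: the L0 input table #77 is 1,146,304 bytes, total compaction input (L0 plus L6 table #75) is 23,967,654 bytes, and the single L6 output table #78 is 22,436,385 bytes. A two-line check of that arithmetic:

    l0_in, total_in, out = 1_146_304, 23_967_654, 22_436_385  # bytes, from the log
    print(f"write-amplify      = {out / l0_in:.1f}")               # 19.6
    print(f"read-write-amplify = {(total_in + out) / l0_in:.1f}")  # 40.5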
Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.021259) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 22436385 bytes Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.023026) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 225.4 rd, 211.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 21.8 +0.0 blob) out(21.4 +0.0 blob), read-write-amplify(40.5) write-amplify(19.6) OK, records in: 15122, records dropped: 536 output_compression: NoCompression Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.023056) EVENT_LOG_v1 {"time_micros": 1769940353023043, "job": 46, "event": "compaction_finished", "compaction_time_micros": 106334, "compaction_time_cpu_micros": 56111, "output_level": 6, "num_output_files": 1, "total_output_size": 22436385, "num_input_records": 15122, "num_output_records": 14586, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940353023345, "job": 46, "event": "table_file_deletion", "file_number": 77} Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604212/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940353026482, "job": 46, "event": "table_file_deletion", "file_number": 75} Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:52.914590) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.026627) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.026637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.026642) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.026646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:05:53 localhost ceph-mon[286721]: rocksdb: (Original Log Time 2026/02/01-10:05:53.026650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 1 05:05:53 localhost nova_compute[274651]: 2026-02-01 10:05:53.516 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:53 localhost nova_compute[274651]: 2026-02-01 10:05:53.518 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 1 05:05:53 localhost nova_compute[274651]: 2026-02-01 10:05:53.518 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 1 05:05:53 localhost nova_compute[274651]: 2026-02-01 10:05:53.518 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:53 localhost nova_compute[274651]: 2026-02-01 10:05:53.546 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 1 05:05:53 localhost nova_compute[274651]: 2026-02-01 10:05:53.547 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 1 05:05:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb. Feb 1 05:05:53 localhost systemd[1]: tmp-crun.iUO76y.mount: Deactivated successfully. Feb 1 05:05:53 localhost podman[329868]: 2026-02-01 10:05:53.732265281 +0000 UTC m=+0.093051962 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127) Feb 1 05:05:53 localhost podman[329868]: 2026-02-01 10:05:53.771221499 +0000 UTC m=+0.132008170 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 1 05:05:53 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:05:53 localhost podman[236886]: time="2026-02-01T10:05:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:05:53 localhost podman[236886]: @ - - [01/Feb/2026:10:05:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 05:05:54 localhost podman[236886]: @ - - [01/Feb/2026:10:05:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18842 "" "Go-http-client/1.1"
Feb 1 05:05:55 localhost sshd[329893]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 05:05:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.295 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.296 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.296 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.413 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.413 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.413 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.414 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.548 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.550 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.550 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.550 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.579 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.580 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.835 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.853 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.854 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 1 05:05:58 localhost nova_compute[274651]: 2026-02-01 10:05:58.854 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:06:00 localhost podman[329895]: 2026-02-01 10:06:00.721867851 +0000 UTC m=+0.084529220 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1769056855, version=9.7, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, name=ubi9/ubi-minimal, vcs-type=git, io.buildah.version=1.33.7)
Feb 1 05:06:00 localhost podman[329895]: 2026-02-01 10:06:00.740497083 +0000 UTC m=+0.103158442 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, release=1769056855, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 1 05:06:00 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 05:06:01 localhost openstack_network_exporter[239441]: ERROR 10:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:06:01 localhost openstack_network_exporter[239441]:
Feb 1 05:06:01 localhost openstack_network_exporter[239441]: ERROR 10:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:06:01 localhost openstack_network_exporter[239441]:
Feb 1 05:06:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:02 localhost nova_compute[274651]: 2026-02-01 10:06:02.825 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.581 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.615 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.615 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.616 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.616 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.618 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:03 localhost nova_compute[274651]: 2026-02-01 10:06:03.622 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:05 localhost nova_compute[274651]: 2026-02-01 10:06:05.268 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:05 localhost nova_compute[274651]: 2026-02-01 10:06:05.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:05 localhost nova_compute[274651]: 2026-02-01 10:06:05.269 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 1 05:06:06 localhost nova_compute[274651]: 2026-02-01 10:06:06.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:06 localhost nova_compute[274651]: 2026-02-01 10:06:06.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:07 localhost nova_compute[274651]: 2026-02-01 10:06:07.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:07 localhost nova_compute[274651]: 2026-02-01 10:06:07.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.285 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.307 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.307 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.307 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.308 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.308 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.623 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 05:06:08 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:06:08 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2063868740' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 05:06:08 localhost podman[329936]: 2026-02-01 10:06:08.727049879 +0000 UTC m=+0.082813167 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.739 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 05:06:08 localhost podman[329936]: 2026-02-01 10:06:08.765337527 +0000 UTC m=+0.121100795 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 1 05:06:08 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.816 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 05:06:08 localhost nova_compute[274651]: 2026-02-01 10:06:08.817 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.018 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.019 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11093MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.019 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.020 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.302 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.303 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.303 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.503 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing inventories for resource provider a04bda90-8ccd-4104-8518-038544ff1327 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.643 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating ProviderTree inventory for provider a04bda90-8ccd-4104-8518-038544ff1327 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.644 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Updating inventory in ProviderTree for provider a04bda90-8ccd-4104-8518-038544ff1327 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.659 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing aggregate associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.695 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Refreshing trait associations for resource provider a04bda90-8ccd-4104-8518-038544ff1327, traits: COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE41,HW_CPU_X86_AMD_SVM,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_BMI2,HW_CPU_X86_SHA,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,HW_CPU_X86_MMX,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb 1 05:06:09 localhost nova_compute[274651]: 2026-02-01 10:06:09.745 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 1 05:06:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:06:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2000731624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.208 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.216 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.250 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.253 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.254 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.254 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.255 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.273 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb 1 05:06:10 localhost nova_compute[274651]: 2026-02-01 10:06:10.273 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:11.780 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 05:06:11 localhost nova_compute[274651]: 2026-02-01 10:06:11.781 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:11 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:11.782 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb 1 05:06:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:06:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:06:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:13 localhost nova_compute[274651]: 2026-02-01 10:06:13.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:13 localhost nova_compute[274651]: 2026-02-01 10:06:13.291 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:13 localhost nova_compute[274651]: 2026-02-01 10:06:13.672 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:13 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:13.784 158365 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e1d14e36-ae9d-43b6-8933-f137b54529ff, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 1 05:06:14 localhost nova_compute[274651]: 2026-02-01 10:06:14.302 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:14 localhost nova_compute[274651]: 2026-02-01 10:06:14.320 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Triggering sync for uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Feb 1 05:06:14 localhost nova_compute[274651]: 2026-02-01 10:06:14.321 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:06:14 localhost nova_compute[274651]: 2026-02-01 10:06:14.321 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:06:14 localhost nova_compute[274651]: 2026-02-01 10:06:14.345 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:06:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 05:06:15 localhost podman[329979]: 2026-02-01 10:06:15.712080379 +0000 UTC m=+0.075593136 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 05:06:15 localhost podman[329979]: 2026-02-01 10:06:15.747556789 +0000 UTC m=+0.111069526 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 05:06:15 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 05:06:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:17 localhost ovn_controller[152492]: 2026-02-01T10:06:17Z|00512|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Feb 1 05:06:18 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 1 05:06:18 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 1 05:06:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 05:06:18 localhost nova_compute[274651]: 2026-02-01 10:06:18.674 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:18 localhost nova_compute[274651]: 2026-02-01 10:06:18.676 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:18 localhost nova_compute[274651]: 2026-02-01 10:06:18.676 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:18 localhost nova_compute[274651]: 2026-02-01 10:06:18.676 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:18 localhost nova_compute[274651]: 2026-02-01 10:06:18.706 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:18 localhost nova_compute[274651]: 2026-02-01 10:06:18.707 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:18 localhost podman[330003]: 2026-02-01 10:06:18.739493446 +0000 UTC m=+0.105989579 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:06:18 localhost podman[330003]: 2026-02-01 10:06:18.773382929 +0000 UTC m=+0.139879082 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 1 05:06:18 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 05:06:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 05:06:20 localhost systemd[1]: tmp-crun.mm7JU8.mount: Deactivated successfully.
Feb 1 05:06:20 localhost podman[330019]: 2026-02-01 10:06:20.717433925 +0000 UTC m=+0.074086109 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 05:06:20 localhost podman[330019]: 2026-02-01 10:06:20.723355837 +0000 UTC m=+0.080008011 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 05:06:20 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:06:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.707 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.709 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.710 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.710 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.738 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.739 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:23 localhost nova_compute[274651]: 2026-02-01 10:06:23.742 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:23 localhost podman[236886]: time="2026-02-01T10:06:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:06:23 localhost podman[236886]: @ - - [01/Feb/2026:10:06:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 05:06:24 localhost podman[236886]: @ - - [01/Feb/2026:10:06:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18839 "" "Go-http-client/1.1"
Feb 1 05:06:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 05:06:24 localhost podman[330043]: 2026-02-01 10:06:24.720483483 +0000 UTC m=+0.082390255 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 1 05:06:24 localhost podman[330043]: 2026-02-01 10:06:24.787497883 +0000 UTC m=+0.149404655 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 1 05:06:24 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:06:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:28 localhost nova_compute[274651]: 2026-02-01 10:06:28.742 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:28 localhost nova_compute[274651]: 2026-02-01 10:06:28.744 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:28 localhost nova_compute[274651]: 2026-02-01 10:06:28.744 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:28 localhost nova_compute[274651]: 2026-02-01 10:06:28.744 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:28 localhost nova_compute[274651]: 2026-02-01 10:06:28.784 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:28 localhost nova_compute[274651]: 2026-02-01 10:06:28.785 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:31 localhost openstack_network_exporter[239441]: ERROR 10:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:06:31 localhost openstack_network_exporter[239441]:
Feb 1 05:06:31 localhost openstack_network_exporter[239441]: ERROR 10:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:06:31 localhost openstack_network_exporter[239441]:
Feb 1 05:06:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:06:31 localhost podman[330069]: 2026-02-01 10:06:31.720513483 +0000 UTC m=+0.074815210 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, container_name=openstack_network_exporter, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.7)
Feb 1 05:06:31 localhost podman[330069]: 2026-02-01 10:06:31.736596838 +0000 UTC m=+0.090898525 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.7)
Feb 1 05:06:31 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 05:06:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:33 localhost nova_compute[274651]: 2026-02-01 10:06:33.785 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:33 localhost nova_compute[274651]: 2026-02-01 10:06:33.787 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:33 localhost nova_compute[274651]: 2026-02-01 10:06:33.788 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:33 localhost nova_compute[274651]: 2026-02-01 10:06:33.788 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:33 localhost nova_compute[274651]: 2026-02-01 10:06:33.817 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:33 localhost nova_compute[274651]: 2026-02-01 10:06:33.818 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:38 localhost nova_compute[274651]: 2026-02-01 10:06:38.820 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:38 localhost nova_compute[274651]: 2026-02-01 10:06:38.823 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:38 localhost nova_compute[274651]: 2026-02-01 10:06:38.823 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:38 localhost nova_compute[274651]: 2026-02-01 10:06:38.823 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:38 localhost nova_compute[274651]: 2026-02-01 10:06:38.863 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:38 localhost nova_compute[274651]: 2026-02-01 10:06:38.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.
Feb 1 05:06:39 localhost podman[330089]: 2026-02-01 10:06:39.719240513 +0000 UTC m=+0.077055641 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 1 05:06:39 localhost podman[330089]: 2026-02-01 10:06:39.731472419 +0000 UTC m=+0.089287517 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 1 05:06:39 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 05:06:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:41.729 158365 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 1 05:06:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:41.729 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 1 05:06:41 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:41.730 158365 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 1 05:06:42 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:43 localhost nova_compute[274651]: 2026-02-01 10:06:43.864 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:43 localhost nova_compute[274651]: 2026-02-01 10:06:43.866 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 1 05:06:43 localhost nova_compute[274651]: 2026-02-01 10:06:43.867 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 1 05:06:43 localhost nova_compute[274651]: 2026-02-01 10:06:43.867 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:43 localhost nova_compute[274651]: 2026-02-01 10:06:43.885 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:43 localhost nova_compute[274651]: 2026-02-01 10:06:43.885 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 1 05:06:44 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:44.719 259320 INFO neutron.agent.linux.ip_lib [None req-0bdacd55-051b-41c7-9078-517f7c69bc38 - - - - - -] Device tapbdc632af-64 cannot be used as it has no MAC address#033[00m
Feb 1 05:06:44 localhost nova_compute[274651]: 2026-02-01 10:06:44.743 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:44 localhost kernel: device tapbdc632af-64 entered promiscuous mode
Feb 1 05:06:44 localhost NetworkManager[5964]: [1769940404.7564] manager: (tapbdc632af-64): new Generic device (/org/freedesktop/NetworkManager/Devices/83)
Feb 1 05:06:44 localhost nova_compute[274651]: 2026-02-01 10:06:44.756 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:44 localhost systemd-udevd[330119]: Network interface NamePolicy= disabled on kernel command line.
Feb 1 05:06:44 localhost ovn_controller[152492]: 2026-02-01T10:06:44Z|00513|binding|INFO|Claiming lport bdc632af-6463-48a8-95db-7b999bd82f40 for this chassis.
Feb 1 05:06:44 localhost ovn_controller[152492]: 2026-02-01T10:06:44Z|00514|binding|INFO|bdc632af-6463-48a8-95db-7b999bd82f40: Claiming unknown
Feb 1 05:06:44 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:44.772 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-801561a0-2ed8-4ee4-b5ee-fb979db78a17', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-801561a0-2ed8-4ee4-b5ee-fb979db78a17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25dc0fb66b3c4e4f9397bcae83feccae', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f98c56a-3d35-4a31-9622-fc7b70b3cfd4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bdc632af-6463-48a8-95db-7b999bd82f40) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 05:06:44 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:44.773 158365 INFO neutron.agent.ovn.metadata.agent [-] Port bdc632af-6463-48a8-95db-7b999bd82f40 in datapath 801561a0-2ed8-4ee4-b5ee-fb979db78a17 bound to our chassis#033[00m
Feb 1 05:06:44 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:44.775 158365 DEBUG neutron.agent.ovn.metadata.agent [-] Port ece641fb-ca3f-4514-9110-9b01ed542053 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Feb 1 05:06:44 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:44.775 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 801561a0-2ed8-4ee4-b5ee-fb979db78a17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 05:06:44 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:44.778 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6dc126-95bc-446b-97e9-6591159766c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 05:06:44 localhost ovn_controller[152492]: 2026-02-01T10:06:44Z|00515|binding|INFO|Setting lport bdc632af-6463-48a8-95db-7b999bd82f40 ovn-installed in OVS
Feb 1 05:06:44 localhost ovn_controller[152492]: 2026-02-01T10:06:44Z|00516|binding|INFO|Setting lport bdc632af-6463-48a8-95db-7b999bd82f40 up in Southbound
Feb 1 05:06:44 localhost nova_compute[274651]: 2026-02-01 10:06:44.799 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:44 localhost nova_compute[274651]: 2026-02-01 10:06:44.801 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:44 localhost nova_compute[274651]: 2026-02-01 10:06:44.832 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:44 localhost nova_compute[274651]: 2026-02-01 10:06:44.862 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:45 localhost podman[330174]:
Feb 1 05:06:45 localhost podman[330174]: 2026-02-01 10:06:45.78609988 +0000 UTC m=+0.087899983 container create 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:06:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 05:06:45 localhost systemd[1]: Started libpod-conmon-319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921.scope.
Feb 1 05:06:45 localhost systemd[1]: Started libcrun container.
Feb 1 05:06:45 localhost podman[330174]: 2026-02-01 10:06:45.741982193 +0000 UTC m=+0.043782286 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 1 05:06:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79e3793a4204502eac9e5aa23e21cfa1a1fe630e42b67c3b544a57b72ebd56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 1 05:06:45 localhost podman[330174]: 2026-02-01 10:06:45.854657799 +0000 UTC m=+0.156457892 container init 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 1 05:06:45 localhost podman[330174]: 2026-02-01 10:06:45.866042279 +0000 UTC m=+0.167842372 container start 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 1 05:06:45 localhost dnsmasq[330203]: started, version 2.85 cachesize 150
Feb 1 05:06:45 localhost dnsmasq[330203]: DNS service limited to local subnets
Feb 1 05:06:45 localhost dnsmasq[330203]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 1 05:06:45 localhost dnsmasq[330203]: warning: no upstream servers configured
Feb 1 05:06:45 localhost dnsmasq-dhcp[330203]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 1 05:06:45 localhost dnsmasq[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/addn_hosts - 0 addresses
Feb 1 05:06:45 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/host
Feb 1 05:06:45 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/opts
Feb 1 05:06:45 localhost podman[330188]: 2026-02-01 10:06:45.921801423 +0000 UTC m=+0.094253579 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 1 05:06:45 localhost podman[330188]: 2026-02-01 10:06:45.963476035 +0000 UTC m=+0.135928191 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 05:06:45 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
Feb 1 05:06:46 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:46.387 259320 INFO neutron.agent.dhcp.agent [None req-aaa3112b-078f-47e2-8906-ca45d599fc30 - - - - - -] DHCP configuration for ports {'d5f2c23b-1140-46b0-be5c-32db394d068a'} is completed#033[00m
Feb 1 05:06:46 localhost systemd[1]: tmp-crun.VnDLaf.mount: Deactivated successfully.
Feb 1 05:06:47 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:47 localhost nova_compute[274651]: 2026-02-01 10:06:47.246 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:48 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:48.249 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:06:47Z, description=, device_id=264f4cdb-0ee0-4c2e-84c0-4392019d1699, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6ff1c929-04b0-4625-bab2-069e140172e9, ip_allocation=immediate, mac_address=fa:16:3e:c6:58:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:06:42Z, description=, dns_domain=, id=801561a0-2ed8-4ee4-b5ee-fb979db78a17, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1572715263-network, port_security_enabled=True, project_id=25dc0fb66b3c4e4f9397bcae83feccae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30981, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3948, status=ACTIVE, subnets=['77c91413-26ce-4e31-96d9-45fd53c182d1'], tags=[], tenant_id=25dc0fb66b3c4e4f9397bcae83feccae, updated_at=2026-02-01T10:06:43Z, vlan_transparent=None, network_id=801561a0-2ed8-4ee4-b5ee-fb979db78a17, port_security_enabled=False, project_id=25dc0fb66b3c4e4f9397bcae83feccae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3955, status=DOWN, tags=[], tenant_id=25dc0fb66b3c4e4f9397bcae83feccae, updated_at=2026-02-01T10:06:47Z on network 801561a0-2ed8-4ee4-b5ee-fb979db78a17#033[00m
Feb 1 05:06:48 localhost systemd[1]: tmp-crun.988qfF.mount: Deactivated successfully.
Feb 1 05:06:48 localhost dnsmasq[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/addn_hosts - 1 addresses
Feb 1 05:06:48 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/host
Feb 1 05:06:48 localhost podman[330233]: 2026-02-01 10:06:48.46327231 +0000 UTC m=+0.062467791 container kill 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:06:48 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/opts
Feb 1 05:06:48 localhost nova_compute[274651]: 2026-02-01 10:06:48.936 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:49.351 259320 INFO neutron.agent.dhcp.agent [None req-cfd334bb-4826-4e50-9b78-b6cf75131756 - - - - - -] DHCP configuration for ports {'6ff1c929-04b0-4625-bab2-069e140172e9'} is completed#033[00m
Feb 1 05:06:49 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:49.603 259320 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:06:47Z, description=, device_id=264f4cdb-0ee0-4c2e-84c0-4392019d1699, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=6ff1c929-04b0-4625-bab2-069e140172e9, ip_allocation=immediate, mac_address=fa:16:3e:c6:58:1f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:06:42Z, description=, dns_domain=, id=801561a0-2ed8-4ee4-b5ee-fb979db78a17, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1572715263-network, port_security_enabled=True, project_id=25dc0fb66b3c4e4f9397bcae83feccae, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30981, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3948, status=ACTIVE, subnets=['77c91413-26ce-4e31-96d9-45fd53c182d1'], tags=[], tenant_id=25dc0fb66b3c4e4f9397bcae83feccae, updated_at=2026-02-01T10:06:43Z, vlan_transparent=None, network_id=801561a0-2ed8-4ee4-b5ee-fb979db78a17, port_security_enabled=False, project_id=25dc0fb66b3c4e4f9397bcae83feccae, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3955, status=DOWN, tags=[], tenant_id=25dc0fb66b3c4e4f9397bcae83feccae, updated_at=2026-02-01T10:06:47Z on network 801561a0-2ed8-4ee4-b5ee-fb979db78a17#033[00m
Feb 1 05:06:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 05:06:49 localhost podman[330254]: 2026-02-01 10:06:49.731063082 +0000 UTC m=+0.086366565 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Feb 1 05:06:49 localhost podman[330254]: 2026-02-01 10:06:49.741474303 +0000 UTC m=+0.096777746 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 1 05:06:49 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 05:06:49 localhost dnsmasq[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/addn_hosts - 1 addresses
Feb 1 05:06:49 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/host
Feb 1 05:06:49 localhost podman[330288]: 2026-02-01 10:06:49.780561055 +0000 UTC m=+0.032456950 container kill 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 1 05:06:49 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/opts
Feb 1 05:06:50 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:50.119 259320 INFO neutron.agent.dhcp.agent [None req-f1687d80-1af9-4f74-9fb5-dcff81c06165 - - - - - -] DHCP configuration for ports {'6ff1c929-04b0-4625-bab2-069e140172e9'} is completed#033[00m
Feb 1 05:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 05:06:51 localhost podman[330328]: 2026-02-01 10:06:51.499326124 +0000 UTC m=+0.063591096 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 1 05:06:51 localhost podman[330328]: 2026-02-01 10:06:51.53366241 +0000 UTC m=+0.097927372 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 1 05:06:51 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:06:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:52 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 1 05:06:52 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 05:06:53 localhost ceph-mon[286721]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 1 05:06:53 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 05:06:53 localhost dnsmasq[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/addn_hosts - 0 addresses
Feb 1 05:06:53 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/host
Feb 1 05:06:53 localhost dnsmasq-dhcp[330203]: read /var/lib/neutron/dhcp/801561a0-2ed8-4ee4-b5ee-fb979db78a17/opts
Feb 1 05:06:53 localhost podman[330435]: 2026-02-01 10:06:53.277894462 +0000 UTC m=+0.061371337 container kill 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 1 05:06:53 localhost nova_compute[274651]: 2026-02-01 10:06:53.452 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:53 localhost ovn_controller[152492]: 2026-02-01T10:06:53Z|00517|binding|INFO|Releasing lport bdc632af-6463-48a8-95db-7b999bd82f40 from this chassis (sb_readonly=0)
Feb 1 05:06:53 localhost kernel: device tapbdc632af-64 left promiscuous mode
Feb 1 05:06:53 localhost ovn_controller[152492]: 2026-02-01T10:06:53Z|00518|binding|INFO|Setting lport bdc632af-6463-48a8-95db-7b999bd82f40 down in Southbound
Feb 1 05:06:53 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:53.463 158365 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604212.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpeb03ab10-e7f5-5714-93ad-723e14b6a239-801561a0-2ed8-4ee4-b5ee-fb979db78a17', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-801561a0-2ed8-4ee4-b5ee-fb979db78a17', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '25dc0fb66b3c4e4f9397bcae83feccae', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604212.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f98c56a-3d35-4a31-9622-fc7b70b3cfd4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bdc632af-6463-48a8-95db-7b999bd82f40) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 1 05:06:53 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:53.465 158365 INFO neutron.agent.ovn.metadata.agent [-] Port bdc632af-6463-48a8-95db-7b999bd82f40 in datapath 801561a0-2ed8-4ee4-b5ee-fb979db78a17 unbound from our chassis#033[00m
Feb 1 05:06:53 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:53.468 158365 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 801561a0-2ed8-4ee4-b5ee-fb979db78a17, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 1 05:06:53 localhost ovn_metadata_agent[158360]: 2026-02-01 10:06:53.469 158526 DEBUG oslo.privsep.daemon [-] privsep: reply[95a8d0c7-c1b4-4030-b49b-8fb2bd01a4c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 1 05:06:53 localhost nova_compute[274651]: 2026-02-01 10:06:53.473 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:53 localhost podman[236886]: time="2026-02-01T10:06:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:06:53 localhost podman[236886]: @ - - [01/Feb/2026:10:06:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158361 "" "Go-http-client/1.1"
Feb 1 05:06:53 localhost nova_compute[274651]: 2026-02-01 10:06:53.991 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:54 localhost podman[236886]: @ - - [01/Feb/2026:10:06:53 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19321 "" "Go-http-client/1.1"
Feb 1 05:06:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 05:06:55 localhost systemd[1]: tmp-crun.uxRopW.mount: Deactivated successfully.
Feb 1 05:06:55 localhost podman[330459]: 2026-02-01 10:06:55.722954034 +0000 UTC m=+0.085266962 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 1 05:06:55 localhost podman[330459]: 2026-02-01 10:06:55.801428807 +0000 UTC m=+0.163741635 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 1 05:06:55 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:06:56 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 1 05:06:56 localhost ceph-mon[286721]: log_channel(audit) log [INF] : from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 05:06:57 localhost ovn_controller[152492]: 2026-02-01T10:06:57Z|00519|binding|INFO|Releasing lport a7b8fbb1-c1ab-4da8-8083-1a117cbed9e5 from this chassis (sb_readonly=0)
Feb 1 05:06:57 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:06:57 localhost nova_compute[274651]: 2026-02-01 10:06:57.209 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 1 05:06:57 localhost dnsmasq[330203]: exiting on receipt of SIGTERM
Feb 1 05:06:57 localhost podman[330501]: 2026-02-01 10:06:57.61428768 +0000 UTC m=+0.060907193 container kill 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 1 05:06:57 localhost systemd[1]: libpod-319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921.scope: Deactivated successfully.
Feb 1 05:06:57 localhost podman[330514]: 2026-02-01 10:06:57.688702848 +0000 UTC m=+0.056268571 container died 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 1 05:06:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921-userdata-shm.mount: Deactivated successfully.
Feb 1 05:06:57 localhost podman[330514]: 2026-02-01 10:06:57.724174709 +0000 UTC m=+0.091740382 container cleanup 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 1 05:06:57 localhost systemd[1]: libpod-conmon-319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921.scope: Deactivated successfully.
Feb 1 05:06:57 localhost podman[330515]: 2026-02-01 10:06:57.764336604 +0000 UTC m=+0.128220974 container remove 319b4ffc686ffbe3889ce3491406bd62c9416d218b698612592a255dfec42921 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-801561a0-2ed8-4ee4-b5ee-fb979db78a17, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 1 05:06:57 localhost ceph-mon[286721]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv'
Feb 1 05:06:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:58.230 259320 INFO neutron.agent.dhcp.agent [None req-84526b9c-6217-4f26-b793-2ca73049bcfa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 05:06:58 localhost nova_compute[274651]: 2026-02-01 10:06:58.289 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 1 05:06:58 localhost neutron_dhcp_agent[259316]: 2026-02-01 10:06:58.364 259320 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 1 05:06:58 localhost systemd[1]: var-lib-containers-storage-overlay-ca79e3793a4204502eac9e5aa23e21cfa1a1fe630e42b67c3b544a57b72ebd56-merged.mount: Deactivated successfully.
Feb 1 05:06:58 localhost systemd[1]: run-netns-qdhcp\x2d801561a0\x2d2ed8\x2d4ee4\x2db5ee\x2dfb979db78a17.mount: Deactivated successfully.
Feb 1 05:06:58 localhost nova_compute[274651]: 2026-02-01 10:06:58.993 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:00 localhost nova_compute[274651]: 2026-02-01 10:07:00.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:00 localhost nova_compute[274651]: 2026-02-01 10:07:00.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 1 05:07:00 localhost nova_compute[274651]: 2026-02-01 10:07:00.271 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 1 05:07:01 localhost nova_compute[274651]: 2026-02-01 10:07:01.365 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 1 05:07:01 localhost nova_compute[274651]: 2026-02-01 10:07:01.365 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquired lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 1 05:07:01 localhost nova_compute[274651]: 2026-02-01 10:07:01.366 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 1 05:07:01 localhost nova_compute[274651]: 2026-02-01 10:07:01.366 274655 DEBUG nova.objects.instance [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lazy-loading 'info_cache' on Instance uuid 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 1 05:07:01 localhost openstack_network_exporter[239441]: ERROR 10:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:07:01 localhost openstack_network_exporter[239441]:
Feb 1 05:07:01 localhost openstack_network_exporter[239441]: ERROR 10:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:07:01 localhost openstack_network_exporter[239441]:
Feb 1 05:07:02 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:02 localhost nova_compute[274651]: 2026-02-01 10:07:02.301 274655 DEBUG nova.network.neutron [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updating instance_info_cache with network_info: [{"id": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "address": "fa:16:3e:86:11:63", "network": {"id": "8bdf8183-8467-40ac-933d-a37b0bd3539a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "79df39cba1c14309b68e8b61518619fd", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap09cac1be-46", "ovs_interfaceid": "09cac1be-46e2-4a31-8306-e6f4f0401b19", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 1 05:07:02 localhost nova_compute[274651]: 2026-02-01 10:07:02.318 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Releasing lock "refresh_cache-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 1 05:07:02 localhost nova_compute[274651]: 2026-02-01 10:07:02.318 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] [instance: 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 1 05:07:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:07:02 localhost podman[330544]: 2026-02-01 10:07:02.719316072 +0000 UTC m=+0.073907664 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, release=1769056855, name=ubi9/ubi-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Feb 1 05:07:02 localhost podman[330544]: 2026-02-01 10:07:02.734370485 +0000 UTC m=+0.088962077 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1769056855, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 1 05:07:02 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.531 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'name': 'test', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005604212.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79df39cba1c14309b68e8b61518619fd', 'user_id': '7567a560936c417c92d242d856b00bb3', 'hostId': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.533 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.533 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.538 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
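The instance_info_cache record above carries the full neutron network_info structure for instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02: one OVS port with a fixed IP and an attached floating IP. A minimal sketch of pulling the addresses out of that structure; the dict below is a trimmed copy of the logged entry, not the full object:

    # Trimmed copy of the network_info entry nova logged above.
    vif = {
        "id": "09cac1be-46e2-4a31-8306-e6f4f0401b19",
        "address": "fa:16:3e:86:11:63",
        "network": {
            "label": "private",
            "subnets": [{
                "cidr": "192.168.0.0/24",
                "ips": [{
                    "address": "192.168.0.12",
                    "type": "fixed",
                    "floating_ips": [{"address": "192.168.122.20", "type": "floating"}],
                }],
            }],
        },
        "devname": "tap09cac1be-46",
    }

    # Walk subnets -> ips and pair each fixed address with its floating IPs.
    for subnet in vif["network"]["subnets"]:
        for ip in subnet["ips"]:
            floats = [f["address"] for f in ip.get("floating_ips", [])]
            print(vif["devname"], ip["address"], "->", floats)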
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fd8587c-a638-4447-9a9f-45112be8cc64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.533458', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1b1063a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': 'aba082bc9b85f5aa2fdd1e66f2f5fad74727151369dce080993594e4e9918bd7'}]}, 'timestamp': '2026-02-01 10:07:03.539571', '_unique_id': '76f50d08faf643c6a0d531558c61af5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.540 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.542 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.542 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets volume: 61 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
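The traceback above shows the path by which a TCP-level ConnectionRefusedError from the AMQP socket surfaces as kombu.exceptions.OperationalError: oslo.messaging asks kombu to ensure_connection(), kombu retries via retry_over_time(), and _reraise_as_library_errors converts the final socket failure. A minimal sketch that reproduces the same error class against an unreachable broker; the transport URL is hypothetical, since the real one comes from the service's oslo.messaging configuration:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Hypothetical broker URL; substitute the transport_url the agent uses.
    conn = Connection("amqp://guest:guest@controller:5672//")
    try:
        # Same call chain as the traceback: ensure_connection() retries the
        # connect, then re-raises socket errors as kombu OperationalError.
        conn.ensure_connection(max_retries=3, interval_start=1)
    except OperationalError as exc:
        print("broker unreachable:", exc)
    finally:
        conn.release()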
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f86b3fd-66e4-46dc-81ba-8cd1fb5f4184', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 61, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.542491', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1b18e16-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': 'a7e570ed6134a3bc4f11b736aef790de907657ac79f7d3a756cb295b6d9085d2'}]}, 'timestamp': '2026-02-01 10:07:03.543021', '_unique_id': '129460fed28b4f9bb7b8206bb0e67b9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.543 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.545 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.545 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.572 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.573 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
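The disk.device.read.bytes samples above are cumulative byte counters, one per device (vda: 35560448, vdb: 2154496), so a usable read rate only falls out of differencing two successive polls. A sketch of that arithmetic; the second reading and the 300 s polling interval are assumed, not taken from the log:

    # (counter_volume, monotonic_time) pairs for vda:
    prev = (35560448, 12517.764891994)   # sample logged above
    curr = (35608448, 12817.764891994)   # hypothetical next poll, 300 s later

    # Rate = delta bytes / delta seconds; cumulative counters only reset
    # when the instance restarts, so the difference is normally positive.
    rate_bps = (curr[0] - prev[0]) / (curr[1] - prev[1])
    print(f"vda read rate: {rate_bps:.1f} B/s")  # 160.0 B/s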
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd80dd8b3-a7c9-4250-94f6-0d99e1d450da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.545449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1b63c0e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '1cfff9ead7f41d51deb2f0de6a0e54a459eeb29f0162c9523a5bee56a5ae92d8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.545449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1b650cc-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': 'd66203a467ed6ab53d02dac0de055a05c7cd9e0961e568090d69e0da434edb61'}]}, 'timestamp': '2026-02-01 10:07:03.574228', '_unique_id': 'db6944e715b14826a456740ac38c9e01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.575 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.576 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.577 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 1100747130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.577 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.latency volume: 22673432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
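Every sample in these payloads carries a message_signature, an HMAC that ceilometer computes over the sample fields with the telemetry secret so consumers can reject tampered data. A generic HMAC-SHA256 sketch of the idea only: the canonicalization below is illustrative, the real field-ordering rules live in ceilometer.publisher.utils, and the secret and sample here are placeholders:

    import hmac, hashlib

    def sign(sample: dict, secret: bytes) -> str:
        # Illustrative canonicalization: sorted key=value pairs, excluding
        # the signature field itself. Ceilometer's actual rules may differ.
        blob = "&".join(f"{k}={sample[k]}" for k in sorted(sample)
                        if k != "message_signature")
        return hmac.new(secret, blob.encode(), hashlib.sha256).hexdigest()

    sample = {"counter_name": "disk.device.write.latency",
              "counter_volume": 1100747130}
    print(sign(sample, b"telemetry-secret"))  # placeholder secret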
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '379355e2-68be-439b-be3e-5fa15f177cda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1100747130, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.576937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1b6d11e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '35d9ee2aad1eee8c255ffc8f926eece8f8c41677b9a564c64c6f228dbe6dcd92'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22673432, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.576937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1b6e316-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': 'f96be12864494d090a891d84db9ea64c6dd20e37766917cc78ba205fbe32466d'}]}, 'timestamp': '2026-02-01 10:07:03.577923', '_unique_id': 'c0b024f1e420413aa57200ebcd3ec4fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.578 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.580 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.580 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Payload={'message_id': '935fbc54-c37b-4609-bfff-c772fa81dfdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.580295', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1b753d2-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': '35abaad3db3b224ad26bf80d90af7746f87dbe47b32b5b00436d4be616a5602f'}]}, 'timestamp': '2026-02-01 10:07:03.580813', '_unique_id': 'b245026cecca487fb1017be963f3adb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File 
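The traceback above shows kombu's _reraise_as_library_errors converting the socket-level ConnectionRefusedError into kombu.exceptions.OperationalError ("raise ConnectionError(str(exc)) from exc", where ConnectionError is bound to OperationalError). A minimal sketch that reproduces the same wrapping against an unreachable broker; the URL here is a hypothetical stand-in, assuming nothing is listening on localhost:5672 as on this host:

    from kombu import Connection
    from kombu.exceptions import OperationalError

    # Hypothetical broker URL; assumes no AMQP listener on localhost:5672,
    # matching the refused connections in the log above.
    conn = Connection("amqp://guest:guest@localhost:5672//", connect_timeout=2)
    try:
        conn.ensure_connection(max_retries=1)
    except OperationalError as exc:
        print("wrapped:", exc)                 # [Errno 111] Connection refused
        print("cause:", type(exc.__cause__))   # ConnectionRefusedError, chained via 'raise ... from exc'
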
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.581 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.582 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.583 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '23684f3a-ee16-4281-a014-a4bd8e413d83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.583108', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1b7c7ae-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': 'ca1fdb19f27454c6d3ba20864e8b384ff79728607362f8f53bcff8e0677496a9'}]}, 'timestamp': '2026-02-01 10:07:03.583778', '_unique_id': 'fbc64dd9cfe34fffba17563d2fa3344e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.584 12 ERROR 
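Errno 111 at self.sock.connect(sa) means the TCP connection attempt was actively refused, so nothing is accepting on the broker's AMQP port; it is not a timeout or a DNS failure. A quick probe from the affected host; port 5672 is the conventional AMQP port and an assumption here, since the transport URL is not shown in this excerpt:

    import socket

    # Probe the assumed broker endpoint; errno 111 (ECONNREFUSED) here matches
    # the ConnectionRefusedError chained through the tracebacks above.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    try:
        s.connect(("localhost", 5672))
        print("AMQP port reachable")
    except ConnectionRefusedError as exc:
        print("refused, errno", exc.errno)  # 111
    finally:
        s.close()
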
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.585 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.597 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.598 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.599 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11aa9ca4-502c-45cb-a7f7-25066a09503f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.586048', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1b9f34e-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.805510193, 'message_signature': 'e4215bf128872d5116f93f1ab3e05db37290eba1d1bc2be60078d80c172d34ff'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.586048', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1ba0ae6-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.805510193, 'message_signature': '0798edf7d3b60aed43b6895fb1f6b1a04d134b06663525f828558c760c4f0030'}]}, 'timestamp': '2026-02-01 10:07:03.598614', '_unique_id': '3ae4d1fb86da44dbb40d0b6a07185d22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
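Each "Could not send notification" record is the polling agent handing a batch of samples to an oslo.messaging Notifier; the 'priority': 'SAMPLE' field in the payloads corresponds to the notifier's sample() method. A minimal sketch of that emit path, assuming a reachable broker at a hypothetical URL; the driver and topic names are assumptions inferred from "send notification to notifications", not taken from this deployment's config:

    import oslo_messaging
    from oslo_config import cfg

    # Hypothetical transport URL; in the deployment above the broker is down,
    # which is why every send ends in OperationalError instead.
    transport = oslo_messaging.get_notification_transport(
        cfg.CONF, url="rabbit://guest:guest@localhost:5672/")
    notifier = oslo_messaging.Notifier(transport,
                                       publisher_id="ceilometer.polling",
                                       driver="messagingv2",
                                       topics=["notifications"])
    # 'sample' maps to the SAMPLE priority seen in the payloads above.
    notifier.sample({}, "telemetry.polling", {"samples": []})
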
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.601 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.601 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66fbdb69-a6fb-4cfe-bca5-3668465ce6e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.601495', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1ba8eee-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': '2f9017ca5abc68d40d101cc324142e241385009f8de810f58dabbdf6ae56a5fa'}]}, 'timestamp': '2026-02-01 10:07:03.602044', '_unique_id': '39d8c65d40a240e08236c0124492a5a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
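The Payload={...} bodies are Python dict reprs (single quotes, None), not JSON, so json.loads will not parse them; ast.literal_eval will. A small sketch for pulling the counters out of a captured record body; the inline string is a trimmed, hypothetical stand-in for a real line:

    import ast

    # Trimmed stand-in for one 'Could not send notification' record body.
    line = ("Payload={'event_type': 'telemetry.polling', 'payload': {'samples': "
            "[{'counter_name': 'network.outgoing.bytes.delta', "
            "'counter_type': 'delta', 'counter_volume': 0}]}}: "
            "kombu.exceptions.OperationalError: [Errno 111] Connection refused")
    payload = ast.literal_eval(line[len("Payload="):line.rindex(": kombu")])
    for s in payload["payload"]["samples"]:
        print(s["counter_name"], s["counter_type"], s["counter_volume"])
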
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.602 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.604 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.604 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.604 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0b111185-96c4-4c7e-a037-38367138e1d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.604278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1bafb36-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.805510193, 'message_signature': '9238d22510adda3a723ac6bab82b75f7784f505ed8ebc48864562fe21792e635'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.604278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1bb0c02-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.805510193, 'message_signature': 'd1c8bc2b744e0d9fd1edd9f378724babf6f1e0f0bbafe3897650fe3cd2ace041'}]}, 'timestamp': '2026-02-01 10:07:03.605188', '_unique_id': '36885f7ab3cc4ad7a925a0597c3c08a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.606 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.607 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0bbdc99b-415d-42e2-a2ba-3b1b26425fd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.607414', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1bb75ca-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': '830323b4f870e1d0f3622514a7cb17cc73ef8e3f2bb9a1bed47a8d28b4df4da8'}]}, 'timestamp': '2026-02-01 10:07:03.607891', '_unique_id': '21e75deb2d3f45cfb9c4811e386a15fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in 
_ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.608 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.610 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '72b9dc9c-7f9d-4b47-bdbc-4eaf7645b8e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.610168', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1bbe1ea-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': 'e3b0e57fc9ae271eba4c57b1c1c0b64d53b702f7d8becc21cdaaeab0d6b21bac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.610168', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1bbf2ca-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '0fbf0b3d8949ee9297f2edf8972dcaea1f9b8023a5d0b94ec953254254f9ddb3'}]}, 'timestamp': '2026-02-01 10:07:03.611100', '_unique_id': '0be0607d41b347918cce4533ad6f001f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.612 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.613 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.613 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e5ccabb3-5e0c-4e2e-9b84-36e8cfeede8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.613274', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1bc5bc0-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': 'c9c94d6b88bcd50b0bba0a38504ae3e2024648422539b6d62c71d8a0e0921a34'}]}, 'timestamp': '2026-02-01 10:07:03.613781', '_unique_id': 'b7d19761e2c34909b77deac162d3a204'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 
05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.614 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.615 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.628 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/cpu volume: 19380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c2f2e012-7cdf-4648-9bd7-687a7469304f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19380000000, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:07:03.615384', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'c1bebe1a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.848270847, 'message_signature': 'a3dbd77cec61883b5f063ef869a02b2419e42a39c7a012dc0ece88481ff25a78'}]}, 'timestamp': '2026-02-01 10:07:03.629328', '_unique_id': '78c762be71bf4e3480e98498a0b9d996'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 
2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, 
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.630 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.631 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '46dcdb59-f9aa-415e-865d-7f557cf0599d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.630963', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1bf0b68-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': '972c2607ed43eb94613f47c9d92a27bf58dabafe1a050047bb1cdccfc307f3fd'}]}, 'timestamp': '2026-02-01 10:07:03.631297', '_unique_id': '81963481dde24062b19386e7a1f928ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 
localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.632 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 1484399740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.633 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.read.latency volume: 80474442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9b5920f5-b46f-49db-85cc-245b4a9d07b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1484399740, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.632834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1bf535c-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '0e624fff0bd67e709e85ab3a23d8f9375ce2414ffe9aa8fd16b7e1a3652c01cc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80474442, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.632834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1bf5eec-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '6f28acd0fec3d910e36f53a2ad2f2f80b6fa7c084ca79dacfca026ba53983452'}]}, 'timestamp': '2026-02-01 10:07:03.633418', '_unique_id': '1922966c799245e4af336a9ae2e11cb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 
10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.634 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.635 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e9e66d50-d929-45b5-8245-bf6c11be20fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.634910', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1bfa596-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.805510193, 'message_signature': 'fc3f861dd951ca8633f48c121fb94e05bf4dca06566702650f58f109e4b67cb9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.634910', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1bfb04a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.805510193, 'message_signature': 'f53790eee36e6a4863b276d9b0a10f5251d1edaa7c015898caca54b3a31c88d8'}]}, 'timestamp': '2026-02-01 10:07:03.635495', '_unique_id': '84905cb92baa4c96904cc7936f37ab39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.636 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.637 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5e2922ab-38f5-4d78-83fb-19ae9df8685f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.636922', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1bff46a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '0a36194afac53dcfde6e6e5b54deff865465aee08ea001a6eb420dc6e484d796'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.636922', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1bfff0a-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '525c9c4fbbcb76c11991b1c8ba5104d03b967e3f044604cdeb896b42e2769a44'}]}, 'timestamp': '2026-02-01 10:07:03.637512', '_unique_id': 'd11ba60fd05e4891a41d14a6ecb96d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging yield Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost 
ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", 
line 826, in __init__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 ERROR oslo_messaging.notify.messaging Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.638 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.639 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04512015-4fc7-490e-a4fc-54410975537c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vda', 'timestamp': '2026-02-01T10:07:03.638917', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'c1c04280-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': '1a4f8aa01bd59b4014ee1abb2e2b083c7b5b8be83fd7bd0c946700d04434377b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-vdb', 'timestamp': '2026-02-01T10:07:03.638917', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'c1c04d52-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.764891994, 'message_signature': 'd741a1ebd007455350cb8669237f765f8e99597ce7f3524e4c28fa14165c2815'}]}, 'timestamp': '2026-02-01 10:07:03.639514', '_unique_id': '40fd91598d9d49ef8ded8a96d0a5f8e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.640 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
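[editor's note] The "Could not send notification to notifications. Payload=..." records come from oslo.messaging's notifier: ceilometer publishes each batch of samples as a telemetry.polling event with priority SAMPLE on the notifications topic. A hedged sketch of that emit path (simplified config; the rabbit:// URL is a placeholder, not taken from this log):

```python
# Hedged sketch of the path behind the "Could not send notification" records.
import oslo_messaging
from oslo_config import cfg

conf = cfg.ConfigOpts()
transport = oslo_messaging.get_notification_transport(
    conf, url="rabbit://guest:guest@127.0.0.1:5672/")  # placeholder URL
notifier = oslo_messaging.Notifier(
    transport, publisher_id="ceilometer.polling",
    driver="messagingv2", topics=["notifications"])

# priority SAMPLE, event_type and publisher_id exactly as in the payloads above;
# while the broker is down this raises the same OperationalError.
notifier.sample({}, event_type="telemetry.polling", payload={"samples": []})
```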
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c788b5ee-cc0f-4518-9b96-c782a20ff742', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.640947', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1c09366-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': '988517cb5e50694aaf138c46e4a4cb93cd5a7cd00d4b1cb25e5f1f29cd0b31ea'}]}, 'timestamp': '2026-02-01 10:07:03.641329', '_unique_id': 'eb2f8ded26da4ac7b6c103acfe64854c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.641 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.642 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.642 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/memory.usage volume: 51.63671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd19a875-b975-4a54-834a-13ae88ad5369', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.63671875, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'timestamp': '2026-02-01T10:07:03.642699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'c1c0d4a2-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.848270847, 'message_signature': 'edfa229048205e47cb4b491345beb34d3815c93e72991e3936dbbc4af96d1b5f'}]}, 'timestamp': '2026-02-01 10:07:03.643007', '_unique_id': '1efca9f5c88d45f19b9942ef7eca6925'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.643 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.644 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.644 12 DEBUG ceilometer.compute.pollsters [-] 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02/network.incoming.bytes volume: 6874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00bfd9d2-e161-45cd-96bd-e5049069c59f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6874, 'user_id': '7567a560936c417c92d242d856b00bb3', 'user_name': None, 'project_id': '79df39cba1c14309b68e8b61518619fd', 'project_name': None, 'resource_id': 'instance-00000002-08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02-tap09cac1be-46', 'timestamp': '2026-02-01T10:07:03.644373', 'resource_metadata': {'display_name': 'test', 'name': 'tap09cac1be-46', 'instance_id': '08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02', 'instance_type': 'm1.small', 'host': '608dd73be4a78a836f9e08bf46a7facc0988ad8f1480ed2b9493761f', 'instance_host': 'np0005604212.localdomain', 'flavor': {'id': '371ff7cc-43c7-4354-b1ce-55c23740c8c8', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '9ad21908-e58f-4439-b6a2-d7c4bf075554'}, 'image_ref': '9ad21908-e58f-4439-b6a2-d7c4bf075554', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:86:11:63', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap09cac1be-46'}, 'message_id': 'c1c11606-ff55-11f0-ae11-fa163ee4dff1', 'monotonic_time': 12517.752895475, 'message_signature': 'cc83383a4e9100d38059f658da57704b9efae9236447972415b89bd9fbccc9e6'}]}, 'timestamp': '2026-02-01 10:07:03.644672', '_unique_id': '531aad0add664b68ac0c6c4bb6453a73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
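[editor's note] Each Payload above is a self-describing telemetry envelope: 'payload' -> 'samples' is a list of measurements, each carrying counter_name, counter_type (cumulative/gauge), counter_unit, counter_volume, and a resource_id tying it to the instance or vNIC. A small sketch pulling the measurements out of one such envelope, assuming it has already been captured as a Python dict (the field names are exactly those in the log; the abbreviated resource_id is illustrative):

```python
# Hedged sketch: summarize one of the Payload envelopes shown above.
def summarize(envelope: dict) -> list[tuple[str, str, float]]:
    # (resource_id, counter_name, counter_volume) per sample
    return [
        (s["resource_id"], s["counter_name"], s["counter_volume"])
        for s in envelope["payload"]["samples"]
    ]

envelope = {"payload": {"samples": [
    {"resource_id": "instance-00000002-...-tap09cac1be-46",
     "counter_name": "network.incoming.bytes", "counter_volume": 6874},
]}}
print(summarize(envelope))
```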
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging yield
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 1 05:07:03 localhost ceilometer_agent_compute[232240]: 2026-02-01 10:07:03.645 12 ERROR oslo_messaging.notify.messaging
Feb 1 05:07:03 localhost nova_compute[274651]: 2026-02-01 10:07:03.996 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:03 localhost nova_compute[274651]: 2026-02-01 10:07:03.998 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:04 localhost nova_compute[274651]: 2026-02-01 10:07:03.999 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:04 localhost nova_compute[274651]: 2026-02-01 10:07:03.999 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:04 localhost nova_compute[274651]: 2026-02-01 10:07:04.036 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:04 localhost nova_compute[274651]: 2026-02-01 10:07:04.037 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:05 localhost nova_compute[274651]: 2026-02-01 10:07:05.313 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:06 localhost nova_compute[274651]: 2026-02-01 10:07:06.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:06 localhost nova_compute[274651]: 2026-02-01 10:07:06.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:06 localhost nova_compute[274651]: 2026-02-01 10:07:06.270 274655 DEBUG nova.compute.manager [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 1 05:07:07 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:07 localhost nova_compute[274651]: 2026-02-01 10:07:07.270 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:07 localhost nova_compute[274651]: 2026-02-01 10:07:07.271 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:09 localhost nova_compute[274651]: 2026-02-01 10:07:09.037 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:09 localhost nova_compute[274651]: 2026-02-01 10:07:09.039 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:09 localhost nova_compute[274651]: 2026-02-01 10:07:09.040 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:09 localhost nova_compute[274651]: 2026-02-01 10:07:09.040 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:09 localhost nova_compute[274651]: 2026-02-01 10:07:09.075 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:09 localhost nova_compute[274651]: 2026-02-01 10:07:09.076 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.269 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.290 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.290 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
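[editor's note] The Acquiring/acquired/released trio around "compute_resources" is oslo.concurrency's lock wrapper logging its own lifecycle (including the waited/held timings). A hedged sketch of the pattern that produces these lines; clean_compute_node_cache here is just a stand-in body, not nova's real implementation:

```python
# Hedged sketch: oslo.concurrency emits the "Acquiring lock ...", "Lock ...
# acquired ... waited", and "Lock ... released ... held" DEBUG lines around
# code guarded like this.
from oslo_concurrency import lockutils

@lockutils.synchronized("compute_resources")
def clean_compute_node_cache():
    # runs with the "compute_resources" lock held
    pass

clean_compute_node_cache()
```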
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.291 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Auditing locally available compute resources for np0005604212.localdomain (node: np0005604212.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.292 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 1 05:07:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691. Feb 1 05:07:10 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 1 05:07:10 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3664023745' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.718 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 1 05:07:10 localhost systemd[1]: tmp-crun.l4f1eG.mount: Deactivated successfully. 
Feb 1 05:07:10 localhost podman[330584]: 2026-02-01 10:07:10.741258595 +0000 UTC m=+0.096713895 container health_status 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 1 05:07:10 localhost podman[330584]: 2026-02-01 10:07:10.779044387 +0000 UTC m=+0.134499697 container exec_died 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 1 05:07:10 localhost systemd[1]: 2bc37f6a3f95f6751185363b720798cd15f96e1a9b7eaa030c20a3dfab5e5691.service: Deactivated successfully.
Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.816 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 05:07:10 localhost nova_compute[274651]: 2026-02-01 10:07:10.816 274655 DEBUG nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.043 274655 WARNING nova.virt.libvirt.driver [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.045 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Hypervisor/Node resource view: name=np0005604212.localdomain free_ram=11083MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.045 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.046 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.144 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Instance 08d9c4c9-dc9f-45ed-a3c1-a84bd59f6c02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.145 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.145 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Final resource view: name=np0005604212.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.197 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 1 05:07:11 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 1 05:07:11 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1623814643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.657 274655 DEBUG oslo_concurrency.processutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.665 274655 DEBUG nova.compute.provider_tree [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed in ProviderTree for provider: a04bda90-8ccd-4104-8518-038544ff1327 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.686 274655 DEBUG nova.scheduler.client.report [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Inventory has not changed for provider a04bda90-8ccd-4104-8518-038544ff1327 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.689 274655 DEBUG nova.compute.resource_tracker [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Compute_service record updated for np0005604212.localdomain:np0005604212.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 1 05:07:11 localhost nova_compute[274651]: 2026-02-01 10:07:11.689 274655 DEBUG oslo_concurrency.lockutils [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 1 05:07:12 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:12 localhost nova_compute[274651]: 2026-02-01 10:07:12.690 274655 DEBUG oslo_service.periodic_task [None req-fc03cb38-8d28-4088-9ff9-4be190e971bb - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 1 05:07:14 localhost nova_compute[274651]: 2026-02-01 10:07:14.077 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:14 localhost nova_compute[274651]: 2026-02-01 10:07:14.079 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:14 localhost nova_compute[274651]: 2026-02-01 10:07:14.079 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:14 localhost nova_compute[274651]: 2026-02-01 10:07:14.080 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:14 localhost nova_compute[274651]: 2026-02-01 10:07:14.113 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:14 localhost nova_compute[274651]: 2026-02-01 10:07:14.114 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.
Feb 1 05:07:16 localhost systemd[1]: tmp-crun.S2H9aH.mount: Deactivated successfully.
Feb 1 05:07:16 localhost podman[330627]: 2026-02-01 10:07:16.727635537 +0000 UTC m=+0.087173281 container health_status 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 1 05:07:16 localhost podman[330627]: 2026-02-01 10:07:16.74039037 +0000 UTC m=+0.099928104 container exec_died 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 1 05:07:16 localhost systemd[1]: 06331210aa41cd1dd6250d26f73f828ba92f51c375a199bd424a724743a18e7d.service: Deactivated successfully.
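The Acquiring/acquired/released lines bracketing the resource-tracker update above are emitted by oslo.concurrency itself, not by nova: any callable wrapped with lockutils.synchronized gets that DEBUG output, including the waited/held timings. A minimal sketch using only the public oslo.concurrency API (the function body is illustrative, not nova code):

    import time
    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def update_available_resource():
        # Runs with the process-local "compute_resources" semaphore held;
        # concurrent callers block here, and lockutils logs how long each
        # caller waited and how long the lock was held, as in the log above.
        time.sleep(0.1)

    update_available_resource()
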
Feb 1 05:07:17 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:19 localhost nova_compute[274651]: 2026-02-01 10:07:19.115 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:19 localhost nova_compute[274651]: 2026-02-01 10:07:19.117 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:19 localhost nova_compute[274651]: 2026-02-01 10:07:19.117 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:19 localhost nova_compute[274651]: 2026-02-01 10:07:19.117 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:19 localhost nova_compute[274651]: 2026-02-01 10:07:19.158 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:19 localhost nova_compute[274651]: 2026-02-01 10:07:19.160 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.
Feb 1 05:07:20 localhost systemd[1]: tmp-crun.GfCp9D.mount: Deactivated successfully.
Feb 1 05:07:20 localhost podman[330650]: 2026-02-01 10:07:20.738247208 +0000 UTC m=+0.094148576 container health_status 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 1 05:07:20 localhost podman[330650]: 2026-02-01 10:07:20.774525483 +0000 UTC m=+0.130426881 container exec_died 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 1 05:07:20 localhost systemd[1]: 728b13665893809f2358d0b8f2b40a3f4e12caef509c54570a74a3628b5aaab0.service: Deactivated successfully.
Feb 1 05:07:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.
Feb 1 05:07:21 localhost podman[330668]: 2026-02-01 10:07:21.727893498 +0000 UTC m=+0.088069989 container health_status 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 1 05:07:21 localhost podman[330668]: 2026-02-01 10:07:21.740851477 +0000 UTC m=+0.101027978 container exec_died 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 1 05:07:21 localhost systemd[1]: 385648ad0ece74ed3411d48901803b1a9c346facd11488c7b8c886bba41ff9c3.service: Deactivated successfully.
Feb 1 05:07:22 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:22 localhost sshd[330691]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 05:07:22 localhost systemd-logind[759]: New session 76 of user zuul.
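Each "Started /usr/bin/podman healthcheck run <id>" unit above is systemd firing the container's configured healthcheck on its timer; the health_status=healthy field in the following podman event records the result. The same check can be run by hand, since "podman healthcheck run" exits 0 when the container is healthy. A small sketch (assumes the podman CLI is available and a container named as in the log exists):

    import subprocess

    def is_healthy(container: str) -> bool:
        # Executes the healthcheck command defined in the container's config
        # (e.g. '/openstack/healthcheck node_exporter' above); exit code 0
        # means healthy, non-zero means unhealthy or no healthcheck defined.
        result = subprocess.run(["podman", "healthcheck", "run", container])
        return result.returncode == 0

    print(is_healthy("node_exporter"))
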
Feb 1 05:07:22 localhost systemd[1]: Started Session 76 of User zuul.
Feb 1 05:07:22 localhost python3[330713]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-3279-acd3-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 1 05:07:23 localhost podman[236886]: time="2026-02-01T10:07:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 1 05:07:23 localhost podman[236886]: @ - - [01/Feb/2026:10:07:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156537 "" "Go-http-client/1.1"
Feb 1 05:07:24 localhost podman[236886]: @ - - [01/Feb/2026:10:07:23 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18844 "" "Go-http-client/1.1"
Feb 1 05:07:24 localhost nova_compute[274651]: 2026-02-01 10:07:24.160 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:24 localhost nova_compute[274651]: 2026-02-01 10:07:24.165 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.
Feb 1 05:07:26 localhost systemd[1]: tmp-crun.WcfhEI.mount: Deactivated successfully.
Feb 1 05:07:26 localhost podman[330716]: 2026-02-01 10:07:26.726631951 +0000 UTC m=+0.086335866 container health_status b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller)
Feb 1 05:07:26 localhost podman[330716]: 2026-02-01 10:07:26.769494149 +0000 UTC m=+0.129198084 container exec_died b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 1 05:07:26 localhost systemd[1]: b130bd144cc2e3b6f79ef75426bade859c989d5815f7b5878da8cd15523266bb.service: Deactivated successfully.
Feb 1 05:07:27 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:28 localhost systemd[1]: session-76.scope: Deactivated successfully.
Feb 1 05:07:28 localhost systemd-logind[759]: Session 76 logged out. Waiting for processes to exit.
Feb 1 05:07:28 localhost systemd-logind[759]: Removed session 76.
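The ovsdbapp sequence that recurs throughout this window (idle ~5000 ms, "sending inactivity probe", entering IDLE, [POLLIN], entering ACTIVE) is the OVSDB client's keepalive: after five seconds of silence it sends an echo probe and drops to IDLE until any data arrives on the socket. A toy illustration of that cycle (this is not the ovs.reconnect implementation, just the same state machine in miniature; the 5-second interval matches the log):

    import time

    PROBE_INTERVAL = 5.0  # seconds of silence before probing

    class Session:
        def __init__(self):
            self.state = "ACTIVE"
            self.last_activity = time.monotonic()

        def tick(self):
            # Called periodically, like the poller wakeups in the log.
            idle = time.monotonic() - self.last_activity
            if self.state == "ACTIVE" and idle >= PROBE_INTERVAL:
                print(f"idle {idle * 1000:.0f} ms, sending inactivity probe")
                self.state = "IDLE"  # waiting for the probe reply

        def received(self):
            # Any inbound data (e.g. the probe reply) reactivates the session.
            self.last_activity = time.monotonic()
            if self.state == "IDLE":
                print("entering ACTIVE")
                self.state = "ACTIVE"
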
Feb 1 05:07:28 localhost ovn_controller[152492]: 2026-02-01T10:07:28Z|00520|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 1 05:07:29 localhost nova_compute[274651]: 2026-02-01 10:07:29.164 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:29 localhost nova_compute[274651]: 2026-02-01 10:07:29.166 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:29 localhost nova_compute[274651]: 2026-02-01 10:07:29.166 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:29 localhost nova_compute[274651]: 2026-02-01 10:07:29.167 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:29 localhost nova_compute[274651]: 2026-02-01 10:07:29.198 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:29 localhost nova_compute[274651]: 2026-02-01 10:07:29.198 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:31 localhost openstack_network_exporter[239441]: ERROR 10:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 1 05:07:31 localhost openstack_network_exporter[239441]:
Feb 1 05:07:31 localhost openstack_network_exporter[239441]: ERROR 10:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 1 05:07:31 localhost openstack_network_exporter[239441]:
Feb 1 05:07:32 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.
Feb 1 05:07:33 localhost systemd[1]: tmp-crun.iUn67m.mount: Deactivated successfully.
Feb 1 05:07:33 localhost podman[330743]: 2026-02-01 10:07:33.732082298 +0000 UTC m=+0.093098704 container health_status ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 1 05:07:33 localhost podman[330743]: 2026-02-01 10:07:33.772258544 +0000 UTC m=+0.133275030 container exec_died ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '89edbaf987c24b00bf62461b3a5b7896cb36c100fc847157298a10b6c20dacb6'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, release=1769056855, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Feb 1 05:07:33 localhost systemd[1]: ceac27d20604814f4e07b90d43adf2c87126d483868a8742c2bf44737e2bcfcb.service: Deactivated successfully.
Feb 1 05:07:34 localhost nova_compute[274651]: 2026-02-01 10:07:34.199 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:34 localhost nova_compute[274651]: 2026-02-01 10:07:34.201 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:34 localhost nova_compute[274651]: 2026-02-01 10:07:34.201 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:34 localhost nova_compute[274651]: 2026-02-01 10:07:34.202 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:34 localhost nova_compute[274651]: 2026-02-01 10:07:34.248 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:34 localhost nova_compute[274651]: 2026-02-01 10:07:34.249 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 1 05:07:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/246910684' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 1 05:07:34 localhost ceph-mon[286721]: mon.np0005604212@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 1 05:07:34 localhost ceph-mon[286721]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/246910684' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 1 05:07:37 localhost ceph-mon[286721]: mon.np0005604212@0(leader).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 1 05:07:39 localhost nova_compute[274651]: 2026-02-01 10:07:39.250 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:39 localhost nova_compute[274651]: 2026-02-01 10:07:39.252 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 1 05:07:39 localhost nova_compute[274651]: 2026-02-01 10:07:39.253 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 1 05:07:39 localhost nova_compute[274651]: 2026-02-01 10:07:39.253 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:39 localhost nova_compute[274651]: 2026-02-01 10:07:39.281 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 1 05:07:39 localhost nova_compute[274651]: 2026-02-01 10:07:39.282 274655 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 1 05:07:40 localhost sshd[330763]: main: sshd: ssh-rsa algorithm is disabled
Feb 1 05:07:40 localhost systemd-logind[759]: New session 77 of user zuul.
Feb 1 05:07:40 localhost systemd[1]: Started Session 77 of User zuul.
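The monitor audit entries above show a second client.openstack session (172.18.0.32) issuing both "df" and "osd pool get-quota" for the volumes pool; both are ordinary mon commands reproducible with the ceph CLI. A minimal sketch (the JSON field names quota_max_bytes/quota_max_objects are assumptions about current Ceph output):

    import json
    import subprocess

    def pool_quota(pool: str, client_id="openstack", conf="/etc/ceph/ceph.conf"):
        # Same mon command the audit log records as "osd pool get-quota".
        out = subprocess.check_output(
            ["ceph", "osd", "pool", "get-quota", pool,
             "--format=json", "--id", client_id, "--conf", conf]
        )
        return json.loads(out)

    q = pool_quota("volumes")
    print(q.get("quota_max_bytes"), q.get("quota_max_objects"))
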